Unix Timestamp Converter

Convert epoch seconds or milliseconds to human-readable date/time and back. Shows UTC, local browser time, IST (Asia/Kolkata), and a configurable timezone.

Current Unix Timestamp

1778701657

Milliseconds: 1778701657000

Timestamp → Date

Enter a timestamp above.

Date → Timestamp

Pick a date and time above.

Runs entirely in your browser — nothing is uploaded or logged.

How Unix timestamps work

A Unix timestamp is the number of seconds (or milliseconds) that have elapsed since the Unix epoch: 1970-01-01 00:00:00 UTC. Leap seconds are ignored, so every day counts as exactly 86,400 seconds. That keeps the arithmetic simple, though it means Unix time is not a perfectly uniform count: across a leap second, wall-clock UTC and Unix time briefly disagree.

// JavaScript
Date.now()               // → milliseconds since epoch  (13 digits)
Math.floor(Date.now()/1000) // → seconds since epoch  (10 digits)
new Date(1700000000 * 1000).toISOString()
// → "2023-11-14T22:13:20.000Z"

// Python
import time
int(time.time())         # → seconds
int(time.time() * 1000)  # → milliseconds

This tool uses the browser's built-in Intl.DateTimeFormat API with the timeZone option to render any timestamp in any IANA timezone without external libraries. The Intl.RelativeTimeFormat API produces the “3 hours ago” / “in 2 days” strings.
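The two APIs can be combined roughly like this (a minimal sketch; formatInZone and relativeFromNow are illustrative helper names, not part of any standard API):

```javascript
// Render an epoch-seconds timestamp in any IANA timezone.
function formatInZone(epochSeconds, timeZone) {
  return new Intl.DateTimeFormat("en-GB", {
    timeZone,
    dateStyle: "medium",
    timeStyle: "long",
  }).format(new Date(epochSeconds * 1000));
}

// Produce "3 hours ago" / "in 2 days" strings from a delta in seconds.
function relativeFromNow(epochSeconds, nowSeconds = Date.now() / 1000) {
  const rtf = new Intl.RelativeTimeFormat("en", { numeric: "auto" });
  const delta = epochSeconds - nowSeconds;
  const units = [
    ["year", 31536000], ["month", 2592000], ["day", 86400],
    ["hour", 3600], ["minute", 60], ["second", 1],
  ];
  // Pick the largest unit the delta fills, falling back to seconds.
  for (const [unit, size] of units) {
    if (Math.abs(delta) >= size || unit === "second") {
      return rtf.format(Math.round(delta / size), unit);
    }
  }
}

formatInZone(1700000000, "Asia/Kolkata"); // e.g. "15 Nov 2023, 03:43:20 GMT+5:30"
relativeFromNow(1700000000 - 3 * 3600, 1700000000); // → "3 hours ago"
```

Both APIs are built into every modern browser and Node.js, which is why no date library is needed.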

Frequently asked questions

Seconds or milliseconds — how do I tell?

In seconds, timestamps from September 2001 through roughly the year 2286 are 10 digits; over the same range, milliseconds are 13 digits. Most server APIs and databases use seconds; JavaScript's Date.now() uses milliseconds. If unsure, paste 1700000000: if it converts to November 2023, it was treated as seconds; if the year shows early 1970, it was treated as milliseconds.
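The digit-count heuristic can be automated with a simple magnitude check (a sketch; detectUnit and toDate are hypothetical names — any 13-or-more-digit value is assumed to be milliseconds):

```javascript
// Values >= 1e12 (13 digits) are treated as milliseconds; shorter
// values as seconds. Ambiguous only for dates far outside normal use.
function detectUnit(timestamp) {
  return Math.abs(timestamp) >= 1e12 ? "milliseconds" : "seconds";
}

function toDate(timestamp) {
  const ms =
    detectUnit(timestamp) === "milliseconds" ? timestamp : timestamp * 1000;
  return new Date(ms);
}

detectUnit(1700000000);     // → "seconds"      (10 digits)
detectUnit(1700000000000);  // → "milliseconds" (13 digits)
toDate(1700000000).getUTCFullYear(); // → 2023
```

The cutoff of 1e12 works because 1e12 milliseconds is September 2001, while 1e12 seconds would be the year 33,658 — no realistic value lands in between.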

What is Unix time?

Unix time (also called epoch time or POSIX time) counts the number of seconds elapsed since 1970-01-01 00:00:00 UTC, not counting leap seconds. It is a uniform, monotonic count used in databases, log files, HTTP headers, JWTs, and virtually every API that needs to represent a point in time.

Timezone confusion — why does the same timestamp show different dates?

Unix timestamps are inherently UTC — they represent a single, unambiguous instant in time. The displayed date and time vary between timezones only because formatting applies a local offset. The timestamp 1700000000 is the same moment everywhere on Earth; only its human-readable representation differs.
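This is easy to demonstrate: formatting one instant in several zones changes the rendered text, never the underlying value.

```javascript
// One instant, several renderings: only the formatting offset differs.
const instant = new Date(1700000000 * 1000); // 2023-11-14T22:13:20Z

for (const timeZone of ["UTC", "Asia/Kolkata", "America/New_York"]) {
  const text = new Intl.DateTimeFormat("en-US", {
    timeZone,
    dateStyle: "short",
    timeStyle: "medium",
  }).format(instant);
  console.log(timeZone, text);
}

// The stored value is identical in every case:
instant.getTime() / 1000; // → 1700000000
```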

Why does the year 2038 matter?

Systems storing Unix time as a signed 32-bit integer overflow on 2038-01-19 at 03:14:07 UTC, wrapping to negative values. This tool uses JavaScript Numbers (64-bit floats that represent integers exactly up to 2^53); the Date object itself is specified for ±100,000,000 days around the epoch, reaching roughly the year 275,760 — but legacy embedded systems, databases, and old C code using int32_t still face the problem.
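The overflow can be reproduced in miniature by forcing a timestamp through a signed 32-bit integer (an Int32Array here stands in for a C int32_t):

```javascript
// The last second representable in a signed 32-bit integer:
const INT32_MAX = 2147483647;
new Date(INT32_MAX * 1000).toISOString();
// → "2038-01-19T03:14:07.000Z"

// One second later, the value wraps negative, just as int32_t would:
const wrapped = new Int32Array([INT32_MAX + 1])[0];
wrapped; // → -2147483648
new Date(wrapped * 1000).toISOString();
// → "1901-12-13T20:45:52.000Z"
```

That 1901 date is the telltale signature of a 2038 bug: a system that suddenly reports December 1901 has wrapped its 32-bit clock.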

Powered by Pyrelo

The complete work dashboard for small teams

Developer tools, finance calculators, and business utilities — all in one flat-priced dashboard.

See Pyrelo Dashboard
