WebToolsPlanet

Unix Timestamp Converter

Convert between Unix timestamps and human-readable dates across any timezone — with millisecond precision and real-time epoch display.

Last updated: March 25, 2026

Used 44K+ times
Client-Side Processing
Input Data Stays on Device
Instant Local Execution

What users say

The JWT "exp" claim example is exactly what I needed — I debug token expiry issues weekly. The timezone-aware conversion is far more useful than a simple epoch calculator.
Alex B., Full-Stack Developer
The Y2038 problem FAQ was a pleasant surprise. Shows this was written by someone who actually understands Unix internals.
Preet S., Backend Engineer


What is Unix Timestamp Converter?

The Unix timestamp (also called Epoch time or POSIX time) describes a point in time as a single integer: the number of seconds elapsed since the Unix Epoch, January 1, 1970, at 00:00:00 UTC. The convention dates to the early development of Unix at Bell Labs by Ken Thompson and Dennis Ritchie, and it has become the de facto time representation in computing: most major databases, operating systems, programming language runtimes, cloud APIs, and logging systems can store and exchange time as a Unix timestamp.

The format comes in two common variants: **seconds** (10-digit, e.g., `1706745600`) and **milliseconds** (13-digit, e.g., `1706745600000`). JavaScript's `Date.now()` returns milliseconds; Linux `date +%s` returns seconds; Python's `time.time()` returns a float of seconds with a fractional sub-second part. This tool converts freely between both variants and any human-readable date representation, with timezone support for converting to a specific local time.
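A small sketch of how the two variants can be told apart and normalized (the `toMillis` helper is illustrative, not part of the tool):

```javascript
// Normalize a Unix timestamp to milliseconds, using digit count as a heuristic:
// 10-digit values are treated as seconds, 13-digit values as milliseconds.
function toMillis(timestamp) {
  const digits = String(Math.trunc(Math.abs(timestamp))).length;
  return digits <= 10 ? timestamp * 1000 : timestamp;
}

const secondsInput = 1706745600;    // 10 digits → seconds
const millisInput = 1706745600000;  // 13 digits → milliseconds

// Both inputs refer to the same instant
console.log(new Date(toMillis(secondsInput)).toISOString()); // 2024-02-01T00:00:00.000Z
console.log(new Date(toMillis(millisInput)).toISOString());  // 2024-02-01T00:00:00.000Z
```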

How to Use Unix Timestamp Converter

1

View the live current Unix timestamp (seconds and milliseconds) updating every second in the top panel

2

Enter a Unix timestamp (10-digit seconds or 13-digit milliseconds) in the "Timestamp → Date" field to see the human date

3

Enter a date and time in the "Date → Timestamp" fields to convert it to a Unix timestamp

4

Select a target timezone from the dropdown to see the local time representation in any timezone

5

Click any "Copy" button to copy the timestamp or date string to your clipboard

Common Use Cases

  • Debugging an API response that returns a timestamp like "expires_at": 1706745600 — convert to verify the actual expiry date
  • Converting database timestamp columns (MySQL UNIX_TIMESTAMP, PostgreSQL EXTRACT(EPOCH)) to readable dates
  • Analyzing log files showing epoch timestamps — converting each to a human time for incident investigation
  • Scheduling a cron job or cloud function to run at a specific Unix time
  • Understanding JWT token expiry (the "exp" claim is always a Unix timestamp)
  • Verifying GitHub Actions workflow timestamps or AWS CloudWatch log timestamps
  • Building date ranges for API queries that accept epoch time parameters (Slack API, Stripe API)
  • Converting "Last-Modified" HTTP header times from HTTP date format to epoch for cache logic

Example Input and Output

Converting a JWT "exp" claim timestamp to a readable expiry date:

JWT payload (decoded)
{
  "iat": 1706659200,
  "exp": 1706745600,
  "sub": "user_42"
}
Human-readable dates
iat (issued at):  1706659200
→ Wed Jan 31 2024 00:00:00 UTC
→ Tue Jan 30 2024 18:00:00 CST (UTC-6)

exp (expires):    1706745600
→ Thu Feb 1 2024 00:00:00 UTC
→ Wed Jan 31 2024 18:00:00 CST (UTC-6)

Token lifetime: 86,400 seconds = exactly 24 hours
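The same conversion can be done in a few lines of JavaScript; this sketch reuses the decoded payload values from the example above:

```javascript
// Decoded JWT payload (same values as the example above)
const payload = { iat: 1706659200, exp: 1706745600, sub: "user_42" };

// JWT claims are in seconds, so multiply by 1000 for the Date constructor
const issuedAt = new Date(payload.iat * 1000);
const expiresAt = new Date(payload.exp * 1000);

console.log(issuedAt.toUTCString());  // Wed, 31 Jan 2024 00:00:00 GMT
console.log(expiresAt.toUTCString()); // Thu, 01 Feb 2024 00:00:00 GMT

const lifetimeSeconds = payload.exp - payload.iat;
console.log(lifetimeSeconds);         // 86400 → exactly 24 hours

// Expiry check: compare against the current time, also in seconds
const isExpired = Math.floor(Date.now() / 1000) >= payload.exp;
```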

Client-Side Processing

All timestamp conversions run locally in your browser using the native JavaScript Date API. No timestamps are sent to our servers.

Milliseconds in JavaScript

JavaScript always works in milliseconds internally. To convert from seconds: new Date(epochSeconds * 1000). From milliseconds: new Date(epochMs). To get the current time in seconds: Math.floor(Date.now() / 1000). The upcoming Temporal API (a TC39 Stage 3 proposal) will reduce this unit juggling: Temporal.Now.instant() returns an Instant that exposes epochMilliseconds directly.
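The conversions above, put together in a runnable form:

```javascript
const epochSeconds = 1706745600;

// Seconds → Date: multiply by 1000 because Date works in milliseconds
const fromSeconds = new Date(epochSeconds * 1000);
console.log(fromSeconds.toISOString()); // 2024-02-01T00:00:00.000Z

// Date → seconds: divide by 1000 and floor to drop the sub-second part
const backToSeconds = Math.floor(fromSeconds.getTime() / 1000);
console.log(backToSeconds === epochSeconds); // true

// Current time in both units
const nowMs = Date.now();
const nowSeconds = Math.floor(nowMs / 1000);
```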

ISO 8601 vs Unix Timestamps

ISO 8601 strings ("2024-02-01T00:00:00Z") are human-readable, and they even sort correctly as plain strings when every value shares the same UTC offset, but arithmetic and comparison are awkward. Unix timestamps are trivial to subtract (endTime - startTime gives the duration in seconds), compare (timestamp1 > timestamp2 means later), and sort numerically. Use ISO 8601 for display and interchange with humans; use Unix timestamps for storage, comparison, and computation.
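The arithmetic advantage in practice, as a short sketch:

```javascript
// Epoch timestamps support plain numeric arithmetic and sorting;
// ISO 8601 strings are best reserved for display.
const startTime = 1706659200; // seconds
const endTime = 1706745600;

const durationSeconds = endTime - startTime;
console.log(durationSeconds / 3600); // 24 (hours)

// A numeric sort puts timestamps in chronological order
const events = [1706745600, 1706659200, 1706700000];
events.sort((a, b) => a - b);

// For display, convert to ISO 8601 at the last moment
console.log(new Date(endTime * 1000).toISOString()); // 2024-02-01T00:00:00.000Z
```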

Frequently Asked Questions

What is the Unix Epoch and why January 1, 1970?
The Unix Epoch is January 1, 1970, at 00:00:00 UTC — the zero point from which all Unix timestamps count. The date was settled early in Unix's development at Bell Labs; the very first Unix versions briefly used other reference points before 1970-01-01 was fixed. The engineers needed a reference recent enough that timestamps would fit in 32-bit integers for decades, so it was a practical engineering choice rather than a historically meaningful date. Other systems chose different epochs: NTP counts from January 1, 1900, and Apple's classic Mac OS used January 1, 1904.
How do I tell if a timestamp is in seconds or milliseconds?
Count the digits: a 10-digit number (e.g., 1706745600) is seconds, and a 13-digit number (e.g., 1706745600000) is milliseconds; timestamps in seconds will keep 10 digits until the year 2286. As a sanity check, interpret the value as seconds and see what date you get: a milliseconds value read as seconds lands absurdly far in the future (around the year 56,000), while a seconds value read as milliseconds lands within weeks of January 1970.
Are Unix timestamps affected by timezones?
No. A Unix timestamp is always an absolute number of seconds since the UTC epoch — it is completely timezone-independent. 1706745600 always refers to the same instant in time globally. The timezone only affects how that moment is displayed as a calendar date/time (e.g., "2024-02-01 00:00 UTC" and "2024-01-31 18:00 CST" are the same timestamp). This is why databases store timestamps as Unix epochs rather than datetime strings.
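One way to see this, using the same Intl.DateTimeFormat mechanism the tool relies on (the `fmt` helper is illustrative):

```javascript
// One timestamp, two timezone renderings: the instant itself never changes.
const timestamp = 1706745600; // 2024-02-01T00:00:00Z
const date = new Date(timestamp * 1000);

const fmt = (timeZone) =>
  new Intl.DateTimeFormat("en-US", {
    timeZone,
    dateStyle: "medium",
    timeStyle: "long",
  }).format(date);

console.log(fmt("UTC"));             // e.g. "Feb 1, 2024, 12:00:00 AM UTC"
console.log(fmt("America/Chicago")); // e.g. "Jan 31, 2024, 6:00:00 PM CST"
```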
What is the Year 2038 problem?
The Year 2038 problem (Y2K38) affects systems that store Unix timestamps as a 32-bit signed integer. A 32-bit signed integer can hold values up to 2,147,483,647 — which equals January 19, 2038 at 03:14:07 UTC. After that, the counter overflows to negative, causing dates to wrap to December 13, 1901. Modern systems use 64-bit integers, which extend the range to the year 292,277,026,596 — effectively infinity for practical purposes.
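The overflow can be simulated in JavaScript with an Int32Array, which stores values as 32-bit signed integers just like a 32-bit time_t:

```javascript
// Simulate a 32-bit signed time_t to demonstrate the Y2038 wraparound.
const time32 = new Int32Array(1);

time32[0] = 2147483647; // INT32_MAX: the last representable second
console.log(new Date(time32[0] * 1000).toISOString()); // 2038-01-19T03:14:07.000Z

time32[0] = time32[0] + 1; // overflow wraps to -2147483648
console.log(new Date(time32[0] * 1000).toISOString()); // 1901-12-13T20:45:52.000Z
```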
How do I get the current Unix timestamp in code?
By language: JavaScript: Math.floor(Date.now() / 1000) (seconds) or Date.now() (milliseconds). Python: import time; int(time.time()). Go: time.Now().Unix(). Ruby: Time.now.to_i. Java: Instant.now().getEpochSecond(). Bash/Linux: date +%s. PostgreSQL: EXTRACT(EPOCH FROM NOW()). MySQL: UNIX_TIMESTAMP(). All return the current UTC seconds since epoch.
What is the maximum valid Unix timestamp?
On 64-bit systems: the maximum is 9,223,372,036,854,775,807 seconds (roughly 292 billion years) — for practical purposes, unlimited. On 32-bit systems: the maximum is 2,147,483,647 (January 19, 2038). For milliseconds: JavaScript's Date object handles timestamps in the range ±8,640,000,000,000,000 ms, from April 20, 271821 BCE to September 13, 275760 CE. In practice, all modern production systems use 64-bit timestamps.

How This Tool Works

The live counter calls Date.now() from a setInterval firing every 1000 ms. For timestamp-to-date conversion, new Date(epochSeconds * 1000) creates a Date object from the seconds input. For date-to-timestamp, the Date constructor parses the input string and .getTime() / 1000 yields the epoch in seconds. Timezone display uses Intl.DateTimeFormat with the selected IANA timezone identifier for localized time rendering.
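A minimal sketch of the two conversion directions described above (names like `timestampToDate` are illustrative, not the tool's actual source):

```javascript
// Timestamp → Date
function timestampToDate(epochSeconds) {
  return new Date(epochSeconds * 1000);
}

// Date → Timestamp: Date.parse handles ISO 8601 input strings
function dateToTimestamp(isoString) {
  return Math.floor(Date.parse(isoString) / 1000);
}

console.log(timestampToDate(1706745600).toISOString()); // 2024-02-01T00:00:00.000Z
console.log(dateToTimestamp("2024-02-01T00:00:00Z"));   // 1706745600

// Live counter: re-render once per second, as the top panel does
// setInterval(() => render(Date.now()), 1000);
```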

Technical Stack

JavaScript Date API · Intl.DateTimeFormat · IANA timezone database · Client-side only