Convert Unix timestamps to human-readable dates and vice versa.
A Unix timestamp — also called epoch time or POSIX time — is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC, ignoring leap seconds. It is widely used in programming, databases, APIs, and log files as a compact, timezone-independent way to represent a point in time. This Unix timestamp converter and epoch time converter handles both second-precision (10-digit) and millisecond-precision (13-digit) timestamps automatically, and converts in both directions.
Enter a numeric timestamp in the Timestamp to Date section to convert it to a human-readable date in UTC, your local timezone, ISO 8601 format, and a relative time (e.g., "2 hours ago"). The tool auto-detects whether your input is in seconds or milliseconds. Use the Date to Timestamp section to pick a date and time and get the corresponding Unix timestamp in both seconds and milliseconds. The current Unix timestamp is displayed at the top and updates every second.
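Both directions can be sketched in a few lines of JavaScript. This is an illustrative sketch using the native Date API, not the tool's actual source; the digit-count heuristic mirrors the auto-detection described above.

```javascript
// Sketch of both conversion directions (illustrative, not the tool's source).
function timestampToDate(input) {
  const n = Number(input);
  // Heuristic: 13+ digits means milliseconds, otherwise seconds.
  const ms = String(Math.trunc(Math.abs(n))).length >= 13 ? n : n * 1000;
  const d = new Date(ms);
  return {
    utc: d.toUTCString(),     // e.g. "Tue, 14 Nov 2023 22:13:20 GMT"
    local: d.toString(),      // rendered in the runtime's local timezone
    iso: d.toISOString(),     // ISO 8601, always UTC
  };
}

function dateToTimestamp(isoString) {
  const ms = Date.parse(isoString); // NaN if the string is unparseable
  return { seconds: Math.floor(ms / 1000), milliseconds: ms };
}

timestampToDate(1700000000).iso;          // "2023-11-14T22:13:20.000Z"
dateToTimestamp("2023-11-14T22:13:20Z");  // { seconds: 1700000000, milliseconds: 1700000000000 }
```

Passing either `1700000000` or `1700000000000` to `timestampToDate` yields the same instant, which is exactly the auto-detection behavior described above.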
The offset calculator generates a Unix timestamp relative to now or a specified time — useful for testing time-based logic, building expiration timestamps for tokens or cache entries, or calculating "X hours ago" / "X days from now". Quick chips cover the common cases (1h ago, 6h ago, 1d ago, 7d ago, 30d ago) and a custom row supports minutes, hours, days, and months in either direction.
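The chips and the custom row boil down to adding a signed offset to a base time. A minimal sketch of that arithmetic (function name and parameter shape are illustrative, not the tool's API):

```javascript
// Sketch of an offset calculator: build a timestamp relative to
// "now" (or any base time, in ms), as the quick chips do.
function offsetTimestamp({ minutes = 0, hours = 0, days = 0 } = {}, baseMs = Date.now()) {
  const offsetMs = ((days * 24 + hours) * 60 + minutes) * 60 * 1000;
  const ms = baseMs + offsetMs;
  return { seconds: Math.floor(ms / 1000), milliseconds: ms };
}

offsetTimestamp({ hours: -1 });  // the "1h ago" chip
offsetTimestamp({ days: 7 });    // e.g. a token expiring 7 days from now
```

Month offsets are omitted here on purpose: months vary in length, so they need calendar-aware arithmetic (e.g. `Date.prototype.setMonth`) rather than a fixed millisecond multiplier.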
Developers use Unix timestamps and epoch time conversion when reading database created_at / updated_at columns, debugging server logs, decoding JWT tokens (iat/exp claims), inspecting API responses, scheduling cron jobs, and calculating durations. Common conversions include timestamp to date for human-readable log analysis, date to timestamp for building API queries, and millisecond-to-second conversions when bridging JavaScript Date.now() with backend services.
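As a concrete case from the list above, the `iat`/`exp` claims in a JWT payload are Unix timestamps in seconds, so reading them is a base64url decode plus a seconds-to-milliseconds conversion. A hedged sketch (assumes a standard three-part token; uses Node's `Buffer` — in a browser, `atob` plays the same role):

```javascript
// Sketch: reading iat/exp claims from a JWT payload.
// JWT timestamps are NumericDate values: seconds since the Unix epoch.
function decodeJwtClaims(token) {
  const payload = token.split(".")[1];          // header.PAYLOAD.signature
  const b64 = payload.replace(/-/g, "+").replace(/_/g, "/"); // base64url -> base64
  const json = Buffer.from(b64, "base64").toString("utf8");  // atob(b64) in browsers
  const { iat, exp } = JSON.parse(json);
  return {
    issued: new Date(iat * 1000).toISOString(),   // multiply by 1000: seconds -> ms
    expires: new Date(exp * 1000).toISOString(),
  };
}
```

Forgetting the `* 1000` here is the classic bug: `new Date(1700000000)` lands in January 1970 instead of November 2023.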
A Unix timestamp (also called epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC, ignoring leap seconds. It is a compact, timezone-independent way to represent a point in time and is widely used in databases, log files, APIs, and programming languages. Modern systems often use millisecond-precision timestamps (13 digits) instead of seconds (10 digits).
Paste your Unix timestamp into the Timestamp to Date field and click Convert. The tool returns the date in UTC, your local timezone, ISO 8601 format, and a relative time (e.g., "2 hours ago"). It auto-detects whether your input is in seconds (10 digits) or milliseconds (13 digits) — no need to specify the unit.
A second-precision Unix timestamp has 10 digits (e.g., 1700000000) — used in C/C++, PHP, MySQL, and most older systems. A millisecond-precision timestamp has 13 digits (e.g., 1700000000000) — used by JavaScript's Date.now(), Java, and many modern APIs. To convert milliseconds to seconds, divide by 1000. This converter auto-detects both formats.
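The digit-count rule and the divide-by-1000 conversion can be written down directly. A small sketch (helper names are illustrative):

```javascript
// Sketch of the digit-count heuristic for seconds vs. milliseconds.
function detectUnit(ts) {
  return String(Math.trunc(Math.abs(ts))).length >= 13 ? "milliseconds" : "seconds";
}

// Bridging JavaScript's Date.now() (milliseconds) with a seconds-based backend:
const msToSeconds = (ms) => Math.floor(ms / 1000);
const secondsToMs = (s) => s * 1000;

detectUnit(1700000000);      // "seconds"
detectUnit(Date.now());      // "milliseconds"
msToSeconds(1700000000000);  // 1700000000
```

Use `Math.floor` rather than plain division when a backend expects an integer number of seconds; otherwise a value like `1700000000000.5 / 1000` leaves a fractional part.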
Yes. All conversions happen entirely in your browser using JavaScript's native Date API. Nothing is sent to any server. The page works offline once loaded — safe for sensitive timestamps from logs, databases, or production data.