Convert Unix timestamps to dates instantly
Convert timestamps to dates and back instantly.
Unix to date or date to Unix timestamp.
See your date in various formats.
See current Unix timestamp updating live.
All conversions happen in your browser.
No limits, no signup required.
A Unix timestamp (also called POSIX time or Epoch time) is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC. It's a universal way to represent time in computing, independent of time zones.
Epoch time is another name for Unix timestamp. The "epoch" refers to the starting point: January 1, 1970, 00:00:00 UTC - the moment from which all Unix time is calculated.
Epoch and Unix timestamp refer to the same thing: time measured as seconds since January 1, 1970 UTC. The terms are interchangeable. "Epoch" emphasizes the starting point; "Unix timestamp" emphasizes the format.
A Unix timestamp is a plain number like 1704067200 (for Jan 1, 2024). 10 digits = seconds since 1970. 13 digits = milliseconds. It's just a count of time units, with no formatting or separators.
A Unix timestamp means "this many seconds have passed since midnight UTC on January 1, 1970." For example, timestamp 86400 means exactly 1 day (24×60×60 seconds) after the epoch.
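The 86400-seconds-equals-one-day example can be checked with Python's standard library; a minimal sketch:

```python
from datetime import datetime, timezone

# Timestamp 86400 = exactly one day (24 * 60 * 60 seconds) after the epoch
dt = datetime.fromtimestamp(86400, tz=timezone.utc)
print(dt.isoformat())  # 1970-01-02T00:00:00+00:00
```

Passing `tz=timezone.utc` keeps the result in UTC instead of the machine's local zone.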
Unix timestamp format is a single integer representing seconds since epoch. No separators, no timezone info - just a number. Example: 1609459200 represents Jan 1, 2021 00:00:00 UTC.
Current timestamp: check the live counter above! Some examples: 0 = Jan 1, 1970. 1000000000 = Sep 9, 2001. 1609459200 = Jan 1, 2021. 2000000000 = May 18, 2033.
Unix timestamps count seconds continuously from January 1, 1970 UTC. Every second that passes adds 1 to the count. This creates a universal, timezone-independent way to represent any moment in time as a single number.
Unix timestamps in seconds are currently 10 digits (since Sept 2001). In milliseconds, they're 13 digits. From March 1973 to September 2001, timestamps were 9 digits. They'll become 11 digits in 2286.
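The digit-count transitions are easy to verify in Python by converting the round numbers 10^9 and 10^10 (a sketch using the standard library):

```python
from datetime import datetime, timezone

# The instant second-precision timestamps grew from 9 to 10 digits
ten_digits = datetime.fromtimestamp(1_000_000_000, tz=timezone.utc)
# ...and the instant they will grow from 10 to 11 digits
eleven_digits = datetime.fromtimestamp(10_000_000_000, tz=timezone.utc)

print(ten_digits)     # 2001-09-09 01:46:40+00:00
print(eleven_digits)  # 2286-11-20 17:46:40+00:00
```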
A Unix timestamp is 10 digits for seconds (current era) or 13 digits for milliseconds. In bytes, a 32-bit timestamp uses 4 bytes; a 64-bit timestamp uses 8 bytes.
Yes! Unix timestamps are always in UTC (Coordinated Universal Time). They represent absolute moments in time, independent of any local time zone. When you convert to a local date, your browser applies the local time zone offset.
Yes, always. Unix timestamps have no timezone concept - they're just a count of seconds from a UTC reference point. This makes them perfect for storing times that need to work globally.
Yes! Timestamps represent absolute time, not local time. The same timestamp means the same instant worldwide. Timezone is only relevant when displaying the timestamp as a human-readable date.
No, timestamps are timezone-free. A timestamp is the same number everywhere in the world. Timezones only come into play when you convert that number to a local date/time for display.
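This can be seen directly in code: the same number renders as different wall-clock times in different zones, yet still denotes one instant. A small Python sketch (the UTC+9 offset stands in for any local zone):

```python
from datetime import datetime, timezone, timedelta

ts = 1609459200  # one absolute instant

# The same timestamp, displayed in two different zones
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
tokyo = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=9)))  # UTC+9

print(utc)    # 2021-01-01 00:00:00+00:00
print(tokyo)  # 2021-01-01 09:00:00+09:00
print(utc == tokyo)  # True -- both represent the identical instant
```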
The traditional Unix timestamp is in seconds. However, JavaScript and some systems use milliseconds (1000x larger). Our converter handles both - if your number is 13 digits, it's likely milliseconds; 10 digits means seconds.
Traditional Unix timestamps are in seconds. Some systems (like JavaScript's Date.now()) use milliseconds for more precision. Our converter auto-detects: 10 digits = seconds, 13 digits = milliseconds.
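The digit-count heuristic described above can be sketched in a few lines of Python (the function names here are illustrative, not part of any library):

```python
def detect_unit(ts: int) -> str:
    """Heuristic from the text: ~13 digits -> milliseconds, ~10 -> seconds."""
    return "milliseconds" if len(str(abs(ts))) >= 13 else "seconds"

def to_seconds(ts: int) -> float:
    """Normalize a timestamp to seconds regardless of input unit."""
    return ts / 1000 if detect_unit(ts) == "milliseconds" else float(ts)

print(detect_unit(1609459200))     # seconds
print(detect_unit(1609459200000))  # milliseconds
```

Like any heuristic, this breaks down for timestamps outside the current era (e.g. dates near 1970 have far fewer digits).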
Enter your Unix timestamp in the "Timestamp → Date" section and click Convert. The tool instantly shows the human-readable date in multiple formats including ISO 8601, UTC, and your local time.
Use the date and time pickers in the "Date → Timestamp" section, then click Convert. You'll get the Unix timestamp in seconds, which you can copy with one click.
Enter your timestamp above and click Convert. We show the datetime in multiple formats: ISO 8601 (2024-01-15T12:30:00Z), RFC 2822, your local time, and more.
Epoch and Unix timestamp are the same thing. Enter the epoch number in our converter, and you'll see the corresponding date in various human-readable formats instantly.
Enter your epoch timestamp above to convert it to a human-readable date. The result shows both UTC time and your local time, accounting for your timezone automatically.
To calculate: count seconds from Jan 1, 1970 UTC to your target date. Or simply use our Date → Timestamp converter - enter any date and we calculate the timestamp for you.
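Programmatically, the counting is done for you by the standard library; a Python sketch using the Jan 1, 2024 example from earlier in this page:

```python
from datetime import datetime, timezone

# Build a timezone-aware datetime and let .timestamp() count the seconds
target = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(int(target.timestamp()))  # 1704067200
```

Omitting `tzinfo` would make the datetime naive, and `.timestamp()` would then interpret it in the machine's local zone instead of UTC.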
A Unix timestamp is the number of seconds since Jan 1, 1970. Larger numbers = later dates. Use our converter to translate any timestamp to a readable date format instantly.
The maximum 32-bit Unix timestamp is 2,147,483,647 (January 19, 2038, 03:14:07 UTC). This is the "Year 2038 problem." 64-bit systems support timestamps far into the future.
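The 2038 rollover moment can be checked by converting the largest signed 32-bit value; a minimal Python sketch:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647
rollover = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(rollover)  # 2038-01-19 03:14:07+00:00
# One second later, a signed 32-bit seconds counter overflows
```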
In JavaScript: Math.floor(Date.now() / 1000). In Python: int(time.time()) (time.time() alone returns a float with fractional seconds). In PHP: time(). Our converter shows the current timestamp live at the top of the page.
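As a runnable version of the Python one-liner, including the millisecond form that matches JavaScript's Date.now():

```python
import time

now = int(time.time())             # integer seconds since the epoch
now_ms = time.time_ns() // 10**6   # milliseconds, JavaScript-style
print(now, now_ms)
```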
Traditional Unix timestamps use 32-bit signed integers, limiting them to dates before 2038. Modern systems use 64-bit timestamps for dates far into the future. Our converter handles both.
Historically, int (32-bit). Modern systems use long (64-bit) to avoid the Y2038 problem. In JavaScript, timestamps are stored as 64-bit floats. Always use 64-bit when possible.
A 32-bit timestamp uses 4 bytes. A 64-bit timestamp uses 8 bytes. JavaScript's millisecond timestamps use 8 bytes (64-bit float). Most modern systems use 64-bit for future-proofing.
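The 4-byte vs 8-byte claim can be demonstrated with Python's struct module, which packs integers into fixed-width binary forms (a sketch):

```python
import struct

ts = 1609459200
# Pack the same timestamp as a signed 32-bit and a signed 64-bit integer
print(len(struct.pack("<i", ts)))  # 4 (bytes)
print(len(struct.pack("<q", ts)))  # 8 (bytes)
```

Packing a post-2038 timestamp like 2**31 with `"<i"` would raise struct.error, which is the Y2038 problem in miniature.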
Yes! Our Unix timestamp converter is 100% free with no limits. Convert as many timestamps as you need - no signup required, and all conversions happen in your browser.