Convert between Unix timestamps and human-readable dates. Current time updates in real time.
A Unix timestamp (also called Epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970 00:00:00 UTC. It is a universal way to represent time in computing, independent of time zones. Most programming languages and databases support Unix timestamps natively.
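As a quick sketch of what "seconds since the epoch" means in practice, the standard-library `datetime` module in Python can convert in both directions (the specific timestamp below is one of the milestone values from the table that follows):

```python
from datetime import datetime, timezone

# Current time as a Unix timestamp (seconds since Jan 1, 1970 UTC).
now_ts = int(datetime.now(timezone.utc).timestamp())
print(now_ts)

# Convert a known timestamp back to a human-readable UTC date.
billion = datetime.fromtimestamp(1_000_000_000, tz=timezone.utc)
print(billion.isoformat())  # 2001-09-09T01:46:40+00:00
```

Because the timestamp itself is time-zone independent, passing `tz=timezone.utc` only controls how the instant is displayed, not which instant it is.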
| Timestamp | Date | Significance |
|---|---|---|
| 0 | Jan 1, 1970 | Unix Epoch (the beginning) |
| 1000000000 | Sep 9, 2001 | First 10-digit timestamp |
| 1234567890 | Feb 13, 2009 | Digits 1–9 and 0 in sequence |
| 2147483647 | Jan 19, 2038 | Y2K38 — 32-bit overflow |
Systems that store timestamps as a 32-bit signed integer overflow on January 19, 2038 at 03:14:07 UTC. One second later, the value wraps around to the most negative 32-bit integer, which decodes to a date in December 1901 and can cause system failures similar to the Y2K bug. Systems that use 64-bit timestamps don't have this problem.
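The wraparound can be simulated directly; this sketch uses `ctypes.c_int32` to mimic how a 32-bit signed integer would store the value:

```python
import ctypes
from datetime import datetime, timezone

# The last second representable in a signed 32-bit timestamp.
max_ts = 2**31 - 1  # 2147483647
print(datetime.fromtimestamp(max_ts, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00

# One second later, the stored value wraps to the most negative int32.
wrapped = ctypes.c_int32(max_ts + 1).value
print(wrapped)  # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```

A 64-bit signed integer of seconds, by contrast, covers roughly ±292 billion years, which is why 64-bit systems are unaffected.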
Unix timestamps traditionally use seconds. JavaScript Date objects use milliseconds. Most APIs specify which format they use — check the documentation. If the number has 13 digits, it's probably milliseconds; a 10-digit number is probably seconds.
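The digit-count rule of thumb can be wrapped in a small helper. This is a sketch of the heuristic only, not a standard API; the function name and the threshold are assumptions (13-digit values, i.e. ≥ 10^12, are treated as milliseconds):

```python
from datetime import datetime, timezone

def to_datetime(ts: int) -> datetime:
    """Convert a Unix timestamp to a UTC datetime, guessing the unit.

    Heuristic: values with 13+ digits are assumed to be milliseconds;
    shorter values are assumed to be seconds.
    """
    if ts >= 1_000_000_000_000:  # 13 digits or more: likely milliseconds
        ts = ts / 1000
    return datetime.fromtimestamp(ts, tz=timezone.utc)

print(to_datetime(1234567890))      # 2009-02-13 23:31:30+00:00
print(to_datetime(1234567890000))   # same instant, given in milliseconds
```

The heuristic holds for contemporary dates but is not foolproof: a 13-digit value could in principle be a far-future timestamp in seconds, so when an API documents its unit, trust the documentation instead.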