Convert between timestamp and date/time (seconds, milliseconds, microseconds)
A Unix timestamp (also known as Epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970 00:00:00 UTC, not counting leap seconds. This point in time is called the "Unix Epoch".
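As a quick illustration, here is one way to obtain the current Unix timestamp and convert it back to a human-readable UTC date in JavaScript (assuming a Node.js or browser environment):

```javascript
// Date.now() returns milliseconds elapsed since the Unix Epoch
const nowSeconds = Math.floor(Date.now() / 1000);

// Convert a timestamp in seconds back to a UTC date string
const date = new Date(nowSeconds * 1000);
console.log(nowSeconds, date.toISOString());

// Timestamp 0 is the Unix Epoch itself
console.log(new Date(0).toISOString()); // 1970-01-01T00:00:00.000Z
```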
Timestamps are timezone-independent and come in three common precisions:
| Unit | Precision | Digits | Common Usage |
|---|---|---|---|
| Seconds (s) | 1 second | 10 digits | PHP, MySQL UNIX_TIMESTAMP() |
| Milliseconds (ms) | 0.001 second | 13 digits | JavaScript, Java |
| Microseconds (μs) | 0.000001 second | 16 digits | High-precision timing, benchmarks |
Q: Why does the timestamp count from January 1, 1970?
A: January 1, 1970 00:00:00 UTC was chosen as a convenient reference point around the time the Unix operating system was being developed. This starting point is called the "Unix Epoch".
Q: What is the Year 2038 problem?
A: 32-bit systems use a signed 32-bit integer to store timestamps, with a maximum value of 2147483647, which corresponds to January 19, 2038 at 03:14:07 UTC. After that moment, timestamps stored this way overflow. Modern 64-bit systems have resolved this issue.
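The overflow boundary can be checked directly, since JavaScript's `Date` is not limited to 32 bits:

```javascript
// Maximum value of a signed 32-bit integer: 2^31 - 1
const max32 = 2 ** 31 - 1; // 2147483647

// The last second a 32-bit timestamp can represent
console.log(new Date(max32 * 1000).toISOString()); // 2038-01-19T03:14:07.000Z
```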
Q: Are timestamps affected by timezones?
A: No. Timestamps are timezone-independent because they represent a single moment in UTC. Timezone conversion happens only when displaying the time to users.
Q: Does JavaScript return the timestamp in seconds or milliseconds?
A: JavaScript's Date.now() and new Date().getTime() return milliseconds (13 digits). Divide by 1000 to get seconds.
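For example:

```javascript
// A 13-digit millisecond timestamp, as Date.now() would return
const ms = 1700000000123;

// Truncate (don't round) to get the 10-digit second value
const seconds = Math.floor(ms / 1000); // 1700000000
```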