TL;DR: A Unix timestamp is the number of seconds since January 1, 1970. The #1 gotcha is mixing up seconds (10 digits) and milliseconds (13 digits). JavaScript uses milliseconds, most everything else uses seconds. Always check which one you're dealing with.
What Even Is a Unix Timestamp?
You know how we humans say "March 8, 2026 at 3:00 PM Eastern"? That date format is great for us, but computers find it annoying. Which timezone? What about daylight saving? Is "March" the same in every language?
So back in the early days of Unix (the 1970s), someone had a beautifully simple idea: just count the seconds since a fixed starting point. That starting point is January 1, 1970, at midnight UTC — called the "Unix epoch." Every moment in time is just a number.
0 = January 1, 1970 00:00:00 UTC (the beginning of time, apparently)
1000000000 = September 9, 2001 (the "billennium" party)
1773964800 = March 20, 2026 00:00:00 UTC
2147483647 = January 19, 2038 (uh oh, more on this later)
Why is this useful? Because timestamps are timezone-independent, easy to compare (bigger number = later time), and doing time math is just addition and subtraction. Want to know what time it'll be in 24 hours? Just add 86,400.
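Here's that arithmetic as a quick Python sketch — timestamps are plain integers, so no date-aware library is needed for this kind of math:

```python
import time

SECONDS_PER_DAY = 86_400

now = int(time.time())              # current Unix timestamp, in seconds
tomorrow = now + SECONDS_PER_DAY    # exactly 24 hours from now
yesterday = now - SECONDS_PER_DAY   # exactly 24 hours ago

# Comparing moments is just comparing integers: bigger = later.
assert yesterday < now < tomorrow
print(tomorrow - yesterday)  # 172800 — two days, in seconds
```

This only works because every timestamp counts from the same fixed epoch; no timezone ever enters the calculation.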
The #1 Bug: Seconds vs. Milliseconds
This is the mistake that has wasted more developer hours than I can count. Different systems use different precisions:
- Seconds (10 digits) — Unix/POSIX standard. APIs, databases, most backend languages. Example: 1773964800
- Milliseconds (13 digits) — JavaScript's Date.now(), Java. Example: 1773964800000
- Microseconds (16 digits) — Some databases. Example: 1773964800000000
- Nanoseconds (19 digits) — Go's time.Now().UnixNano(). Example: 1773964800000000000
Quick trick: count the digits. Right now, seconds timestamps are 10 digits, milliseconds are 13.
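That digit-count trick can be wrapped in a tiny normalizer. This is a sketch of the heuristic — the function name and thresholds are mine, and they only make sense for present-era timestamps:

```python
def to_seconds(ts: int) -> int:
    """Normalize a Unix timestamp of unknown precision to seconds,
    guessing the precision from the digit count. A heuristic that
    works for current-era dates, not a universal rule."""
    digits = len(str(abs(int(ts))))
    if digits <= 10:
        return int(ts)                   # already seconds
    if digits <= 13:
        return int(ts) // 1_000          # milliseconds
    if digits <= 16:
        return int(ts) // 1_000_000      # microseconds
    return int(ts) // 1_000_000_000      # nanoseconds

print(to_seconds(1_773_964_800_000))  # 1773964800 — milliseconds detected
```

Prefer an explicit, documented precision over guessing; a helper like this belongs at the boundary where you ingest data you don't control.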
Dev Joke: A JavaScript developer and a Python developer walk into a bar. The JavaScript dev says "the time is 1773964800000." The Python dev says "that's around the year 58200." They were both right — one was thinking milliseconds, the other seconds.
// THE CLASSIC BUG
new Date(1773964800)
// January 21, 1970 — WRONG! JavaScript wants milliseconds!
new Date(1773964800 * 1000)
// March 20, 2026 — Correct!
# The reverse in Python (expects seconds)
import datetime
datetime.datetime.fromtimestamp(1773964800000)
# Raises ValueError/OverflowError — year ~58200 is beyond datetime's range. Python wants seconds!
datetime.datetime.fromtimestamp(1773964800)
# 2026-03-20 (in your local timezone) — Correct!
Rule of thumb: If you get a date in 1970 when you expected this year, you forgot to multiply by 1000. If you get a date in the year 50000+, you forgot to divide by 1000.
Converting in JavaScript
// Get the current timestamp
const nowMs = Date.now(); // milliseconds
const nowSec = Math.floor(nowMs / 1000); // seconds
// Timestamp to human-readable date
const date = new Date(1773964800 * 1000);
date.toISOString(); // "2026-03-20T00:00:00.000Z"
date.toLocaleString(); // Localized to user's timezone
// Human-readable date to timestamp
Math.floor(new Date("2026-03-20").getTime() / 1000);
// 1773964800 (date-only strings like this are parsed as UTC midnight)
Converting in Python
import datetime, time
# Current timestamp
now = int(time.time()) # seconds
# Timestamp to date (UTC)
dt = datetime.datetime.fromtimestamp(1773964800, tz=datetime.timezone.utc)
print(dt) # 2026-03-20 00:00:00+00:00
# Date to timestamp
dt = datetime.datetime(2026, 3, 20, tzinfo=datetime.timezone.utc)
ts = int(dt.timestamp()) # 1773964800
Converting in SQL
-- PostgreSQL
SELECT TO_TIMESTAMP(1773964800); -- 2026-03-20 00:00:00+00
-- MySQL (result is rendered in the session time zone; shown here as UTC)
SELECT FROM_UNIXTIME(1773964800); -- 2026-03-20 00:00:00
-- SQLite
SELECT datetime(1773964800, 'unixepoch'); -- 2026-03-20 00:00:00
Converting from the Command Line
# macOS / BSD
date -r 1773964800
# Linux (GNU coreutils)
date -d @1773964800
# Get current timestamp
date +%s
Timezones: The Plot Twist
Here's the thing people miss: Unix timestamps are always UTC. They represent one absolute moment in time. The timezone only matters when you display it as a human-readable date:
const ts = 1773964800;
// Same timestamp, different timezones:
new Date(ts * 1000).toLocaleString("en-US", { timeZone: "UTC" });
// "3/20/2026, 12:00:00 AM"
new Date(ts * 1000).toLocaleString("en-US", { timeZone: "America/New_York" });
// "3/19/2026, 8:00:00 PM" (it's still yesterday in New York!)
new Date(ts * 1000).toLocaleString("en-US", { timeZone: "Asia/Kolkata" });
// "3/20/2026, 5:30:00 AM"
Store one number, display it anywhere. That's the beauty of timestamps.
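The same demonstration in Python, using the standard-library zoneinfo module (available since Python 3.9): one stored number, rendered per zone only at display time.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

ts = 1_773_964_800  # one absolute moment: 2026-03-20 00:00:00 UTC

moment = datetime.fromtimestamp(ts, tz=timezone.utc)

# The timestamp never changes; only the rendering does.
print(moment.astimezone(ZoneInfo("America/New_York")))  # 2026-03-19 20:00:00-04:00
print(moment.astimezone(ZoneInfo("Asia/Kolkata")))      # 2026-03-20 05:30:00+05:30
```

Note that zoneinfo handles daylight saving for you — the New York offset is -04:00 here because US DST is already in effect by March 19, 2026.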
The Year 2038 Problem (Yes, It's Real)
Remember that "uh oh" timestamp from earlier? Systems that store Unix timestamps as 32-bit signed integers will max out at 2,147,483,647 — which is January 19, 2038 at 03:14:07 UTC. One second later, the counter wraps around to negative numbers, and suddenly it's December 1901.
It's basically Y2K's little brother. Most modern systems already use 64-bit integers (good for another 292 billion years), but embedded systems, old databases, and some file formats are still vulnerable.
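You can reproduce the wraparound by forcing the counter into a signed 32-bit integer — a sketch using ctypes, since the real failures live in C code, embedded firmware, and file formats rather than in Python:

```python
import ctypes
from datetime import datetime, timezone

MAX_INT32 = 2_147_483_647  # Jan 19, 2038 03:14:07 UTC

# One second later, a signed 32-bit counter wraps to its minimum value.
wrapped = ctypes.c_int32(MAX_INT32 + 1).value
print(wrapped)  # -2147483648

# Interpreted as a timestamp, the wrapped value lands deep in 1901.
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```

Python's own integers are arbitrary-precision, so Python code is safe; the demo only shows what happens inside any system that still stores time in 32 bits.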
Notable timestamps to remember: 0 = Jan 1, 1970 (epoch). 946684800 = Jan 1, 2000 (Y2K). 1234567890 = Feb 13, 2009 (people threw parties for this one). 2147483647 = Jan 19, 2038 (32-bit overflow).
Practical Tips for Working with Timestamps
- Always store in UTC — Convert to local time only at the display layer.
- Use ISO 8601 for human-readable exchange — 2026-03-20T00:00:00Z is unambiguous and sortable.
- Document your precision — Always note whether your API uses seconds or milliseconds. Future-you will thank present-you.
- Be careful with "one month from now" — Is that 28, 29, 30, or 31 days? Use a date library for calendar math.
- Test edge cases — Daylight saving transitions, leap years, and midnight boundaries love to cause bugs.
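Putting the first few tips together, a minimal Python round-trip: store a UTC integer, exchange an ISO 8601 string, and recover the identical instant.

```python
from datetime import datetime, timezone

ts = 1_773_964_800  # stored: one UTC integer, precision documented as seconds

# Exchanged: an unambiguous, sortable ISO 8601 string.
iso = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
print(iso)  # 2026-03-20T00:00:00+00:00

# Recovered: parsing the string yields the same timestamp.
roundtrip = int(datetime.fromisoformat(iso).timestamp())
assert roundtrip == ts
```

Because the integer and the string describe the same absolute moment, either form can be regenerated from the other without loss — which is exactly what makes UTC-plus-ISO-8601 a safe exchange convention.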
Try It Yourself
Convert any Unix timestamp to a human-readable date, or get the timestamp for any date. Supports seconds, milliseconds, and multiple timezone displays.
Open Timestamp Converter →