
Unix Timestamps: What They Are and How to Convert Them

Learn what Unix timestamps are, why they use January 1, 1970 as the epoch, and how to convert between timestamps and human-readable dates.


What Is a Unix Timestamp?

A Unix timestamp (also called Unix time, POSIX time, or epoch time) is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC. This reference point is called the Unix epoch.

Right now, the Unix timestamp is a 10-digit number somewhere around 1,770,000,000. Every second, it increments by one. It is a simple, unambiguous way to represent a point in time as a single integer.
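As a quick check, here is a minimal Python sketch that reads the current timestamp from the standard library:

import time

# time.time() returns seconds since the Unix epoch as a float;
# truncating to int gives the familiar 10-digit timestamp.
print(int(time.time()))  # e.g. 1770000000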

Why January 1, 1970?

The Unix operating system was developed in the late 1960s at Bell Labs, and its creators needed a fixed reference point for their time system. They chose the start of the 1970s as a round, convenient date close to when the system was being built. The choice was pragmatic, not symbolic, and it stuck — becoming one of the most widely used time references in computing.

Why Timestamps Are Useful

No Timezone Ambiguity

A Unix timestamp represents an absolute moment in time. The integer 1770000000 means the same instant everywhere on Earth. There is no timezone offset, no daylight saving confusion, no locale-dependent formatting. This makes timestamps ideal for storing and transmitting time data between systems.
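As a small illustration in Python (assuming Python 3.9+ for the zoneinfo module; "Asia/Tokyo" is just an example zone), the same integer renders as different wall-clock times but identifies one instant:

from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1770000000
utc_time = datetime.fromtimestamp(ts, tz=timezone.utc)
tokyo_time = utc_time.astimezone(ZoneInfo("Asia/Tokyo"))

print(utc_time.isoformat())    # 2026-02-02T02:40:00+00:00
print(tokyo_time.isoformat())  # 2026-02-02T11:40:00+09:00
# Different local renderings, same underlying instant:
print(utc_time.timestamp() == tokyo_time.timestamp())  # True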

Easy Arithmetic

Calculating the difference between two points in time is simple subtraction. The duration between timestamps 1770000000 and 1770003600 is 3600 seconds, which is exactly one hour. Adding 86,400 to a timestamp moves forward exactly one day.
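Here is that arithmetic as a short Python sketch:

start = 1770000000
end = 1770003600

print(end - start)     # 3600 seconds, i.e. exactly one hour
print(start + 86_400)  # 1770086400, the same moment one day later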

Compact Storage

A 32-bit integer can store a Unix timestamp in just 4 bytes. Compare that to a string like "2026-02-07T12:00:00+00:00" which uses 25 bytes. For databases and log files handling millions of records, this difference adds up.
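You can see the size difference directly in Python; the struct module packs the timestamp into 4 raw bytes (an illustrative sketch, not how any particular database lays out its storage):

import struct

ts = 1770000000
packed = struct.pack("<i", ts)  # signed 32-bit integer, little-endian
iso = "2026-02-07T12:00:00+00:00"

print(len(packed))               # 4 bytes
print(len(iso.encode("ascii")))  # 25 bytes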

Seconds vs. Milliseconds

The traditional Unix timestamp counts seconds. However, many modern systems (JavaScript, Java, and most web APIs) use millisecond timestamps — the number of milliseconds since the epoch. A millisecond timestamp has 13 digits instead of 10.

To convert between them:

- Seconds to milliseconds: multiply by 1,000.
- Milliseconds to seconds: divide by 1,000 and round down.

In JavaScript, Date.now() returns milliseconds, while most server-side languages and databases use seconds.
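A minimal Python sketch of both conversions, using the standard time module:

import time

seconds = int(time.time())  # 10-digit second timestamp
millis = seconds * 1000     # 13-digit millisecond timestamp
back = millis // 1000       # integer division rounds down to seconds again
print(seconds, millis, back)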

Converting Timestamps to Dates

The Formula

To convert a Unix timestamp to a date, you add the timestamp (in seconds) to the epoch date and let date libraries handle the calendar math (leap years, month lengths, etc.).

In JavaScript

const timestamp = 1770000000;
const date = new Date(timestamp * 1000); // JS needs milliseconds
console.log(date.toISOString()); // "2026-02-02T02:40:00.000Z"

In Python

from datetime import datetime, timezone
timestamp = 1770000000
dt = datetime.fromtimestamp(timestamp, tz=timezone.utc)
print(dt.isoformat())  # 2026-02-02T02:40:00+00:00

In a Terminal

# macOS (BSD date)
date -r 1770000000

# Linux (GNU date)
date -d @1770000000

Converting Dates to Timestamps

The reverse conversion takes a date and calculates the seconds since the epoch.

In JavaScript

const date = new Date("2026-02-07T00:00:00Z");
const timestamp = Math.floor(date.getTime() / 1000); // 1770422400

In Python

from datetime import datetime, timezone
dt = datetime(2026, 2, 7, tzinfo=timezone.utc)
timestamp = int(dt.timestamp())  # 1770422400

The Year 2038 Problem

Traditional Unix timestamps are stored as signed 32-bit integers, which can represent values up to 2,147,483,647. That maximum corresponds to January 19, 2038, at 03:14:07 UTC. After that moment, a 32-bit signed integer overflows, potentially wrapping around to a negative number that represents a date in December 1901.
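A small Python demonstration of the wraparound, using the struct module to reinterpret the overflowed value as a signed 32-bit integer (a sketch of the failure mode, not of any specific system):

import struct
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
MAX_32BIT = 2**31 - 1  # 2,147,483,647

print(EPOCH + timedelta(seconds=MAX_32BIT))
# 2038-01-19 03:14:07+00:00 — the last moment a signed 32-bit timestamp can represent

# One second later, the same bit pattern read as a signed 32-bit integer goes negative:
wrapped = struct.unpack("<i", struct.pack("<I", MAX_32BIT + 1))[0]
print(wrapped)                             # -2147483648
print(EPOCH + timedelta(seconds=wrapped))  # 1901-12-13 20:45:52+00:00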

Most modern systems have migrated to 64-bit timestamps, which will not overflow for approximately 292 billion years. However, legacy embedded systems, IoT devices, and older databases may still be vulnerable. If you maintain such systems, auditing timestamp storage is important before 2038.

Leap Seconds

Unix time does not account for leap seconds. A Unix day is always exactly 86,400 seconds, even though UTC occasionally inserts an extra second to stay aligned with the Earth's rotation. When a leap second occurs, Unix time either repeats a second or the system "smears" the extra second by slightly adjusting the clock speed around the event. For most applications, this is invisible. For high-precision timekeeping (scientific instruments, GPS), it matters.
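Python's datetime module follows the same convention, so the UTC day that ended with the real leap second on December 31, 2016 still measures exactly 86,400 Unix seconds:

from datetime import datetime, timezone

before = datetime(2016, 12, 31, tzinfo=timezone.utc).timestamp()
after = datetime(2017, 1, 1, tzinfo=timezone.utc).timestamp()

print(after - before)  # 86400.0 — the inserted leap second is invisible in Unix time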

Try It Yourself

Our Unix Timestamp Converter lets you paste any Unix timestamp and instantly see the corresponding human-readable date and time, or enter a date and get the timestamp. It handles both seconds and milliseconds and supports multiple timezone displays.

Summary

Unix timestamps are a universal, unambiguous, and compact way to represent time. They power databases, log files, APIs, and cron jobs across virtually every computing platform. Understanding how they work — and the edge cases like the 2038 problem and leap seconds — helps you work confidently with time data in any system.