On what operating systems is it possible to set the system time before 1970?
That's a question I had after reading this example from the Rust standard library documentation:
```rust
use std::time::{SystemTime, UNIX_EPOCH};

match SystemTime::now().duration_since(UNIX_EPOCH) {
    Ok(n) => println!("1970-01-01 00:00:00 UTC was {} seconds ago!", n.as_secs()),
    Err(_) => panic!("SystemTime before UNIX EPOCH!"),
}
```
Here's an explanation of what's going on:
- `SystemTime` is a struct that stores timestamp information.
- `SystemTime::now()` returns the current reading of your computer's clock as a `SystemTime`.
- If \(x\) and \(y\) are `SystemTime`s, then \(y\)`.duration_since(`\(x\)`)` returns a `Result<Duration, SystemTimeError>`:
  - If \(y\) occurs at the same time as or after \(x\), then `duration_since` will return `Ok(`\(n\)`)`, where \(n\) is a `Duration` representing the amount of time between \(x\) and \(y\).
  - If \(y\) occurs before \(x\), then `duration_since` will return `Err(`\(z\)`)`, where \(z\) is a `SystemTimeError`.
- `UNIX_EPOCH` is a constant of type `SystemTime` that represents the Unix epoch, that is, 1970-01-01T00:00:00 UTC, the exact start of the year 1970 in Coordinated Universal Time (which is roughly the same as Greenwich Mean Time).
- So, `SystemTime::now().duration_since(UNIX_EPOCH)` returns either:
  - `Ok(`\(n\)`)`, where \(n\) is the duration between now and the Unix epoch, if the system clock is set to the Unix epoch or afterwards, or
  - `Err(`\(z\)`)`, where \(z\) is a `SystemTimeError`, if (for some bizarre reason) the system clock is set to before the Unix epoch.
- If \(n\) is a `Duration`, then \(n\)`.as_secs()` will return the number of seconds in \(n\), rounded down to the nearest integer.
So the overall behavior of this code is:
- If the system clock is set to the Unix epoch or afterwards, output to the standard output something like "1970-01-01 00:00:00 UTC was 1745265600 seconds ago!"
- If the system clock is set before the Unix epoch (for some bizarre reason), `panic` (i.e., crash) with the error message "SystemTime before UNIX EPOCH!"
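Both branches of `duration_since` can be exercised without changing the system clock, since the method works on any pair of `SystemTime` values. Here is a minimal sketch that manufactures an "earlier" reading by subtracting a `Duration`:

```rust
use std::time::{Duration, SystemTime};

fn main() {
    let now = SystemTime::now();
    // Construct an artificial reading one hour in the past.
    let one_hour_ago = now - Duration::from_secs(3600);

    // Later reading .duration_since(earlier reading): Ok(Duration).
    match now.duration_since(one_hour_ago) {
        Ok(n) => println!("now is {} seconds after one hour ago", n.as_secs()),
        Err(_) => unreachable!(),
    }

    // Earlier reading .duration_since(later reading): Err(SystemTimeError).
    // The error still carries the magnitude of the difference.
    match one_hour_ago.duration_since(now) {
        Ok(_) => unreachable!(),
        Err(e) => println!("off by {} seconds", e.duration().as_secs()),
    }
}
```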
So, my question was, after reading this: On which operating systems is it possible to set the system clock to before the Unix epoch?
The reason I asked is that on Unix and Unix-like systems (such as macOS, iOS,
Android, and Linux), the system time is actually represented internally as the
time since the Unix epoch; this is usually referred to as "Unix time".
For example, this post is timestamped 2025-04-21T15:00:00-05:00 (April 21,
2025 at 3:00 p.m., in a timezone five hours behind UTC, which in my case is
North America's Central Daylight Time), but that can also be represented in Unix
time as 1745265600 seconds (or, in hexadecimal, `0x6806a3c0` seconds, which my
blog is currently configured to display at the top of this post).
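That the decimal and hexadecimal forms name the same number is easy to check:

```rust
fn main() {
    // The timestamp of this post, once as decimal and once as hexadecimal.
    let t: u64 = 1_745_265_600;
    assert_eq!(t, 0x6806_a3c0);
    println!("{} = {:#x}", t, t); // prints "1745265600 = 0x6806a3c0"
}
```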
Now, Unix time is usually represented (in seconds[^1]) as a signed integer[^2], making it possible to represent dates before 1970 (e.g., the birthday of anyone currently aged 56 or older). However, a number of systems make the (reasonable) assumption that the current date is in 1970 or later. See, for example, this question from someone who is trying (and failing) to set their computer to a date before 1970.
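In Rust, a pre-1970 `SystemTime` is directly observable without reconfiguring the clock at all: subtracting a `Duration` from `UNIX_EPOCH` yields such a value, and `duration_since(UNIX_EPOCH)` then returns `Err`, exactly the branch that panics in the documentation example. A minimal sketch:

```rust
use std::time::{Duration, UNIX_EPOCH};

fn main() {
    // Roughly one (non-leap) year before the Unix epoch, i.e., early 1969.
    let in_1969 = UNIX_EPOCH - Duration::from_secs(365 * 24 * 60 * 60);

    match in_1969.duration_since(UNIX_EPOCH) {
        Ok(_) => unreachable!(),
        // The SystemTimeError reports how far before the epoch we are.
        Err(e) => println!("{} seconds before the Unix epoch", e.duration().as_secs()),
    }
}
```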
A hint to the answer to my question is in a table on Wikipedia, which lists the epochs used by various operating systems and computer languages. For example, modern Windows systems use the beginning of the year 1601. I knew Windows used a different epoch, but I didn't realize it was before 1970 until I read this table.
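For a sense of the gap between the two epochs: Windows `FILETIME` values count 100-nanosecond ticks since 1601-01-01T00:00:00 UTC, and the offset from there to the Unix epoch works out to 11,644,473,600 seconds. A sketch of the arithmetic (the helper names here are my own, not any Windows API):

```rust
// Seconds between the Windows epoch (1601-01-01) and the Unix epoch
// (1970-01-01). The 369 intervening years contain 89 leap years:
// 92 years divisible by 4, minus the non-leap centuries 1700, 1800, 1900.
fn windows_to_unix_epoch_offset_secs() -> i64 {
    let days: i64 = 369 * 365 + 89; // 134_774 days
    days * 24 * 60 * 60 // 11_644_473_600 seconds
}

// Convert a FILETIME (100 ns ticks since 1601) to Unix time in seconds.
// Negative results are dates between 1601 and 1970.
fn filetime_to_unix_secs(filetime: u64) -> i64 {
    (filetime / 10_000_000) as i64 - windows_to_unix_epoch_offset_secs()
}

fn main() {
    println!("epoch offset: {} seconds", windows_to_unix_epoch_offset_secs());
    // The Unix epoch itself, expressed as a FILETIME:
    println!("{}", filetime_to_unix_secs(116_444_736_000_000_000)); // prints 0
}
```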
Still, it would be interesting to see which of these operating systems actually allow the system time to be set that far in the past.
[^1]: Leap seconds are ignored; if leap seconds were taken into account, it would be impossible to precisely specify future dates in Unix time more than about six months in advance.

[^2]: Early systems stored the Unix time as a signed 32-bit integer, which can represent the values between \(-2^{31}\) and \(2^{31} - 1\) (i.e., roughly ±2.1 billion), a timespan of roughly ±68 years from the Unix epoch. Unfortunately, this leads to the Year 2038 problem (the binary cousin of the Year 2000 problem), in which the latest datetimes you can represent are in January of 2038. Quoth Wikipedia:

    > Modern systems and software updates to legacy systems address this problem by using signed 64-bit integers instead of 32-bit integers, which will take 292 billion years to overflow—approximately 21 times the estimated age of the universe.