Unix Time

Most Unix kernels measure time in seconds since TheEpoch, usually the first moment of 1970 in GMT (UTC).

As a DateStamp, that is 19700101.000000000000000000000000000000000.
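
To make the definition concrete, here is a minimal sketch in C, assuming a POSIX system: time() hands back the raw count of seconds since TheEpoch, and gmtime() converts that count back into a broken-down UTC date.

 #include <stdio.h>
 #include <time.h>
 
 int main(void)
 {
     time_t now = time(NULL);            /* seconds since 19700101.000000 UTC */
     struct tm *utc = gmtime(&now);      /* broken-down time in UTC */
     printf("seconds since the Epoch: %lld\n", (long long)now);
     printf("which is %04d-%02d-%02d %02d:%02d:%02d UTC\n",
            utc->tm_year + 1900, utc->tm_mon + 1, utc->tm_mday,
            utc->tm_hour, utc->tm_min, utc->tm_sec);
 
     time_t epoch = 0;                   /* the Epoch itself */
     utc = gmtime(&epoch);
     printf("time_t 0 is  %04d-%02d-%02d %02d:%02d:%02d UTC\n",
            utc->tm_year + 1900, utc->tm_mon + 1, utc->tm_mday,
            utc->tm_hour, utc->tm_min, utc->tm_sec);
     return 0;
 }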

Most Unix systems on the Internet synchronize their clocks to within a few milliseconds (typically via NetworkTimeProtocol), although they may count many finer subintervals -- a gigahertz machine should know in which nanosecond some interrupt happened.
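
For sub-second readings, POSIX systems offer clock_gettime(); a sketch follows (older glibc versions need -lrt at link time). Note that the width of the nanosecond field says nothing about the clock's actual resolution.

 #include <stdio.h>
 #include <time.h>
 
 int main(void)
 {
     struct timespec ts;
     if (clock_gettime(CLOCK_REALTIME, &ts) == 0) {
         /* tv_sec holds whole seconds since the Epoch, tv_nsec the
            nanosecond part (0..999999999).  The real resolution is
            whatever the hardware and kernel can deliver. */
         printf("%lld.%09ld seconds since the Epoch\n",
                (long long)ts.tv_sec, ts.tv_nsec);
     }
     return 0;
 }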

ThirtytwoBitTime, when treated as a SignedInteger counting seconds since 19700101.000000, rolls over and becomes negative at 20380119.031408, wrapping around to a date early in the TwentiethCentury: 19011213.204552.
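
A sketch of that rollover in C, assuming a platform with a 64-bit time_t so the wrapped value can still be handed to gmtime() (conversion of pre-1970 values is not guaranteed everywhere, but glibc handles it):

 #include <stdio.h>
 #include <stdint.h>
 #include <time.h>
 
 static void show(int32_t t32)
 {
     time_t t = (time_t)t32;             /* widen back out for gmtime() */
     struct tm *utc = gmtime(&t);
     printf("%11ld -> %04d-%02d-%02d %02d:%02d:%02d UTC\n",
            (long)t32, utc->tm_year + 1900, utc->tm_mon + 1, utc->tm_mday,
            utc->tm_hour, utc->tm_min, utc->tm_sec);
 }
 
 int main(void)
 {
     show(INT32_MAX);   /*  2147483647 -> 20380119.031407, the last good second */
     show(INT32_MIN);   /* -2147483648 -> 19011213.204552, where the counter
                           lands one tick later */
     return 0;
 }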

Note that LeapSeconds do not enter into this arithmetic at all -- POSIX seconds-since-the-Epoch is pure calendar arithmetic, so the leap seconds that have actually intervened are simply not counted, as far as Unix knows today.
