Epoch (reference date)

In chronology, an epoch is an instant chosen as the origin of a particular time scale. The epoch serves as a reference point from which time is measured. Days, hours and other time units are counted from the epoch, so that the date and time of events can be specified. Events that took place earlier can be dated by counting negatively from the epoch. Epochs are generally chosen to be convenient or significant by a consensus of the time scale's initial users.

Calendars

Each calendar era starts from an arbitrary epoch, which is often chosen to commemorate an important historical or mythological event.

For example, the epoch of the current civil calendar is the traditionally reckoned year of the birth of Jesus, defined as year number 1. Thus, the first instant of January 1, 2006 CE should be exactly 2005 years since the epoch, but quirks in the development of the modern Gregorian calendar make this technically incorrect.

The traditional Chinese calendar uses 2637 BCE, a date in the life of the legendary Yellow Emperor, as its epoch. Several other calendars are also currently in use, based on important historical events.

Astronomy

In astronomy, an epoch is a moment in time for which celestial coordinates or orbital elements are specified. The current standard epoch is J2000.0, corresponding to January 1, 2000, 12:00 Terrestrial Time.
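
Quantities such as orbital elements are tabulated relative to the standard epoch, so calculations typically begin by finding the elapsed time since J2000.0. The Python sketch below illustrates that step; it approximates Terrestrial Time by UTC, ignoring the roughly one-minute offset between the two scales.

    from datetime import datetime, timezone

    # J2000.0 is defined as January 1, 2000, 12:00 Terrestrial Time (TT).
    # UTC is used here as an approximation; TT ran about 64 s ahead of UTC in 2000.
    J2000 = datetime(2000, 1, 1, 12, 0, 0, tzinfo=timezone.utc)

    def days_since_j2000(when: datetime) -> float:
        """Elapsed days since the J2000.0 epoch (negative for earlier instants)."""
        return (when - J2000).total_seconds() / 86400.0

    print(days_since_j2000(datetime.now(timezone.utc)))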

Computing

In computers, time is often expressed as the number of seconds since midnight, Universal Time, on a conventional epoch defined by the operating system. Unlike human calendars, computers usually start counting from 0 at the epoch instant. Famous epoch dates include:

  • January 1, 1970 (Unix and POSIX systems)
  • January 1, 1900 (the Network Time Protocol; also Microsoft Excel, following Lotus 1-2-3)
  • January 1, 1904 (the classic Mac OS)
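
As a concrete illustration, the Python sketch below (assuming a POSIX-style system whose epoch is January 1, 1970, 00:00:00 UTC) reads the current count of seconds since the epoch and converts it back to a calendar date; a count of 0 maps back to the epoch itself.

    import time
    from datetime import datetime, timezone

    # Current time as a count of seconds since the system epoch
    # (1970-01-01 00:00:00 UTC on POSIX systems).
    now = time.time()
    print(f"Seconds since the epoch: {now:.0f}")

    # The same count converted back into a calendar date in UTC.
    print("As a UTC date:", datetime.fromtimestamp(now, tz=timezone.utc))

    # A count of 0 is, by definition, the epoch instant itself.
    print("The epoch itself:", datetime.fromtimestamp(0, tz=timezone.utc))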

System time is measured in seconds or ticks of arbitrary length past the epoch. Overflow problems arise when this count exceeds the capacity of the integer that stores it, and that is not necessarily a distant prospect: on a machine counting 10 ticks per second, a signed 32-bit count of ticks allows only about 6.8 years of accurate timekeeping (the arithmetic behind these figures is sketched below). The 1-tick-per-second clock of Unix will overflow on January 19, 2038, creating the Year 2038 problem on systems that still store time as a 32-bit signed integer. David Mills, author of NTP, acknowledges that the protocol's 64-bit timestamps, whose seconds field is 32 bits wide, will roll over on February 7, 2036, and advises that:

Should NTP be in use in 2036, some external means will be necessary to qualify time relative to 1900 and time relative to 2036 (and other multiples of 136 years). (quoted from RFC1305)
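
The dates and figures quoted above follow from simple arithmetic on the counter widths and epochs involved. The Python sketch below reproduces them; it is an illustration of the calculation, not code from any of the systems mentioned.

    from datetime import datetime, timedelta, timezone

    MAX_INT32 = 2**31 - 1   # largest value a signed 32-bit counter can hold

    # A clock ticking 10 times per second in a signed 32-bit counter:
    years = MAX_INT32 / (10 * 86400 * 365.25)
    print(f"10 ticks/s, signed 32-bit: about {years:.1f} years")   # ~6.8

    # Unix time: 1 tick per second counted from 1970-01-01 00:00:00 UTC.
    unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    print("32-bit Unix time overflows:", unix_epoch + timedelta(seconds=MAX_INT32))

    # NTP: the 32-bit seconds field counts from 1900-01-01 and wraps after 2**32 s.
    ntp_epoch = datetime(1900, 1, 1, tzinfo=timezone.utc)
    print("NTP seconds field wraps:  ", ntp_epoch + timedelta(seconds=2**32))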

The evolving definition of official time introduces more subtle problems for computer-based linear representations. Leap years and the Gregorian calendar are generally taken into account, but leap seconds are harder to handle because they have occurred at irregular intervals in the past and cannot be accurately predicted for the future. These complications are discussed at length in the Unix time article.
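
The difference is easy to see in practice. In the Python sketch below, the conversion between timestamps and calendar dates applies the full Gregorian leap-year rule, but a genuine leap second such as 2005-12-31 23:59:60 UTC simply cannot be represented.

    from datetime import datetime, timezone

    # Leap years follow the Gregorian rule built into the conversion:
    # 2000-02-29 is a valid date, 1900-02-29 is not.
    print(datetime(2000, 2, 29, tzinfo=timezone.utc).timestamp())
    try:
        datetime(1900, 2, 29, tzinfo=timezone.utc)
    except ValueError as err:
        print("1900-02-29 rejected:", err)

    # Leap seconds have no place in this linear representation: the leap
    # second inserted at the end of 2005 cannot be expressed at all.
    try:
        datetime(2005, 12, 31, 23, 59, 60, tzinfo=timezone.utc)
    except ValueError as err:
        print("23:59:60 rejected:", err)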

Trivia

  • January 1, 1904, was chosen as the base for the Macintosh clock because it was the first leap year of the twentieth century. [...] This means that by starting with 1904, Macintosh system programmers could save a half dozen instructions in their leap-year checking code, which they thought was way cool.
  • The epoch in Microsoft Excel is January 1, 1900, but dates before March 1, 1900 are inconsistent with reality because 1900 is erroneously treated as a leap year. The bug was introduced intentionally to maintain compatibility with the then market leader Lotus 1-2-3, and it lives on to this day. The designers of Lotus 1-2-3 probably chose this simplified behaviour to save processing time and program space; little did they know that their future competitor's dominant spreadsheet would still interpret those dates incorrectly more than 20 years later. The leap-year shortcut behind both of these items is sketched below.
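
Both items turn on the same shortcut: from 1904 through 2099, a year is a leap year exactly when it is divisible by 4, so the Gregorian century exceptions can be skipped; applying that shortcut to 1900 is precisely the Lotus/Excel error. The Python functions below are illustrative only, not the actual Macintosh or Lotus code.

    def is_leap_simple(year: int) -> bool:
        """Shortcut rule: every fourth year is a leap year.
        Correct for 1904-2099 (which covers a 1904-based clock),
        but wrongly counts 1900 and 2100 as leap years."""
        return year % 4 == 0

    def is_leap_gregorian(year: int) -> bool:
        """Full Gregorian rule: divisible by 4, except century years
        that are not divisible by 400."""
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    for year in (1900, 1904, 2000, 2100):
        print(year, is_leap_simple(year), is_leap_gregorian(year))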
