> The one thing that strikes me after reading this thread is that it
> seems there is no precise point when computing history began.

Huh? I thought it began on January 13, 11 CE :-)
Any Unix type can tell you that history began at 00:00:00 UTC, January 1,
1970. But they do allow negative timestamps, to mark any PRE-HISTORIC
events. It will "go bad" on January 19, 2038, when a signed 32-bit time_t
rolls over. Did John Titor find his magical 5100?
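For anyone who wants to poke at the edges themselves, a quick Python
sketch (my own illustration, nothing canonical about it):

    from datetime import datetime, timezone

    # Unix time counts seconds from 1970-01-01 00:00:00 UTC; a signed
    # 32-bit time_t tops out at 2**31 - 1 of them.
    print(datetime.fromtimestamp(0, tz=timezone.utc))
    # -> 1970-01-01 00:00:00+00:00
    print(datetime.fromtimestamp(2**31 - 1, tz=timezone.utc))
    # -> 2038-01-19 03:14:07+00:00

    # Negative timestamps reach back to PRE-HISTORIC times (some
    # platforms' C libraries may refuse them):
    print(datetime.fromtimestamp(-86400, tz=timezone.utc))
    # -> 1969-12-31 00:00:00+00:00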
Mac history began January 1, 1904. For a long time, I couldn't figure out
what was special about that date, but finally realized that it was picked
over 1900 in order to avoid the leap year exception: 1900 is not a leap
year, so starting at 1904 lets the simple every-fourth-year rule hold.
Therefore, it will "go bad" on March 1, 2100, when the next exception
hits (although the unsigned 32-bit seconds counter actually wraps even
sooner, on February 6, 2040).
MS-DOS FAT sets the beginning of history at January 1, 1980. No provision
for negative times, but FAT12 and FAT16 (except >= NT) do permit negative
file sizes. (Unfortunately, that is incompletely implemented - when I
copied a -2GB file to an almost full hard drive, it did NOT increase the
free space!) It will "go bad" in 2108, since the seven-bit year field
tops out at 1980 + 127 = 2107, although some claim that it "went bad" on
January 1, 1980.
The classic COBOL 2-digit year started in 1900, and "went bad" in 2000.
Shysters made money telling us that we had "trouble, right here in River
City!", planes fell from the skies, all electronics exploded, and
civilization collapsed.
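The underlying failure was arithmetic this simple (illustrative Python,
not actual COBOL):

    # A 2-digit year is implicitly 1900 + YY, so ages computed across
    # the rollover go negative:
    birth_yy, current_yy = 65, 0   # born 1965, computed in "00" = 2000
    print(current_yy - birth_yy)   # -65, instead of the correct 35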
Don't forget that the Mayan calendar, which is said to be marginally more
accurate than ours, will "go bad" on December 21, 2012.
Does the Antikythera mechanism have a start or end point?
> seems there is no precise point when computing history began.

So, like Morrow's standards ("Everyone can have a unique one of their
own"), there are MANY precise points when computing history began.
--
Grumpy Ol' Fred                  cisin at xenosoft.com