At 02:58 AM 1/5/99 -0000, Eric Smith wrote:
> Once databases started being kept on disk drives, saving two bytes for each
> date may not have sounded like much, but it literally did save money. It
> takes fewer disk packs (and perhaps fewer drives) to store 92-byte records
> than 94-byte records. I don't know how many decisions were made on that
> basis, but you can't realistically ignore it.
There are a dozen lame excuses as to why They Did It That Way. Few of
them make any sense. If they'd stored the year as a seven- or eight-bit
offset from their earliest year, instead of two ASCII or BCD digits,
they'd have halved their storage requirements. There is a good
discussion of this at <http://language.perl.com/news/y2k.html>.
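To make the point concrete, here's a minimal sketch of the offset idea
(illustrative only, not anyone's actual record layout): one unsigned byte
holding year-minus-1900 covers 1900 through 2155, where ASCII or BCD "99"
takes two.

    #include <stdio.h>

    #define EPOCH 1900

    static unsigned char pack_year(int year)
    {
        return (unsigned char)(year - EPOCH);   /* one byte: 1900..2155 */
    }

    static int unpack_year(unsigned char b)
    {
        return EPOCH + b;                       /* add the epoch back */
    }

    int main(void)
    {
        int years[] = { 1965, 1999, 2000, 2038 };
        int i;

        for (i = 0; i < 4; i++) {
            unsigned char b = pack_year(years[i]);
            printf("%d -> stored byte %3u -> read back as %d\n",
                   years[i], (unsigned)b, unpack_year(b));
        }
        return 0;
    }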
At 04:37 PM 1/4/99 -0800, Chuck McManis wrote:
> I attended a talk by a Y2K consultant (Bruce Webster)
I remember him from a decade ago, writing for Byte, and doing something
(games?) with the Amiga. I seem to remember him at the Amiga dev cons
where I once met you, too. His resume at <http://www.bfwa.com/prof/> shows
he's been busy at other things, and has certainly been around for a while
in the world of computers. However, I'll always remain a bit skeptical
of opinions from anyone who's routinely paid to speak or write their
opinions. :-)
Your discussion of being FreeGate's Y2K rep reminded me of when I was
asked whether the software I developed was susceptible. At first
I thought "no way, not a thing." Then I remembered the node-locking
on the SGI version - nope, it wisely used an offset from 1900 and
stored it in enough bits. There were a few sprintf("%s") calls on ctime()
output, though.
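A tiny sketch of both pieces, for anyone curious (illustrative only, not
the actual node-locking code): tm_year in <time.h> is itself an offset
from 1900, and ctime() already emits a four-digit year, so the classic
exposure is formatting like "19%d" rather than the storage.

    #include <stdio.h>
    #include <string.h>
    #include <time.h>

    int main(void)
    {
        char buf[64];
        time_t now = time(NULL);
        struct tm *t = localtime(&now);

        /* Safe: keep the 1900 offset and add the epoch back on output. */
        sprintf(buf, "year %d", 1900 + t->tm_year);
        puts(buf);

        /* Classic Y2K bug: hard-wiring the century.  In 2000, tm_year
           is 100 and this prints "year 19100". */
        sprintf(buf, "year 19%d", t->tm_year);
        puts(buf);

        /* ctime() output already carries a four-digit year. */
        sprintf(buf, "%s", ctime(&now));
        buf[strlen(buf) - 1] = '\0';    /* drop ctime()'s trailing newline */
        printf("%s\n", buf);

        return 0;
    }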
- John