At 02:58 AM 1/5/99 -0000, Eric Smith wrote:
> Once databases started being kept on disk drives, saving two bytes for each
> date may not have sounded like much, but it literally did save money...
> There are a dozen lame excuses as to why They Did It That Way. Few of
> them make any sense. If they'd stored the year as a seven- or eight-bit
> offset from their earliest year, instead of two ASCII or BCD digits,
> they'd have halved their storage requirements. There is a good
> discussion of this at <http://language.perl.com/news/y2k.html>.
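To make the quoted suggestion concrete, here's a minimal C sketch of the
two approaches. The 1900 epoch and all the names are mine, purely for
illustration; any fixed base year gives the same halving:

    #include <stdio.h>

    #define EPOCH 1900              /* illustrative base year */

    int main(void)
    {
        int year = 1999;

        /* Character-oriented storage: one byte per digit, two bytes. */
        unsigned char digits[2] = { (year / 10) % 10, year % 10 };

        /* Binary offset: one byte covers EPOCH through EPOCH + 255. */
        unsigned char offset = (unsigned char)(year - EPOCH);

        printf("digits %d%d (two bytes) vs offset %d (one byte, year %d)\n",
               digits[0], digits[1], offset, EPOCH + offset);
        return 0;
    }

Seven bits would have carried such a field to 2027, eight bits to 2155.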
I wrote a response to that very discussion. It was summarily ignored. The
gist of it was to refute the author's assertion that "they couldn't have
done it to save two bytes." Look at the design of an IBM 1401 sometime.
It has BCD-oriented memory. Each location is eight bits: four hold the BCD
representation of the bottom ten rows of a punch card, two correspond to the
top two rows, and the remaining two are for out-of-band control of the data.
The 1401 was not built to manipulate binary data. It would be incredibly
complicated to do so on such a machine. You could never just store raw
binary in memory.
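A rough model of that character layout, in C. The exact bit positions
and the sample cell value are my guesses for illustration; the point is
that each cell is a decimal character plus control bits, not raw binary:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint8_t cell = 0x59;                 /* made-up cell contents */

        uint8_t digit = cell & 0x0F;         /* rows 0-9: BCD digit   */
        uint8_t zone  = (cell >> 4) & 0x03;  /* top two card rows     */
        uint8_t ctrl  = (cell >> 6) & 0x03;  /* out-of-band control   */

        printf("digit %d, zone %d, control %d\n", digit, zone, ctrl);
        return 0;
    }

On the real machine the hardware interprets memory this way; there is no
"load a 16-bit integer" to fall back on.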
Not all machines do things the way we do them today. Leave the nice
orthogonal 8/16/32-bit world and you see people doing interesting things
to get the job done. It's one of the reasons I love classic computers;
you have to think differently to make them work.
-ethan