Jim Strickland <jim(a)calico.litterbox.com> wrote:
> What boggles the mind is that this is a problem at all. It seems hard to
> believe (in retrospect) that people really did deliberately build software
> with only 2 digit years. I know it saved a few bytes, and yes, I remember
> when a byte of memory was a significant amount, but still.
Yes, but do you remember back when 80 characters was the total amount you
had for an entire "database" record? That's the origin of the problem.
And although I don't know of any software from those days (pre-1960) still
running, there are many systems that evolved from those origins.
Once databases started being kept on disk drives, saving two bytes for each
date may not have sounded like much, but it literally did save money. It
takes fewer disk packs (and perhaps fewer drives) to store 92-byte records
than 94-byte records. I don't know how many decisions were made on that
basis, but you can't realistically ignore it.
Of course, it would still save money to do that today, but instead of
saving about two cents per date (IBM RAMAC, 1956), it is now roughly one cent
per million dates (Quantum FB312700A 12.7G drive for $249 from Dirt Cheap Drives).
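The comparison can be checked with quick back-of-envelope arithmetic. A minimal
sketch using only the figures quoted above, assuming "12.7G" means 12.7 billion
bytes and two bytes saved per stored date; the exact quotient comes out to a few
cents per million dates, the same order of magnitude as the rough figure in the
post:

```python
# Back-of-envelope cost of storing two extra year digits,
# using the drive price and capacity quoted in the post.
drive_price_usd = 249.0    # Quantum FB312700A, per Dirt Cheap Drives
drive_bytes = 12.7e9       # "12.7G" taken as 12.7 billion bytes
bytes_per_date = 2         # the two digits dropped from each year field

cost_per_byte = drive_price_usd / drive_bytes
saving_per_million_dates = cost_per_byte * bytes_per_date * 1e6
print(f"${saving_per_million_dates:.3f} per million dates")
# prints "$0.039 per million dates"
```

The same arithmetic against a 1956-era price of roughly a cent per byte gives
about two cents per two-byte date, which is where the RAMAC figure comes from.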
Eric