At 05:25 PM 1/4/99 -0700, you wrote:
> What boggles the mind is that this is a problem at all. It seems hard to
> believe (in retrospect) that people really did deliberately build software
> with only 2-digit years. I know it saved a few bytes, and yes, I remember
1,000,000 records (a small sample) x 2 bytes x 3 dates (birthdate, initial
trx, most recent trx) = 6,000,000 bytes. That was a lot 20 years ago.
Keep in mind, also, that reading in a record required enough workspace to
hold the record, and this may have been a limiting factor as well. Not to
mention bandwidth issues -- we may not think much of downloading a 6MB file
now, but imagine doing that 20 years ago with a 300bps modem.
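The arithmetic above works out as follows; as a rough sketch, only the ~30 bytes/sec effective throughput of a 300 bps modem (counting start/stop bits) is an assumption added here:

```python
# Back-of-the-envelope math from the paragraph above (illustrative only).
records = 1_000_000        # "a small sample"
bytes_per_date = 2         # extra cost of a 4-digit vs. 2-digit year
dates_per_record = 3       # birthdate, initial trx, most recent trx

extra_bytes = records * bytes_per_date * dates_per_record
print(extra_bytes)         # -> 6000000

# At 300 bps, roughly 30 bytes/sec once start/stop bits are counted,
# so moving 6MB took on the order of days, not seconds:
hours = extra_bytes / 30 / 3600
print(f"about {hours:.0f} hours")  # -> about 56 hours
```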
> when a byte of memory was a significant amount, but still. How did standard
> programming practice come to be so short-sighted as to assume that software
> infrastructure would be thrown out and replaced on a regular basis?
Today it is. We keep hoping that one of these days Microbloat will get it
right... 8^)
But seriously, 20+ years ago it wasn't so much that the software was expected
to be replaced; the thinking was that if they could get it out the door now,
they might still be around to fix the problem later, whereas if they held
off, they might not be around at all.
It was also a self-perpetuating problem. There's no point in storing a
4-digit date if your source data (internal or external) only carries 2
digits. In fact, a 4-digit program might fail outright if fed 2-digit data.
There are lots of reasons, many of them valid (at the time).
--------------------------------------------------------------------- O-
Uncle Roger "There is pleasure pure in being mad
roger(a)sinasohn.com that none but madmen know."
Roger Louis Sinasohn & Associates
San Francisco, California
http://www.sinasohn.com/