On Mon, 4 Jan 1999, Jim Strickland wrote:
What boggles the mind is that this is a problem at all. It seems hard to
believe (in retrospect) that people really did deliberately build software
with only 2-digit years. I know it saved a few bytes, and yes, I remember
when a byte of memory was a significant amount, but still. How did standard
programming practice come to be so short-sighted as to assume that software
infrastructure would be thrown out and replaced on a regular basis?
With the rate of change in the computer industry, who 30 years ago would
have thought that their programs would still be running? It seemed
stupid and silly to think they would. The driving factor was the fact
that 1K 30 years ago was like 50 gigs today (actually, I don't think a
memory comparison even applies anymore...memory is an almost
infinite commodity these days). Suffice it to say, if you could save bytes
back then, you did, and 30 years is a long time (in 30 years I'll be a
middle-aged doofus in a bad (19)90's suit chasing 20-something-year-old
women).
What I wonder is why people didn't just store the year as a binary number
in those two bytes. You can go all the way up to the year 65535 that way.
I'm positive there were good reasons for not doing that as well. It's easy
for us to look back and scoff at our programmer forebears.
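To make the two-byte point concrete, here's a rough sketch (modern C with
<stdint.h>, purely illustrative, not anyone's actual record layout) of the
difference between storing two digit characters and treating the same two
bytes as a 16-bit binary year:

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Two ASCII digits, the way many old record layouts did it:
           "99" tops out at 99 and wraps at the century. */
        char digits[2] = { '9', '9' };

        /* The same two bytes holding the full year as a 16-bit binary
           integer: good through the year 65535. */
        uint16_t year = 1999;
        unsigned char packed[2];
        packed[0] = (unsigned char)(year >> 8);    /* high byte */
        packed[1] = (unsigned char)(year & 0xFF);  /* low byte  */

        /* Unpacking is just a shift and an OR. */
        uint16_t recovered = ((uint16_t)packed[0] << 8) | packed[1];

        printf("digits: %c%c   binary: %u\n",
               digits[0], digits[1], (unsigned)recovered);
        return 0;
    }

Same storage cost, vastly larger range. One of the good reasons, I'd guess,
is that a character field could be punched, printed, and sorted as plain
text with no conversion at all, which a binary field couldn't.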
Sellam                                       Alternate e-mail: dastar@siconic.com
------------------------------------------------------------------------------
Always being hassled by the man.
Coming in 1999: Vintage Computer Festival 3.0
See http://www.vintage.org/vcf for details!
[Last web site update: 12/27/98]