What boggles the mind is that this is a problem at all. It seems hard to
believe (in retrospect) that people really did deliberately build software
with only 2-digit years. I know it saved a few bytes, and yes, I remember
when a byte of memory was a significant amount, but still. How did standard
programming practice come to be so short-sighted as to assume that software
infrastructure would be thrown out and replaced on a regular basis?
--
Jim Strickland
What about OS/8 (on the PDP-8), which had a 3-bit year field? It has a Y2K
issue every 8 years.
Dan
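
A minimal sketch of why a 3-bit year wraps so quickly. The field widths,
the 1970 epoch, and the helper names here are assumptions for illustration,
not the documented OS/8 date-word format; the point is simply that 3 bits
can only distinguish 8 years:

    #include <stdio.h>

    /* Assumed layout in the spirit of a packed date word with a
     * 3-bit year field. Not the actual OS/8 format. */
    #define YEAR_BITS  3
    #define YEAR_MASK  ((1u << YEAR_BITS) - 1)  /* values 0..7 */
    #define EPOCH      1970                     /* assumed base year */

    /* Pack a calendar year into 3 bits: offsets past the epoch
     * wrap modulo 8. */
    static unsigned pack_year(int year) {
        return (unsigned)(year - EPOCH) & YEAR_MASK;
    }

    /* Unpack: with no extra context, 1978 reads back as 1970. */
    static int unpack_year(unsigned field) {
        return EPOCH + (int)(field & YEAR_MASK);
    }

    int main(void) {
        for (int y = 1970; y <= 1980; y++)
            printf("%d -> field %u -> read back as %d\n",
                   y, pack_year(y), unpack_year(pack_year(y)));
        return 0;
    }

Running it shows 1978 packing to the same field value as 1970, which is
exactly the every-8-years rollover described above.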