Of course, these are the people who argued about how many digits you
really need to represent a year on the modern calendar. Hint: 2
digits only takes you so far... ;-)
100 years, which is longer than we've had electronic digital
computers--or even longer, depending on the mechanism used to store
the field (i.e., BCD vs. binary vs. ASCII). An awful lot of legacy
code was changed by Y2K programmers to express the year as a 4-digit
field, instead of simply employing "wraparound" logic. But heck, at
$45/hour for a COBOL programmer back then, why not make the job more
complicated?
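
For anyone unfamiliar with the "wraparound" alternative: the usual form was
windowing, i.e., picking a pivot year and interpreting 2-digit years on
either side of it as 19xx or 20xx. A minimal sketch (in Python rather than
COBOL, and with a pivot of 70 chosen purely for illustration):

    def expand_two_digit_year(yy, pivot=70):
        """Map a 2-digit year to a 4-digit year using a sliding window.

        Years at or above the pivot are treated as 19xx, years below it
        as 20xx. The pivot value here is an arbitrary example.
        """
        return 1900 + yy if yy >= pivot else 2000 + yy

    # expand_two_digit_year(99) -> 1999
    # expand_two_digit_year(5)  -> 2005

Whether that was genuinely simpler than widening the field depends on how
many places the 2-digit assumption was baked into the data and the code.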
I think it was the $45/word of memory that did it.
How many people coding in the late 1950s expected the same apps
would still run 40+ years later?