What I wonder is why people didn't just store the
year as a plain binary integer in those
two bytes. You can go up to the year 65535 that way. I'm positive there
were good reasons for not doing that as well. It's easy for us to look
back and scoff at our forebear programmers.
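
For illustration, a minimal C sketch of the scheme the poster describes
(the variable names are mine, not from the thread): a year held as an
unsigned 16-bit binary integer spans 0 through 65535 in two bytes.

#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* A full year stored as an unsigned 16-bit binary integer:
       two bytes cover 0..65535, so there is no century ambiguity. */
    uint16_t year = 1999;
    printf("year = %u (max in 16 bits: %u)\n",
           (unsigned)year, (unsigned)UINT16_MAX);
    return 0;
}
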
I am not knowledgeable in this area, but I would suggest that 2 BCD digits
(0-9 each) encoded in a single byte could possibly be the reason. 1 byte is
better than 2 :)
A
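
A minimal C sketch of the packed-BCD idea the reply alludes to (the
bcd_pack/bcd_unpack names are mine, for illustration): each nibble of a
byte holds one decimal digit 0-9, so a two-digit year fits in one byte.

#include <stdint.h>
#include <stdio.h>

/* Pack two decimal digits into one byte: the high nibble holds the
   tens digit, the low nibble the ones digit. Caller must pass 0..99. */
static uint8_t bcd_pack(unsigned value) {
    return (uint8_t)(((value / 10) << 4) | (value % 10));
}

static unsigned bcd_unpack(uint8_t bcd) {
    return (unsigned)((bcd >> 4) * 10 + (bcd & 0x0F));
}

int main(void) {
    uint8_t yy = bcd_pack(99);   /* the two-digit year "99" */
    printf("packed: 0x%02X, unpacked: %u\n", yy, bcd_unpack(yy));
    return 0;
}

This is the trade-off both posts circle around: packed BCD holds the
same two-digit year in half the space of two character bytes, which
mattered when storage was expensive.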