It is obvious that we disagree on my basic premise: that internally,
sufficient bits should be available to represent dates starting at a
point far enough in the past to handle the many old dates in the
countries which would use the operating system. Some operating systems,
such as VMS, achieved most of that goal. On OS/8, Unix and RT-11, to
name just a few, it would have required a decision to double the number
of bits over what was actually used.
For OS/8 and RT-11, the simple solution would have been to use half a
word for the month and day (I am not sure what half a word is called on
a PDP-8) and a full word for the year.
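To illustrate that suggestion, here is a minimal sketch of packing the month and day into the two 6-bit halves of a 12-bit PDP-8 word, with the year in a full word of its own. The field layout and function names are my own illustration, not any actual OS/8 or RT-11 format:

```python
# Hypothetical sketch: month and day share one 12-bit PDP-8 word,
# the year gets a full 12-bit word of its own.

def pack_month_day(month, day):
    """Month (1-12) in the high 6 bits, day (1-31) in the low 6 bits."""
    assert 1 <= month <= 12 and 1 <= day <= 31
    return (month << 6) | day          # fits in one 12-bit word

def unpack_month_day(word):
    return (word >> 6) & 0o77, word & 0o77

# A full 12-bit word for the year covers 0..4095, so the year field
# would not have run out of bits for millennia.
word = pack_month_day(7, 13)
print(unpack_month_day(word))          # (7, 13)
```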
Probably, the question of having adequate support for the date was the
real decision. I have seen multi-million dollar systems which ignore the
date (like a microwave) and only provide the time. But if there is a
disk drive with files, and dates to be noted when a file was created
and last modified (and perhaps referenced), then there can always
be a reason to give some files dates which precede the start of the
operating system - noting when the information within the file was
actually created is just one possibility.
If this basic premise is not considered adequate to justify allocating
a full word in memory for the year, rather than the 5 bits used in RT-11
(eventually 7 bits, to manage dates from 1972 until 2099) and however
many bits were used in OS/8, then there is really no point in this
discussion.
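For concreteness, here is a sketch of decoding an RT-11 directory date word. The layout used - 2 era bits, 4 month bits, 5 day bits, and 5 year bits offset from 1972 - follows the commonly documented RT-11 format, but treat it as an illustration rather than gospel:

```python
# Decode an RT-11 directory date word (commonly documented layout;
# the era bits were a later extension of the original 5-bit year).

def decode_rt11_date(word):
    era   = (word >> 14) & 0o3        # each era step adds 32 years
    month = (word >> 10) & 0o17
    day   = (word >> 5)  & 0o37
    year  = 1972 + ((word & 0o37) | (era << 5))
    return year, month, day

# With 5 + 2 = 7 effective year bits, the last representable year is
# 1972 + 127 = 2099 - hence the January 1st, 2100 breakage.
print(decode_rt11_date(0b00_0111_01101_00000))  # (1972, 7, 13)
```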
Where information as vital as the date is concerned, saving a few bits
was not justified, in my opinion. The fact that VMS arrived later than
OS/8 and RT-11, yet allowed dates well before VMS was ever released,
suggests that DEC had begun to realize that the date was important. The
same with RSX-11.
Johnny Billquist wrote:
> On 2012-07-13 19:00, "Jerome H. Fine" <jhfinedp3k at compsys.to> wrote:
>
>> Johnny Billquist wrote:
>>
>>> [Snip]
>>> Ooo. So TECO-8 actually lie in their documentation... Even worse.
>>> A year in the range 1986-1994 would just have looked like 1970-1977.
>>> That's ugly of them.
>>
>> What seems even more evident to me is that DEC took
>> (as most other companies did as well) the attitude that
>> even the internal representation of date and time was
>> not important enough to allow the same information to
>> be exchanged between operating systems on a consistent
>> basis.
>
> Sorry, but I fail to see the point. The internal representation of a
> date will almost by necessity be different between different OSes and
> hardware. Having a bunch of 16 bit values represent a date on a PDP-8
> would be incredibly stupid and difficult, not to mention that OS/8
> have no concept of time to more detail than a day. And even that needs
> to be updated manually every day.
Your assumption differs from what I would suggest - see above.
> So, ignoring the internal format, which can't really be portable
> anyway, you then get to representation. There will always be dates
> that cannot be represented in whatever format you choose. So what is
> the point of bringing up that argument? It is nice if the dates that
> you might reasonably expect to be processes be possible to express on
> the system. As for communicating with other systems, in the
> communication I would suspect/expect that you use an intermediate
> format (a nice text string for example) that both agree on. And then
> you can convert from the internal format to and from this intermediate
> format, as long as the date is within a range expressable on that system.
That would be one solution, but other solutions are available. The
most important point is to have sufficient internal bits to start with.
> When you go outside the date range for the system, you can either try
> to do something reasonable, or give an error. I think that is a choice
> that is best left to the writers of the code to decide on a case by
> case basis.
I agree - the point I am attempting to make is to stress the requirement
of having a sufficient internal date range to start with - see above.
> By the way, Unix express time as a number of seconds since Jan 1,
> 1970, 00:00 UTC.
Which, I think, yields a maximum year of 2038 if a signed 32-bit value
is used. Far better would have been to use 64 bits with a higher
resolution such as milliseconds, starting at least as far back as 1900.
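A quick check of those two claims, using Python's datetime arithmetic (my choice of tool, purely for illustration):

```python
# The largest signed 32-bit count of seconds since 1970-01-01 00:00 UTC
# lands in January 2038 - the well-known "year 2038 problem".
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
last = EPOCH + timedelta(seconds=2**31 - 1)
print(last.isoformat())                # 2038-01-19T03:14:07+00:00

# By contrast, 64 bits of milliseconds would cover roughly 292 million
# years in each direction from whatever epoch was chosen.
years = (2**63 / 1000) / (86400 * 365.25)
print(round(years / 1e6))              # 292 (million years)
```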
> And time is horribly complex. You know that even if we keep it fairly
> modern, different countries switched from Julian dates to Gregorian
> dates at different times, the last being Russia, in the early 20th
> century.
I am well aware of how complex the subject of dates and time can be.
To start with, the Common Era (aka Gregorian) Calendar is not used as
the primary date representation by even a majority of the Earth's
population. However, with respect to computers, the countries and
societies which first started to use computers and the internet
primarily use the Common Era Calendar, and I would suggest that the
only dates which early computers supported were Common Era (Gregorian)
dates. I doubt that time zones were even considered at that early
stage, let alone thought important enough to support internally, much
less externally. And even if Russia was considered an important enough
market for computer systems, I doubt that anyone would have remembered
that the country had adopted the Gregorian Calendar less than a
hundred years prior.
As for just how complex time actually is, the biggest problem
that few people realize is that the length of the year can vary by
many minutes from the average length of the year. I seem to
remember that the variation can be about 15 minutes. Every
so often, a leap second has been used to keep clocks synchronized
with astronomical time (that UTC you mentioned above). The
average person completely ignores that complexity, but now
with the internet, operating systems must know the time very
accurately. But the date is a completely different matter. At
any one location, the date changes, on average, every 86,400
seconds - except when a leap second occurs - and having a
few extra bits internally to handle the year is, in my opinion,
essential.
So the initial decision for the internal representation of the date
was the base date, or earliest date, to be represented. Many, probably
a majority, of operating systems made a very restricted choice which
reflected the emphasis on using as little storage as possible. In my
opinion, that was a bad choice.
> If you really think that you can come up with a reasonable, portable
> design, that is "universal", I think I know of a few organizations
> that would like to hear from you.
A "universal" design would be to use Julian Days (just a count of
the number of days from an agreed-upon epoch, a scheme which has
already been accepted by astronomers) instead of dates. Then, convert
from Julian Days to whichever local calendar is favoured by the party
using the date. But that has already been rejected by most societies,
and only a small percentage of the population are even aware that
Julian Days exist. There are too many reasons for most societies
not to use Julian Days - mostly religious in nature, requiring
a seven-day cycle to support a holy day every seven days. The
French attempted to start a new calendar, but failed - probably
mostly over the seven-day cycle.
> Until then, I'm pretty much satisfied with things the way they already
> are. Yes, OS/8 have been broken for 10 years now. But to fix it
> require more than just changing the internal storage for the date.
> RSX got fixed, and depending on which bits you look, it might stop
> working right 2070(?), 2099, 2155 or 34667.
> I don't know about RT-11, but I do know that RT-11 is totally separate
> from RSX, and any problems are not shared, but unique.
I noticed you are aware of the TECO discussion on dates for
a number of different operating systems, so you probably
already know.
RT-11 will break on January 1st, 2100 in the same manner as
OS/8 has already broken - there are no more bits in the date
word to handle more than a range of 128 years starting in 1972.
The only solution is to add additional storage to hold more bits
for the year. Thus far, I know of only one other individual
(who supports RUST) who is serious about adding those extra
bits to the RT-11 file system, but there is no agreement as yet
on how to do that. If no one else will even discuss the question
of how dates for files in RT-11 can be used and extended beyond
the year 2100, then eventually a unilateral decision will be made -
or not, as the case may be. Since it is probably not going to be
very important for RT-11 to support dates after 2099, except
for someone running RT-11 on an emulator (will any actual DEC
hardware still be running after 2099?), the matter is probably only
of academic interest, or only for hobby users - if any still exist
in 2100.
Jerome Fine