I'd have to agree with Andy - up to a point. My aunt was senior
programmer at Harrow borough council for years. I'm 46 now, but when I
was 11, I remember my aunt discussing with my grandfather the
decommissioning of the computer that she had become so familiar with.
From memory it was an English Electric Co. "LEO" (Lyons Electronic
Office) machine, I think mostly built from vacuum tubes/valves, and
dependent on card punches and readers for much of its input. I
remember seeing guys in white coats tending the machine in its less
effective moments, and the general air of respect and authority they
commanded from the programmers and operators. About 10 years later, I
got my first job in the industry as a field engineer for Data General.
I remember I spent some time explaining to my aunt how I'd written a
simple disk subsystem diagnostic which I'd keyed in through a 'virtual
console' in hex. I then had to explain to her what a virtual console
was, followed by an explanation of why parts of the machines she knew
so well had to be re-wired as part of the programming process.
The aim of a business is to make money - and most would agree that
Homo sapiens is essentially a lazy creature (at least, I am)! In terms
of hardware and software design and production, this means more and
more layers of abstraction between the basic computational structures
and the functions that users desire. Doesn't it seem a natural
progression, then, that not only do fewer and fewer people understand
less and less about machine and system architecture, but, given
sufficient time, the nature of the I/O devices and operations has
become so different from that of the earliest systems that it is a
non-trivial task to understand even their function, let alone their
correct modus operandi? (E.g. how many hardware engineers today would
recognise, let alone understand in detail, the design of a mercury
delay line?)
It seems to me that there is nothing at all wrong with this; it's just
that times/fashions change, and so do our reasons for doing things. On
the other hand, the reasons given by the perpetrators of software bloat,
and of the apparently unconnected increase in hardware complexity that
always seems to accompany it (I'd often prefer the term sophistry), are
highly suspect...
Tim
On Sunday, May 25, 2003, at 07:09 AM, Andy Holt wrote:
different today, how can people relate to this? While there are still
a few people who know how to make a horseshoe at a blacksmith's, there
will be nobody who knows how to run the early mainframes in 50 years,
things are

Rubbish! Are you seriously trying to tell me that these skills can't be
learnt? ... <below
-tony
Well, understanding these computers in an architecture sense is one
thing* - but operating and maintenance skills were usually verbally
transmitted and rarely permanently documented. They can probably be
redeveloped with experience, but this experience is likely to be at
the cost of media damage.
When I was a systems programmer on an ICT 1905 (then a 1905E, etc.) I
learned some of the skills of the operators and even occasionally helped
the engineers (usually to diagnose processor problems with "odd"
symptoms). Things like:
the skills of handling trays of punched cards so that the readers didn't
jam - and the practice of clearing the jams that did happen (a 1600 cpm
reader can fling cards all over the place when it feels like it! - the
600 cpm reader was referred to as "the mangler" by anyone who used it
... and with good reason)
loading magnetic tape drives - the upright ones with vacuum columns
(yeah, it's trivial ... not!)
handling "washing-machine" exchangeable disk packs
ensuring fanfold paper stacks correctly from a fast line-printer
having the reactions to hit the stop button on that printer when it
starts page-throwing at maximum speed.
recognising the progress of jobs from the sound of the console
loudspeaker
knowing which boards to tap (and how hard) when doing "preventative
maintenance"
listening to that loudspeaker while running the diagnostic programs
(and those programs were probably the very first things to get thrown
out when the computer was replaced)
switching things on (and off) in the correct order (and how to start
the MG set)
even the bootstrap sequence wasn't as simple as on a PDP-11
I would claim that anybody who _truly_ understood a modern machine
would have no problems on an older one. The fact that very few people
understand modern computers is the problem, not that the older machines
are so different.
I would put this the other way round - people who had a good
understanding of the old machines have some chance of getting a deep
understanding of modern ones. In terms of the Instruction Set
Architecture and programming there are few difficulties in
understanding one given the other (but for 'minor' things like
self-modifying code and the concept of overlays).
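Those 'minor' things deserve a concrete picture for readers who never
met them. Here is a rough sketch of the self-modifying-code idiom on a
hypothetical mini-machine of my own invention (not any real 1900-series
instruction set): with no index register, a loop steps through an array
by rewriting the address field of one of its own instructions.

```python
def run(memory, program):
    """Execute a toy instruction list of (opcode, operand) pairs."""
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":          # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":         # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":       # memory[arg] <- acc
            memory[arg] = acc
        elif op == "INCADDR":     # the self-modifying step: fetch
            o, a = program[arg]   # instruction 'arg' from store and
            program[arg] = (o, a + 1)  # bump its address field
        elif op == "DJNZ":        # decrement memory[arg]; loop if nonzero
            memory[arg] -= 1
            if memory[arg] > 0:
                pc = 0
                continue
        pc += 1
    return memory

memory = [1, 2, 3, 4, 5,   # the array to sum
          0,               # memory[5]: running total
          5]               # memory[6]: loop counter
program = [
    ("LOAD", 5),     # acc <- total
    ("ADD", 0),      # acc += memory[0]; this address field gets rewritten
    ("STORE", 5),    # total <- acc
    ("INCADDR", 1),  # rewrite instruction 1 so the next pass reads memory[1]
    ("DJNZ", 6),     # five passes in all
]
result = run(memory, program)
print(result[5])     # -> 15, the sum of memory[0..4]
```

Program and data share one store, so the program literally edits itself
as it runs - exactly the habit that makes such code foreign to anyone
raised on read-only code segments.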
On the other hand, the lower-level descriptions of the processor logic
typically use terminology that is totally foreign to the modern logic
designer - not to mention the implicit "wired-or" that is frequently
used yet never explicitly mentioned in the documentation, or the fact
that, with only a small number of logic gates per card, techniques were
used to minimise the gate count that would never be seen in modern
synchronous logic.
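As an illustration of that implicit "wired-or" (my own sketch, not
taken from any real schematic): several open-collector outputs could be
tied directly to one line with a pull-up resistor, so the junction
itself computes a logic function even though no gate appears on the
drawing.

```python
def wired_line(pulling_low):
    """Level of a pull-up bus line shared by open-collector outputs.

    pulling_low: one bool per driver, True if that gate's transistor is
    on and sinking the line. The pull-up holds the line high only when
    no driver pulls it low.
    """
    return not any(pulling_low)

# Read in active-low (negative) logic, the tied junction ORs the
# drivers' asserted signals - the "wired-or" - with no explicit gate:
assert wired_line([False, False]) is True   # nobody asserts: line high
assert wired_line([True, False]) is False   # any driver asserts: line low
```

In positive logic the same junction is a wired-AND of the output
levels; either way the function lives in the wiring, which is why the
documentation never calls it out.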
Andy