Tasks that used to require (20 or 30 years ago) a single PDP-8 or
PDP-11 class minicomputer now use dozens to hundreds of PC clones to
do the same functions.
You mean a single computer with lots of terminals or teletypes, or
punched card machines.
WHY DOES IT TAKE 100 COMPUTERS TO DO WHAT
A SINGLE COMPUTER USED TO DO?
We could go back to secretarial pools and reduce the number of
computers. Or we could go back to the days of timesharing and give up
all the wonderful user-oriented capabilities that a modern UI offers.
But if you include the end terminals in the comparison, then it was
not really a single computer in my mind.
Does anyone know the power difference between the iMac I am typing on
right now and an old IBM 3279 terminal (which, by the way, was
connected to an incredibly inefficient IBM mainframe with a room full
of spinning disk drives)? Or the cost of an old DEC terminal? I'm
curious how much additional power I am really consuming.
The IBM 3080 that I worked on in 1983 supported a few hundred users.
It gobbled electricity. I bet that if you added up the mainframe and
all of its supporting equipment (network controllers, terminals, DASD,
printers, etc.), 300 modern PCs would come out cheaper. But this is
merely a guess on my part; personally, I would like to see real
figures before I could buy into your argument.
To me, the problem is simply one of expansion: millions and millions of users.
Paul