Chuck Guzis wrote:
I love hacking with Z80s. I consider myself very lucky that much of
my work these days is done in the embedded systems world, where Z80s
(albeit at 50MHz with 24-bit address buses) are alive and well. It's
neat to see how it has made the transition from mainstream
general-purpose processor to common embedded processor. (and I'll bet
Zilog sure is happy about it!)
And yet, to my eye, the Z80 architecture is about as bad as the 8086.
Basically, a lot of instructions grafted onto an 8008 model.
Yep. But to be fair I think that lineage really started with the
8080; the 8008->8080 differences were much more significant than any
other step in the family. But I digress.
I have to say that I do love the Z80, but I think I "forgive" some of
its architecturally unclean aspects because it was the first
architecture that I learned assembler on.
There's a point of view that microprocessors were a devolution in the field
of computer science, and I have to admit that it has some merit. Before
the 8008/8080, who was even fooling with single-user mono-tasking operating
systems as a serious effort? With mainframes we had graphics, database
management, multi-user, multi-tasking, advanced architectures and all
manner of work being done in basic computer science. Along comes the
microprocessor and the siren song of easy money causes some very bright
people to spend great chunks of their professional lives on MS-DOS, Windows
and other already-invented-for-mainframes stuff.
Yes! I've long held the belief that "the computer world" is split
into two completely different halves that don't even know of each
other's existence. There's the side that started in the 1940s and had
some of the world's most intelligent people working on it (which today
only lives on in the mainframe world) and then there's the part that
started in 1969 with a chip designed for a desktop calculator. The way
I see it, everything we have today, except for the mainframe world, is
descended from the i4004...and as such, doesn't trace its lineage back
to the 1940s, but to 1969 and the development of a processor for a desktop
calculator.
The "already invented for mainframes" thing has big examples in the
minicomputer world as well. All this hoopla about SANs these days...we
had that on our VAXen in, what, 1980? This is nothing new!
In another thread, the discussion is centering on microkernels (been
there, done that on a mainframe) and the need to keep I/O drivers as part
of the kernel. Why? Why should the CPU even have access to I/O ports and
interrupts? Why doesn't every device have its own I/O processor with
access to CPU memory? Or better yet, why not a pool of communicating I/O
processors that can be allocated to the tasks at hand, leaving the CPU to
do what it does best? Is silicon now so expensive that we can't advance to
an idea that's more than 40 years old?
When I read this paragraph, I wanted to stand up and cheer. You and I
think very, very much alike in this area. If I didn't have to squander
my time working like a dog just to keep the electric service turned on,
I'd be building systems just like that. One of these days I'll get
myself out of the dysfunctional "American dream" lifestyle of borrowing
money to live beyond my means.
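
To make the idea a bit more concrete, here's a rough sketch in C of the
kind of command-list ("channel program") interface a pool of I/O
processors might present to the CPU. Every name and field here is
made up; it's only an illustration of the concept, not any real
hardware or API.

/* Rough sketch of a channel-program style interface: the CPU never
 * touches device registers; it builds a command list in shared memory
 * and hands it to whichever I/O processor in the pool is free.  All
 * names here are hypothetical -- an illustration of the idea, no more. */

#include <stddef.h>
#include <stdint.h>

typedef enum { IOP_READ, IOP_WRITE, IOP_SEEK, IOP_HALT } iop_op;

typedef struct {
    iop_op            op;      /* what the I/O processor should do      */
    uint32_t          device;  /* logical device number                 */
    uint32_t          block;   /* block/record address on that device   */
    void             *buffer;  /* CPU memory the IOP transfers into/out */
    size_t            length;  /* transfer size in bytes                */
    volatile uint32_t status;  /* IOP writes a completion code here     */
} iop_cmd;

/* Mailbox in memory shared between the CPU and the I/O-processor pool. */
typedef struct {
    volatile uint32_t doorbell; /* CPU sets it; an idle IOP clears it   */
    iop_cmd          *program;  /* first command in a chained list      */
    size_t            count;    /* number of commands in the chain      */
} iop_mailbox;

/* The CPU's whole job: post the program and go back to computing. */
static void iop_submit(iop_mailbox *mb, iop_cmd *program, size_t count)
{
    mb->program  = program;
    mb->count    = count;
    mb->doorbell = 1;           /* any free IOP in the pool picks it up */
}

The CPU's entire involvement is building the command chain and setting
the doorbell; an idle I/O processor claims the mailbox, does the
port-banging and interrupt handling itself, transfers the data, and
posts a status word when it's done. That's essentially the mainframe
channel model: the CPU never touches a device port or services a
device interrupt directly.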
Forgive the rant, but other than bringing computers to the masses, do
microcomputers represent a significant forward movement in the state of the
art?
Absolutely not, and I'd be really surprised if any experienced person
actually thought otherwise.
-Dave
--
Dave McGuire
Cape Coral, FL