On 8/6/2006 at 11:04 AM Dave McGuire wrote:
I love hacking with Z80s. I consider myself very lucky that much of
my work these days is done in the embedded systems world, where Z80s
(albeit at 50MHz with 24-bit address buses) are alive and well. It's
neat to see how it has made the transition from mainstream
general-purpose processor to common embedded processor. (And I'll bet
Zilog sure is happy about it!)
And yet, to my eye, the Z80 architecture is about as bad as the 8086's.
Basically, it's a lot of instructions grafted onto the 8008 programming
model.
There were some architectures that held real promise back in the early days
that never went anywhere; the GI CP1600, for example. A nice, reasonably
orthogonal set of 16-bit registers with a straightforward instruction
set. But I'll be the first to admit that GI was ham-handed about it--a
10-bit instruction width, poor I/O performance, etc. Most MPU
instruction sets
were cobbled together to fit within the constraints of silicon technology.
Most mainframes of the time had rather elegant, straightforward instruction
sets.
There's a point of view that microprocessors were a devolution in the field
of computer science, and I have to admit that it has some merit. Before
the 8008/8080, who was even fooling with single-user mono-tasking operating
systems as a serious effort? With mainframes we had graphics, database
management, multi-user, multi-tasking, advanced architectures and all
manner of work being done in basic computer science. Along comes the
microprocessor and the siren song of easy money causes some very bright
people to spend great chunks of their professional lives on MS-DOS, Windows
and other already-invented-for-mainframes stuff. Right now, I figure OS
development on micros is at about the point we'd reached with a Spectra
70 running VMOS.
In another thread, the discussion is centering on microkernels (been
there, done that on a mainframe) and the need to keep I/O drivers as part
of the kernel. Why? Why should the CPU even have access to I/O ports and
interrupts? Why doesn't every device have its own I/O processor with
access to CPU memory? Or better yet, why not a pool of communicating I/O
processors that can be allocated to the tasks at hand, leaving the CPU to
do what it does best? Is silicon now so expensive that we can't advance to
an idea that's more than 40 years old?
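To make the idea concrete, here's a rough sketch in C of what
channel-style I/O might look like from the CPU's side. Everything
here--the struct layout, the doorbell address, the names--is made up
for illustration, not any real hardware:

    #include <stdint.h>

    /* One command in a channel program; it lives in memory shared
       with the I/O processor.  Layout is entirely hypothetical. */
    struct iop_cmd {
        uint8_t  op;              /* 1 = read, 2 = write */
        uint8_t  flags;           /* bit 0: chain to next command */
        uint16_t device;          /* target device number */
        uint32_t buffer;          /* physical address in CPU memory */
        uint32_t count;           /* bytes to transfer */
        volatile uint32_t status; /* set by the I/O processor when done */
    };

    /* Hypothetical doorbell register of one I/O processor in the pool. */
    #define IOP_DOORBELL ((volatile uint32_t *)0xFFF00000u)

    /* CPU side: hand off a command chain and go do real work.  The
       CPU never touches device ports or fields device interrupts. */
    static void iop_start(struct iop_cmd *chain)
    {
        *IOP_DOORBELL = (uint32_t)(uintptr_t)chain;
    }

The CPU queues up the work and walks away; completion comes back as a
single end-of-chain interrupt or a polled status word. That's more or
less what the big iron was doing with channel programs forty years ago.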
Forgive the rant, but other than bringing computers to the masses, do
microcomputers represent a significant forward movement in the state of the
art?
Cheers,
Chuck
I remember the Adam...what a neat machine. A friend in high school
had one. I remember being utterly fascinated by those cassette
drives...cassettes as a file-structured medium? Neat!
-Dave
--
Dave McGuire
Cape Coral, FL