On Wednesday 19 April 2006 09:55 pm, Don Y wrote:
But the Z80 isn't an 8085, nor is the 8085 an 8080
(granted, the last two are much more closely related than the first two).
But the Z80 and the 8085 are both based on the 8080 architecture
and instruction set - so much so that they will both run the vast
majority of 8080 code.
But that's a fallacy. You have to tweak the code in almost
all cases (especially if you are designing embedded systems
and not "desktop applications"). So, a smarter approach is
to handle things at the *source* level instead of the *object*.
Feeling like I'm missing something here...
What are these differences you refer to? I have some vague recollection of
having heard of these someplace before, but the details are escaping me at
the moment.
<...>
But a Z80 *won't* run 808[05] code.
?
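For what it's worth, the most commonly cited incompatibility is the P flag: on the 8080, arithmetic instructions set P to the parity of the result, while the Z80 reuses that same flag bit to report signed overflow (P/V). A minimal Python sketch of the difference (the helper names are mine, not anything from either datasheet):

```python
def parity_even(value):
    """8080 P flag after arithmetic: set if the 8-bit result has even parity."""
    return bin(value & 0xFF).count("1") % 2 == 0

def signed_overflow_add(a, b):
    """Z80 P/V flag after ADD: set on signed (two's-complement) overflow."""
    result = (a + b) & 0xFF
    # Overflow occurs when both operands share a sign bit
    # but the result's sign bit differs from theirs.
    return ((a ^ result) & (b ^ result) & 0x80) != 0

# ADD A with A=0x40, operand 0x40 -> result 0x80:
# 8080 sets P=0 (one bit set, odd parity); Z80 sets P/V=1 (signed overflow).
print(parity_even(0x80), signed_overflow_add(0x40, 0x40))  # -> False True
```

So 8080 code that conditionally jumps on parity after an ADD can silently misbehave on a Z80, which is the sort of thing that forces the source-level tweaking mentioned above.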
<...>
I'd rather have seen some of the opcode space spent
on more useful instructions
-- short load-immediates (where the argument is encoded in the first byte
of the opcode), etc.
I too felt that some things could have been done differently. :-)
<...>
When I was doing Z80-based designs (in the "split
octal" world), a helluva
lot of energy was expended to support the "octal" encoding -- rewriting the
Zilog assembler to generate listings in octal (INCLUDING displaying
addresses in split octal!), building run-time "monitors" to examine and
patch code images during execution, writing the associated software to do
so, etc.
I'm convinced the fact that you could get a ten-key keypad and an
inexpensive LCD to display 6 digit SPLIT octal values had more of an impact
on the octal decision than anything about the opcode encoding -- despite the
fact that it made USING the tools more difficult (since you can only
display a 16 bit value -- *address* -- in 6 digits, you have to multiplex
the display to let the user see the *data* at that address :< ).
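A minimal sketch of the split-octal rendering described above (each byte of the 16-bit word shown as its own 3-digit octal group), which makes the display problem concrete -- the six digits are exactly consumed by the address, leaving nothing for the data byte:

```python
def split_octal(word):
    """Render a 16-bit value in 'split octal': each byte as 3 octal digits."""
    hi, lo = (word >> 8) & 0xFF, word & 0xFF
    return f"{hi:03o} {lo:03o}"

print(split_octal(0x1234))  # -> '022 064'  (six digits: the whole display)
print(split_octal(0xFFFF))  # -> '377 377'
```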
I really do believe that the choices of what you could get for display
hardware and simple decoding chips (hence my recent mention of the 7446 not
decoding hex) influenced a lot of things in that direction; the Heath H8,
for example, worked that way too.
Though I have NO idea why the H11 used octal...
I recall getting an EM-180 and quickly distancing
myself
from the octal vs. hex debate... I'll use *symbols* instead
of dicking around with bit groupings.
Works for me. One of the reasons I parted with that 1802-based system I had a
while back: it had software, but the only way to get it into the machine
was by punching in a whole lot of hex digits on a keypad -- a prospect I found
sufficiently daunting that I never even powered the machine up in all the
years I had it.
Now why some
people chose octal for other processors, which
didn't have an architectural slant toward 8, is more of a mystery
to me....
I think the fact that 0..7 fits in a decimal representation says a lot about
the "why". :-( Amazing to consider how many resources got wasted on
silly (in hindsight) decisions. Sort of like PC (and other) BIOS decisions
placing silly restrictions on the size of a disk or where the boot code
can be located, etc.
Yep!
--
Member of the toughest, meanest, deadliest, most unrelenting -- and
ablest -- form of life in this section of space, a critter that can
be killed but can't be tamed. --Robert A. Heinlein, "The Puppet Masters"
-
Information is more dangerous than cannon to a society ruled by lies. --James
M Dakin