Dave Dunfield wrote:
>> When I was developing Z80-based products, an ongoing *battle*
>> was the use of hex vs. "split octal" (e.g., 0xFFFF -> 0377 0377).
>> The octal camp claimed the Z80 was an "octal machine" (oh, really?)
>> and, for "proof", showed how so many of the opcodes could be
>> committed to memory just by noting the source & destination
>> register "codes" and packing them into an octal representation:
>> xx xxx xxx (of course, I wonder how well their argument would
>> stand up if Zilog had opted to encode the register fields
>> as: xs dds dsx?? :> )
> I'm not promoting the Octal side (indeed I much prefer HEX), however
> Zilog didn't "opt" for anything - they based their design and instruction
> set decoding on the Intel 8080, which was laid out in a manner which
> made sense with "Octal". And Intel DIDN'T use xsddsdsx, they DID use
> xxdddsss - which made perfect sense from an Octal standpoint (which
> is why so many people promoted the use of Octal with it).
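
To see why that layout reads so naturally in octal, here's a minimal
C sketch (my own illustration, not code from either post): the two-bit
group field and the two three-bit register fields of an opcode byte
fall exactly on octal digit boundaries, so the digits can be read
straight off the listing.

    #include <stdio.h>

    int main(void)
    {
        /* 8080 register codes: B=0 C=1 D=2 E=3 H=4 L=5 M=6 A=7 */
        const char *reg[] = { "B", "C", "D", "E", "H", "L", "M", "A" };
        unsigned op = 0170;             /* MOV A,B, written directly in octal */

        unsigned group = (op >> 6) & 3; /* top octal digit: instruction group */
        unsigned dst   = (op >> 3) & 7; /* middle octal digit: destination */
        unsigned src   = op & 7;        /* low octal digit: source */

        printf("%03o (hex %02X): group %o, dst %o, src %o -> MOV %s,%s\n",
               op, op, group, dst, src, reg[dst], reg[src]);
        return 0;
    }

That prints "170 (hex 78): group 1, dst 7, src 0 -> MOV A,B" - the
octal digits are literally the field values.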
But the Z80 isn't an 8085, nor is the 8085 an 8080 (granted, the
last two are much more closely related than the first two).
And there is no reason why xx ddd sss is any *better* than
xs dsd sdx or sd xxd ssd for an instruction encoding. *We*
used (split) octal because our MTOS supported hot patching
and it was convenient to "hand assemble" code patches on the
fly to fix bugs, etc. (gdb wasn't around for an 8080 in ~1976)
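
For flavor, a sketch of the arithmetic behind such a hand-assembled
patch (the target address here is made up for illustration): an 8080
JMP is opcode 0xC3 - octal 303 - followed by the low and then the high
address byte, so a three-byte patch can be worked out and keyed in
directly in split octal.

    #include <stdio.h>

    int main(void)
    {
        unsigned short target = 0x1234;     /* hypothetical patch address */
        unsigned char patch[3];

        patch[0] = 0xC3;                    /* JMP opcode, octal 303 */
        patch[1] = target & 0xFF;           /* low address byte first */
        patch[2] = target >> 8;             /* then the high byte */

        for (int i = 0; i < 3; i++)
            printf("%03o ", patch[i]);      /* prints: 303 064 022 */
        printf("\n");
        return 0;
    }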
>> Octal? Hex? Just give me a symbolic debugger and let *it*
>> keep track of these minutiae...
> By the time the Z80 was common, so were good assemblers and
> debuggers - but when the 8080 (i.e., this instruction set) was designed,
> there was no such software commonly available for "personal use" -
> chances are your assembler was a pad of paper, a pencil and the Intel
> databook (that's how I wrote my earliest software). More often than not
> the debugger was a set of binary switches, or if you were really
> high-tech, a very simple poke-into-memory monitor squeezed into a 1702.
> My first coding job was porting a 4004-based product to the 8080
> (actually, the 8085 came out before we were done, so that became
> the target -- no real differences from the software standpoint).
> The only "symbolic" thing about it was the pad of paper (sometimes
> it was an integrated symbolic assembler/debugger, where you wrote
> out the code on one side of the pad, the hex opcodes in the middle
> and patches/debug notes on the right - all on the same pad).
> That's why a lot of people opted for Octal with the 8080 - in Octal
> the instruction set was fairly easy to remember ... For example
> '1ds' gave you all the MOV combinations. If you had the instruction
> set and its encoding committed to memory, you could go a LOT
> faster during the "assembly phase". It also made debugging easier.

Sure! And for the 4004 we carried a small sheet of paper
neatly folded in half and tucked in our wallets.

> I never got to like Octal, so I worked in hex anyway - I
> made "cheat sheets" to look up the opcodes, which I eventually
> memorized - I could go faster than most of the "Octal" guys...
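
For what it's worth, the whole '1ds' block can be generated by a tiny
C loop (again my own sketch, with one caveat: octal 166, the would-be
MOV M,M, is actually HLT):

    #include <stdio.h>

    int main(void)
    {
        const char *reg[] = { "B", "C", "D", "E", "H", "L", "M", "A" };

        /* 01 ddd sss -> octal "1ds"; octal 166 ("MOV M,M") is really HLT */
        for (int d = 0; d < 8; d++)
            for (int s = 0; s < 8; s++)
                printf("MOV %s,%s = %03o%c", reg[d], reg[s],
                       0100 | (d << 3) | s, s == 7 ? '\n' : ' ');
        return 0;
    }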
>> Ah for the days of toggling in bootstrap loaders with front
>> panel switches.. :-/ (at least bigger machines treated octal
>> as "real" octal and not this "split octal" nonsense...)
> Agreed - one of the things I disliked most about Octal was the
> ambiguity over representation of 16-bit quantities, and the fact
> that you had to either "convert" to do it "right", or put up with a
> little 2/3 digit in the middle (which made mental arithmetic
> challenging) - the fact that you could just "stick two hex bytes
> together" was reason enough for me to stick with it.
Exactly. Hence the advantage of things like the Nova over
the *80 devices -- you *know* you're dealing with words
instead of trying to cover both possibilities (with "split" octal).
I can still hear myself mumbling "High Low 377 377" (r.hl <- 0xFFFF).
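
A quick worked example (my own, as a C sketch) of that asymmetry: a
byte is 2 2/3 octal digits, so the third octal digit of the "true"
16-bit form straddles the byte boundary - "377 377" doesn't glue
together into "177777" digit-by-digit, while the hex bytes concatenate
as-is.

    #include <stdio.h>

    int main(void)
    {
        unsigned short word = 0xFFFF;   /* the "High Low 377 377" value */
        unsigned char hi = word >> 8;
        unsigned char lo = word & 0xFF;

        printf("split octal: %03o %03o\n", hi, lo);  /* 377 377 */
        printf("true octal:  %06o\n", word);         /* 177777 */
        printf("hex:         %02X%02X\n", hi, lo);   /* FFFF - bytes just butt together */
        return 0;
    }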
BTW - something I forgot to mention in my earlier post: MITS
wasn't going against "everyone else" - there were a number
of other companies which embraced Octal for the 8080. Heath
is another "biggie", but I saw a good chunk of other Octal-based
equipment during the time period.