While this question has been well answered, it does bring to mind a
good old trivia question....
"What significant advantage did octal have over
hex notation
(especially in the late '60s timeframe)?"
Well, I wasn't around then (at least not with respect to computers).
But I'd hazard a few guesses.
- Word lengths that were multiples of 3 were more common than word lengths
  that were multiples of 4 (of course, some, e.g. 36 bits, were both); see
  the quick sketch after this list.
- Existing devices (e.g., nixies) could handle 0-7 better than 0-F.
- Using letters as "digits" ran into human-mindset trouble; using a
  decimal representation for each hex digit ran into bigger trouble,
  since it takes multiple characters per functional digit.
- Humans have trouble keeping 16 distinct digits straight. (I know I do;
  I always have to stop and pay attention to avoid getting B and D confused.)
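
(For what it's worth, here's a tiny Python sketch of the bit-grouping
arithmetic behind that first guess; the word sizes are just illustrative
examples I picked, not a survey of period machines.)

    # Octal digits encode 3 bits each, hex digits 4 bits each; check which
    # word lengths split into a whole number of digits.  The word sizes
    # below are examples only (12/18/36-bit vs 16/32-bit machines).
    for bits in (12, 18, 36, 16, 32):
        print(f"{bits:2d}-bit word: {bits / 3:4.1f} octal digits, "
              f"{bits / 4:4.2f} hex digits")

An 18-bit word, for instance, comes out as exactly 6 octal digits but
4.5 hex digits, which is the sort of mismatch that made octal the natural
choice on those machines.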
How close did I come? :-)
/~\ The ASCII der Mouse
\ / Ribbon Campaign
X Against HTML mouse(a)rodents.montreal.qc.ca
/ \ Email! 7D C8 61 52 5D E7 2D 39 4E F1 31 3E E8 B3 27 4B