Tony Duell wrote:
You only need the digits 0-7. At one time there was considerable
resistance to using letters as digits; a second-hand book I bought last
week thinks it's most unsatisfactory to do this.
-----------------------------------------------------------------
Billy: This brings up a question I had for the group. In the early '60s,
hex was not very popular, and it hadn't yet become standard to use A-F. I
worked on one hex machine that used lower case i, j, k, l, m, n and another
that used upper case U, V, W, X, Y, and Z.
Does anyone remember using any other notations? There must have been many
more.
------------------------------------------------------------------
Tony Duell wrote:
Word lengths tend not to be multiples of 3 bits. This means, for example,
you can't easily split a 16-bit word into bytes, or combine 2 bytes into a
word, when you write them in octal. In hex it's trivial. There were various
split-octal notations where you convert each part separately, but they
get confusing fast.
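[A small sketch, not from the original posts, illustrating Tony's point: the byte boundary of a 16-bit word falls mid-digit in octal, but on a digit boundary in hex. The value 0xABCD is just an arbitrary example.]

```python
# Why hex splits a 16-bit word into bytes cleanly and octal doesn't.
word = 0xABCD                     # an arbitrary 16-bit word
hi, lo = word >> 8, word & 0xFF   # the two bytes

# Hex: 4 bits per digit, so the byte boundary lands between digits.
# The word's hex digits are exactly the bytes' hex digits concatenated.
assert f"{word:04X}" == f"{hi:02X}" + f"{lo:02X}"   # "ABCD" == "AB" + "CD"

# Octal: 3 bits per digit, so the byte boundary falls inside a digit.
# The word's octal digits are NOT the bytes' octal digits concatenated.
print(f"{word:06o}")              # 125715  (octal of the whole word)
print(f"{hi:03o} {lo:03o}")       # 253 315 ("split octal" of the bytes)
```

Split octal (253 315 here) is exactly the notation Tony mentions: each byte converted separately, which looks nothing like the word's own octal representation (125715).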
But basically it's just another way of writing numbers which is useful
sometimes (particularly if "you're missing 2 fingers," as Tom Lehrer put it).
-tony
----------------------------------------------------------------------------
Billy:
I took a C class many years back where the instructor started out with the
statement that all computers use word sizes that are multiples of 8 bits. I
couldn't help laughing. After class, I explained to her why and described
the G-15, RPC-4000, etc. I feel a little that way now on the discussion of
octal - how soon we forget.
I want to mention to Tony that for 40+ years I've worked on computers whose
word lengths were multiples of 3 bits. And at one time, they were the biggest
and the fastest
in the world. The CDC 1604, and 3400, 3600, and 3800 were 48 bits. The
924, 3100, 3200, 3300, and 3500 were 24 bits. The 140, 160, 160-A and 8090
were 12 bits. (The 160-G was 13 bits, but every family has one. Besides it
was still an octal machine.)
And of course, all the 6600 and 7600 machines were 60 bits. For 20 years,
these machines dominated the large-computer marketplace.
All of these systems were octal oriented. Hex was never brought up in
polite company.
Then there's the DEC systems of 12 and 18 bits. And many others (word
lengths of 3n bits) among the "7 Dwarves".
Octal was widely used for the first two generations of computing. Hex is a
Johnny-come-lately, with the prime popularizer being the IBM 360.
I've often wondered why dentists didn't use octal/hex to number teeth: 32
for a complete set, divided into 4 quadrants of 8.
Billy