On 01/06/2019 02:08 PM, Grant Taylor via cctalk wrote:
On 1/6/19 11:25 AM, Guy Sotomayor Jr via cctalk wrote:
I think it's also telling that the IETF uses the term octet in all of
its specifications to refer to 8-bit data, as "byte" (from older
machines) could be anything and is thus somewhat ambiguous.
It *may* have been the IBM 360 that started the trend of byte ==
8 bits, as the 360's memory (in IBM's terms) was byte-addressable and
the instructions for accessing it were "byte" instructions (as opposed
to half-word and word instructions).
Yes it was.
Machines around the 360 in that time frame (mainframes) had 12-, 18-,
36-, or 60-bit words.
The big break came in the mid-1970s with the first micros (8008, 8080,
6800) and bigger machines like the PDP-11 (which did both byte and word
reads and writes) and the TI990.
The emergence of the VAX and other 32-bit machines made the 8-bit byte
common, as terminal I/O was starting to standardize.
Thank you for the clarification.
My takeaway is that before some nebulous point in time (circa IBM's
360) a "byte" could be any number of bits, depending on the computer
being discussed. Conversely, after said nebulous point in time a byte
was standardized at 8 bits.
Is that fair and accurate enough? -- I'm wanting to validate the patch
before I apply it to my mental model of things. ;-)
There is no hard before and after, as systems like the DEC10 and
others persisted for a while. Also, part of it was I/O codes: EBCDIC,
Flexowriter, ASR33 (8-level ASCII vs. 5-level Baudot), and CRT
terminals emerging with mostly IBM or ANSI codes.
I am somewhat DEC- and personal-computer (pre-IBM PC) centric on this,
as they were the machines I got to see and work with that were not in
rooms with glass walls and white-coated specialists.
Allison