On: raising the semantic level of a program

Chuck Guzis cclist at sydex.com
Sun Jun 28 18:22:16 CDT 2020

On 6/28/20 3:28 PM, ben via cctalk wrote:
> On 6/28/2020 2:32 PM, Chuck Guzis via cctalk wrote:
>> Why is byte-granularity in addressing a necessity?  It's only an issue
>> if you have instructions that operate directly on byte quantities in
>> memory.
> Why have bytes in the first place then? A packed string does count here.
> IBM started this mess with the 360 and 32 bits, and everybody
> followed. Is Fortran I/O the BIBLE on character data?
> IBM never seemed even to follow one character set encoding even with the
> similar machines like the IBM 1130.

There were lots of machines where the granularity was the word.  The CDC
6000/7000/Cyber 60-bit machines, for example.  The lack of byte
addressing did nothing to hold those machines back.  At one point, CDC
had the fastest COBOL implementation, even though it lacked byte
addressing and decimal arithmetic, something that you'd think would give
an edge to the S/360 series.  Chew on that one for a while.
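
To make the point concrete, here is an illustrative sketch (my own, not
CDC code) of how character data lives on a word-addressed machine: a CDC
60-bit word holds ten 6-bit display-code characters, and software gets
at individual characters with shifts and masks rather than byte loads.

```python
# Sketch of character handling on a word-addressed 60-bit machine.
# Names and layout are assumptions for illustration only.

WORD_BITS = 60
CHAR_BITS = 6
CHARS_PER_WORD = WORD_BITS // CHAR_BITS  # ten characters per word

def pack_chars(codes):
    """Pack up to ten 6-bit character codes into one 60-bit word,
    leftmost character in the high-order bits."""
    word = 0
    for code in codes:
        word = (word << CHAR_BITS) | (code & 0x3F)
    # Left-justify a short string, padding low-order positions with zero.
    word <<= CHAR_BITS * (CHARS_PER_WORD - len(codes))
    return word

def extract_char(word, index):
    """Fetch the index-th character (0 = leftmost) from a word."""
    shift = WORD_BITS - CHAR_BITS * (index + 1)
    return (word >> shift) & 0x3F

word = pack_chars([0o01, 0o05, 0o14])
assert extract_char(word, 1) == 0o05
```

A COBOL or FORTRAN runtime on such a machine does this shifting in
software (or with word-oriented shift instructions), which is why the
absence of byte addressing need not cripple character-heavy workloads.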

Punched cards (at least those of the 1130 era) used 12-bit encoding if
column-binary, or 36-bit encoding if handling row-binary (e.g., on the
704), which is why early programming languages used only the first 72
columns of a card.
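
The arithmetic behind that 72-column limit can be sketched as follows
(a back-of-the-envelope illustration, not production code): in row
binary, each of a card's 12 rows across 72 columns yields two 36-bit
words, so a card image fits exactly 24 words.

```python
# Why row-binary card images stop at column 72 on a 36-bit machine.
COLUMNS_USED = 72   # columns 73-80 left for sequence numbers, etc.
ROWS = 12           # punch rows on a standard card
WORD_BITS = 36      # IBM 704 word size

bits_per_row = COLUMNS_USED                # one bit per column
words_per_row = bits_per_row // WORD_BITS  # two words per row
words_per_card = words_per_row * ROWS      # 24 words per card image
```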

FWIW, I've also worked on bit-granular addressing systems, where bit
arrays were part of the instruction set.
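
A minimal sketch of what bit-granular addressing amounts to (the names
here are my own, for illustration): a flat bit address splits into a
word index and a bit offset, exactly the decomposition such hardware
performs, and word machines emulate in software.

```python
# Bit-granular addressing over word-organized storage (illustrative).

WORD_BITS = 60  # word size assumed for illustration

def get_bit(memory, bit_addr):
    """Read one bit from a list of words treated as a flat bit array."""
    word_index, bit_offset = divmod(bit_addr, WORD_BITS)
    return (memory[word_index] >> bit_offset) & 1

def set_bit(memory, bit_addr, value):
    """Write one bit in the flat bit array."""
    word_index, bit_offset = divmod(bit_addr, WORD_BITS)
    if value:
        memory[word_index] |= 1 << bit_offset
    else:
        memory[word_index] &= ~(1 << bit_offset)

memory = [0, 0]
set_bit(memory, 61, 1)   # lands in word 1, bit 1
assert get_bit(memory, 61) == 1
```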

