Linux and the 'classic' computing world

Paul Koning paulkoning at
Tue Sep 28 15:24:38 CDT 2021

> On Sep 28, 2021, at 3:15 PM, ben via cctalk <cctalk at> wrote:
> On 2021-09-28 11:43 a.m., Vincent Long standing via cctalk wrote:
>> The C standards are more liberal, and continue to require char types to be 8 or more bits.
> Was PL/I the only language that would let you select the data size for variables? Of course, the fine print would not let you have more than 16
> decimal digits, or 32-bit binary. You would think by now that a language
> could handle data of any length.

Python does; its "int" type handles numbers of any size, as long as they fit in memory.  If you create sufficiently large numbers, arithmetic operations may take a while, of course.
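A quick illustration of that point (my own sketch, not part of the original post):

```python
# Python ints are arbitrary precision: no overflow, limited only by memory.
a = 2 ** 1000            # an exact 1001-bit integer
b = a * a + 1            # still exact; no wraparound or loss of precision
print(a.bit_length())    # 1001
print(b % 97)            # exact modular arithmetic on very large numbers
```

For a decimal analogue of PL/I's fixed-precision types, the standard-library `decimal` module lets you set the working precision explicitly.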

>> In principle then, char could still be 9 or more bits, but if int8_t is required for a *NIX, who would do it?  They'd just be making a compatibility mess for themselves.
> Is it not a mess already, as hardware keeps changing standards?
> Look at USB or all the kinds of network protocols.

"The nice thing about standards is that there are so many to choose from" -- Prof. David Clark, MIT.

>> It's also unclear to me whether one's complement representation would be allowed.  (Various other representations would probably be prohibited by other restrictions.)

C has certainly been run on one's complement machines (the CDC 6000 series), but I've heard it said by some who would know that the current standard expects two's complement.  Interestingly enough, floating-point data is, in IEEE and a number of other formats, sign/magnitude encoded, not two's complement.  Some old machines used one's complement for floats, though.
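A small Python sketch of that contrast (my illustration, not from the thread): negating a two's complement integer changes many bits, while negating an IEEE 754 float flips only the sign bit.

```python
import struct

# Two's complement: negation inverts all bits and adds one.
x = 5
print(format(x & 0xFF, '08b'))     # 00000101
print(format(-x & 0xFF, '08b'))    # 11111011  (two's complement of 5)

# IEEE 754 single precision is sign/magnitude: negation flips one bit.
def float_bits(f):
    """Return the 32-bit IEEE 754 pattern of f as a binary string."""
    return format(struct.unpack('>I', struct.pack('>f', f))[0], '032b')

print(float_bits(5.0))    # sign bit 0, then exponent and mantissa
print(float_bits(-5.0))   # identical except for the leading sign bit
```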

> My next computer will be 44 bits, if I ever get the routing and timing bugs out of the FPGA
> prototype card. I can't change the FPGA vendor because I use TTL macros like the 74181, for TTL breadboarding.
> With the 74181 I can have any width I want, so I can play with odd sizes.

For extra credit, create a GCC back end for that design.  :-)

> The more I play with my designs, the more I come to the conclusion that
> 32 bits is not ample for a general-purpose computer.

I think von Neumann would agree; he picked 40 bits, as I recall.  There are all sorts of interesting word lengths out there; the strangest I've worked with is 27-bit one's complement.


More information about the cctalk mailing list