On 2021-09-28 11:43 a.m., Vincent Long standing via cctalk wrote:
The C standards are more liberal, and continue to require char types to be 8 or more bits.
Was PL/I the only language that would let you select the size of data variables? Of course, the fine print would not let you have more than 16 decimal digits, or 32-bit binary. You would think that by now a language could handle data of any length.
In principle then, char could still be 9 or more bits, but if int8_t is required for a *NIX, who would do it? They'd just be making a compatibility mess for themselves.
Is it not a mess already, as hardware keeps changing standards? Look at USB or all the kinds of network protocols.
It's also unclear to me whether one's complement representation would be allowed. (Various other representations would probably be prohibited by
My next computer will be 44 bits, if I ever get the routing timing bugs out of the FPGA prototype card. I can't change the FPGA vendor, because I can use TTL macros like the 74181 for TTL breadboarding.
With the 74181 I can have any width I want, thus I can play with odd sizes.
The more I play with my designs, the more I come to the conclusion that 32 bits is not ample for a general-purpose computer.