On Tue, Sep 30, 2014 at 1:50 AM, Eric Smith <spacewar at gmail.com> wrote:
C requires that all data types have sizes that are multiples of the
size of the character type, and the character type is required to
have a range of at least -127 to +127 (for signed char) or 0 to 255
(for unsigned char).
I forgot to mention that there is also a requirement that relates all
of the integer types to powers of two, so for instance having an
unsigned character type implemented as 3 BCD digits with a range of
000 to 999 is also a non-starter. It would have to be limited to 000
to 255 or 000 to 511, with the other possible machine representations
being known as "trap representations".
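To make that concrete, here is a minimal sketch (assuming a C99
hosted implementation) that checks both requirements through
<limits.h>:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        unsigned long m = UCHAR_MAX;

        /* sizeof is measured in chars, and sizeof(char) is 1 by
           definition, so every object size is a whole number of
           character-sized units. */
        printf("sizeof(int)  = %zu\n", sizeof(int));
        printf("sizeof(long) = %zu\n", sizeof(long));

        /* unsigned char must use a pure binary representation, so
           UCHAR_MAX is exactly 2^CHAR_BIT - 1.  Any value of the
           form 2^n - 1 satisfies m & (m + 1) == 0; 999 does not,
           which is why the 000-999 BCD range is ruled out. */
        printf("CHAR_BIT  = %d\n", CHAR_BIT);
        printf("UCHAR_MAX = %lu (%s)\n", m,
               (m & (m + 1)) == 0 ? "2^n - 1, as required"
                                  : "not 2^n - 1: non-conforming");
        return 0;
    }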
Note that only the number of non-trap representations is restricted.
While the source-level integer type semantics have to be 0..2^n-1 (for
unsigned), or the near-equivalent signed range, the actual
representation used for the values could be anything. For instance,
the value 0 could be represented as the sequence of BCD digits 3, 9, 2
if you really wanted to do that, as long as all of the arithmetic,
conversion, etc. consistently produced correct results that are not
trap representations. It's allowed to produce a trap representation
for signed overflow. The behavior of actually trying to read an
lvalue with a trap representation is undefined, so it might cause the
machine to catch fire or demons to fly out of your nose, yet still be
standard-conformant.
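You can't portably demonstrate a trap representation, since the
two's-complement machines most of us use don't have any for the
integer types, but the shape of the hazard is the same as reading an
indeterminate value, so here's a purely illustrative C99 sketch:

    #include <stdio.h>

    int main(void)
    {
        int i;  /* indeterminate: on an implementation where int
                   has trap representations, the commented-out
                   read below would be undefined behavior */
        unsigned char *p = (unsigned char *)&i;

        /* Inspecting the bytes through unsigned char is always
           OK: unsigned char is required to have no trap
           representations, so the worst you can get is garbage. */
        for (size_t k = 0; k < sizeof i; k++)
            printf("%02x ", (unsigned)p[k]);
        printf("\n");

        /* printf("%d\n", i);  <- reading i itself as an int is
           the operation the standard leaves undefined */
        return 0;
    }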