On 3 Jul 2012 at 11:24, Sean Conner wrote:
char 8 bits or larger [1]
short 16 bits or larger
long 32 bits or larger
An int can't be shorter than a short, and can't be longer than a
long.
-spc (At least we get signed/unsigned integers in C; in Java, no such luck)
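As an aside, those minimums can be checked mechanically. Here's a minimal sketch (mine, not Sean's), assuming a C11 compiler for _Static_assert and the standard <limits.h> bounds:

#include <limits.h>

/* char: at least 8 bits */
_Static_assert(CHAR_BIT >= 8, "char is at least 8 bits wide");
/* short: at least 16 bits, i.e. it can hold +/-32767 */
_Static_assert(SHRT_MAX >= 32767, "short covers a 16-bit range");
/* long: at least 32 bits, i.e. it can hold +/-2147483647 */
_Static_assert(LONG_MAX >= 2147483647L, "long covers a 32-bit range");
/* int sits somewhere between short and long */
_Static_assert(INT_MAX >= SHRT_MAX && INT_MAX <= LONG_MAX,
               "int is no shorter than short, no longer than long");

int main(void) { return 0; }

Those assertions are guaranteed to pass on any conforming compiler; the point is only that the guarantees are minimums, not exact widths.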
The root of the problem is that C (and associated programming languages) uses "int" as a hold-all for things that aren't integer in nature. Think about it: doing operations such as shifting and applying the logical operators to an integer isn't arithmetic in nature; it's binary logic.
Adding, subtracting, multiplying, dividing and comparing integers make arithmetic sense. ANDing, ORing, EXORing, shifting and NOTing do not; they're operations reserved for strings of bits. But that's been a weakness of C since K&R put pen to paper.
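To make that concrete, here's a small sketch (my illustration, not a prescription) of how C lets the same int be both a number and a string of bits, and of the usual convention of pushing bit-twiddling onto unsigned fixed-width types from <stdint.h>:

#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    int n = -3;

    /* Arithmetic on an integer: means exactly what it says. */
    printf("n * 2    = %d\n", n * 2);

    /* Bit-logical operations on the same signed int: legal, but the
       result depends on how negative numbers are represented. */
    printf("n & 0xFF = %d\n", n & 0xFF);

    /* The usual workaround: treat bit strings as unsigned, fixed width. */
    uint32_t bits = 0x0000ABCDu;
    bits = (bits << 8) | 0xEFu;   /* well-defined shift and OR */
    printf("bits     = 0x%08" PRIX32 "\n", bits);

    return 0;
}

Nothing in the language forces that separation; it's only convention, which is exactly the weakness I'm complaining about.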
My fear is that C and other languages are constraining our ideas of
computer architecture unduly.
--Chuck