On 07/02/2012 08:09 PM, Dave wrote:
> On every platform I work on, and indeed every platform I've *ever*
> worked on, unsigned short has been 16 bits, and unsigned int has been 32.
I own one where int is 16 bits, have worked on a compiler for one where
int is 24 bits, and have used one where short is 32 bits and int 64.
There is reason for the Tenth Commandment.
Yes, so have I. But I'm a commercial embedded systems
developer (one who generally doesn't use DSPs!), and I don't
work on any of "those" platforms. Literally EVERYTHING I USE
has 32-bit ints and 16-bit shorts, and that has been the case
for upwards of thirty years. From my point of view here, I
see absolutely no reason to add a level of indirection to the
native compiler data type when it gives me (ME) no benefit.
I think that's very true for your environment, but most mortals are working in
areas where 64-bit computing is becoming prevalent, and we may get caught,
although at present things seem to have been done sensibly...
Hmm. My two development platforms are x86_64 and UltraSPARC-III+,
both of which are 64-bit. An int is 32 bits, and a short is 16 bits on
both, by default.
My targets are most commonly AVR and ARM7. Ints there are 32 bits and
shorts are 16 bits as well.
This has got to be the most corner-case-obsessed group of people I have
ever met. ;) (don't get me wrong, I find it fun!)
I think so long as you understand the risks it's OK. Again, many don't.
And here's the crux of the matter. I agree 100%.
-Dave
--
Dave McGuire, AK4HZ
New Kensington, PA