On Dec 4, 2014, at 3:38 PM, Peter Corlett <abuse at cabal.org.uk> wrote:
On Thu, Dec 04, 2014 at 02:21:50PM -0800, Guy Sotomayor wrote:
On Dec 4, 2014, at 1:50 PM, Peter Corlett <abuse at cabal.org.uk> wrote:
[...]
So on systems where sizeof(int) <= sizeof(int32_t) -- which is everything that matters
Really? Where have you been? On OS X the default has been to compile for 64-bits, in which case sizeof(int) == sizeof(int64_t), since Leopard (10.5) in 2009. The kernel went default 64-bit in Snow Leopard (10.6) in 2010. OS X on x86 has always supported mixed 32/64-bit applications (as long as the CPU did) regardless of what the kernel was (a 32-bit kernel could run 64-bit applications).
Sorry, but you're wrong:
The width of a *pointer* matches the architecture, 32 bits for i386 and 64 for x86_64, but the width of an *int* remains 32 bits for compatibility and performance reasons. The 64-bit integer type is called "long long" on both architectures. Obviously, one should use the typedefs in <inttypes.h> if a specific-width integer is required, even if only to document intent.
Have a look at the SysV ABI at http://www.x86-64.org/documentation.html for far too much gory detail and prime pedant material.
Yep. You're right. I was reading it wrong for some reason. ;-)
I almost never use a plain int in my code anyway. I always use the sized versions (and I almost *never* use the signed versions...always the unsigned).
TTFN - Guy