ISTR that current versions of the [C] standard assume a binary machine
that provides particular word widths, but earlier versions give much
more flexibility.
Current C does require a binary machine - there are various constraints
which collectively require that everything be made up of bits - but I
think the only word width constraints are minimums. (In practice, of
course, trying to run with, say, 9-bit chars, 18-bit ints, and 36-bit
longs is likely to break a Whole Lot of software.)
I've occasionally toyed with the idea of building a `code checkout'
compiler, one which goes out of its way to break various assumptions
which are not promised by the language but are nevertheless true of the
obvious implementation on all even vaguely common machines. (For
example, void * and char * would be larger than, and use a different
representation from, other pointer types.)
> Which brings up the question "Do you design a machine to run a
> language or design a language to run on a machine?" That appears to
> have changed over the last 30-40 years. Are we poorer for that?
Not for that per se. But I think we _are_ poorer for the monocultures
it's led to. (I consider pretty much _any_ monoculture a bad thing.)
/~\ The ASCII Mouse
\ / Ribbon Campaign
X Against HTML mouse at
rodents-montreal.org
/ \ Email! 7D C8 61 52 5D E7 2D 39 4E F1 31 3E E8 B3 27 4B