On 02/19/2015 06:24 AM, Peter Corlett wrote:
> C will run on all sorts of bizarre machines, but somebody has to
> bother to implement it, and if the architecture is weird enough that
> the language has to be contorted in unexpected ways, it will break
> assumptions made in typical C code. ISTR that current versions of
> the standard assume a binary machine that provides particular word
> widths, but earlier versions give much more flexibility.

Do let me know when you've got C for an IBM 1620. SIMH has a pretty
good emulator for that machine.

> That modern compilers don't support obsolete machines isn't a
> surprise. I can't find a decent modern C compiler that targets m68k,
> for example, even though that architecture is still just about
> clinging on to life.

Again, one needs to ask "why are they considered obsolete now and not
then?" For example, if IBM could have simplified the 7000-series
machines to a single 7090-type architecture, they could have saved money
by not implementing the 7070, 7080, etc.

C is a great high-level assembly language for a certain class of
architectures, I will admit.

The problem with C (and to a lesser extent C++) is the lack of typing
by usage. Does an int hold a character, a boolean value, an index, a
bit sequence, or what? You can alleviate this to some extent with
typedefs, but that practice doesn't seem to be all that prevalent.
Indeed, one indicator of the problem is the "nUxi" problem that showed
up when early developers were porting that particular OS's code.
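
To make that concrete, here is a minimal sketch of "typing by usage"
done with typedefs; the type names are mine and purely illustrative.
Note that the compiler still sees plain int underneath, so nothing
actually enforces the distinctions:

#include <stdio.h>

/* Hypothetical typedefs that name the intended use of each int. */
typedef int      char_code;    /* holds a character value      */
typedef int      boolean;      /* holds a truth value          */
typedef int      array_index;  /* holds an index into an array */
typedef unsigned bit_field;    /* holds a sequence of bits     */

int main(void)
{
    char_code   c     = 'A';
    boolean     done  = 0;
    array_index i     = 3;
    bit_field   flags = 0x0Cu;

    /* Nothing stops you from mixing them; the typedefs only document
       intent for the human reader, so this compiles without complaint. */
    done = c + i;

    printf("done = %d, flags = %u\n", done, flags);
    return 0;
}
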
C does well with character addressing, particularly if a word/int is an
integral multiple of characters in length. But not so well with
bit-addressing, even though bit-addressable architectures can be very
useful (as in vector machines).
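
For what it's worth, this is roughly what bit addressing looks like in
portable C: you can't take the address of a bit, so you index a byte
and shift and mask by hand. The helper names and the LSB-first bit
numbering within a byte are my own assumptions, not anything the
language mandates:

#include <limits.h>
#include <stddef.h>
#include <stdio.h>

/* Read bit number 'bitno' of the array starting at 'base'. */
static int get_bit(const unsigned char *base, size_t bitno)
{
    return (base[bitno / CHAR_BIT] >> (bitno % CHAR_BIT)) & 1;
}

/* Set or clear that same bit. */
static void set_bit(unsigned char *base, size_t bitno, int value)
{
    unsigned char mask = (unsigned char)(1u << (bitno % CHAR_BIT));
    if (value)
        base[bitno / CHAR_BIT] |= mask;
    else
        base[bitno / CHAR_BIT] &= (unsigned char)~mask;
}

int main(void)
{
    unsigned char buf[4] = {0};
    set_bit(buf, 13, 1);
    printf("bit 13 reads back as %d\n", get_bit(buf, 13));
    return 0;
}

Compare that with character addressing, where base[i] does the whole
job in one operation.
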
But if you think you can work out a C for the 1620, please have at it.
Be careful with numeric blanks and record marks...
--Chuck