Ensor wrote:
<desperately....holding....back....from....commenting.... - oh, stuff it>
Yes, and those are PRECISELY the sort of people who have no business
being computer programmers, IMHO; i.e., they couldn't program their way
out of a wet paper bag.
This sounds like a variation on the language religious war.
But, (and in keeping the post on topic), I submit that writing ASM or ML
on 32 or 64 bit processors like x86 and PowerPC (or even the later 68K
offerings) would be inappropriate for many young children as a first
language.
An 8 or 16 bit CPU offers less to learn (the 6502 has 56 opcodes, I
think) and there are fewer things to consider: no multiple cores, no
re-ordering instructions to fill the pipeline efficiently, no
multi-tasking OS stealing cycles from your app, etc.
As well, in keeping with the "if it's too easy, it hides too many
details" view, rolling your own multiply or divide on a CPU without a
native implementation in silicon is a very useful exercise. An initial
subtraction-based divide routine nicely ties the CPU back to the concept
of long division from school. Working through ways to improve the
divide with bit shifting and/or lookup tables brings the algorithm to
the forefront, which I think is the real value.
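To illustrate what I mean (a sketch in Python rather than 6502 assembly,
just to keep it readable), here are the two stages: the naive
subtract-in-a-loop divide you'd write first, and the shift-and-subtract
version that is literally binary long division, one quotient bit per pass:

```python
def divide_by_subtraction(dividend, divisor):
    """Naive divide: count how many times the divisor can be
    subtracted. Simple, but takes one loop pass per unit of the
    quotient -- painful for large dividends."""
    quotient = 0
    remainder = dividend
    while remainder >= divisor:
        remainder -= divisor
        quotient += 1
    return quotient, remainder

def divide_by_shifting(dividend, divisor, bits=8):
    """Shift-and-subtract divide: binary long division, exactly the
    schoolbook method in base 2. One pass per bit of the dividend,
    regardless of the quotient's size."""
    quotient = 0
    remainder = 0
    for i in range(bits - 1, -1, -1):
        # Bring down the next dividend bit into the remainder.
        remainder = (remainder << 1) | ((dividend >> i) & 1)
        quotient <<= 1
        # Does the divisor "go into" the partial remainder?
        if remainder >= divisor:
            remainder -= divisor
            quotient |= 1
    return quotient, remainder

# Both agree: 200 / 7 = 28 remainder 4.
print(divide_by_subtraction(200, 7))   # -> (28, 4)
print(divide_by_shifting(200, 7, 8))   # -> (28, 4)
```

The jump from the first routine to the second is the point: the loop
structure stops mirroring "subtraction" and starts mirroring the long
division you learned in school.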
So, I think it's more than ML, it's ML on a small enough CPU to make the
learning curve reasonable and to force one to learn the algorithm.
All that said, I'm not sure I agree ML is the best for a budding
programmer. You can use ML and still be clueless. Though, if it works
out and you choose an older platform, that's one more older unit that
is no longer obsolete.
Jim