On Wed, 3 Dec 2014, Toby Thain wrote:
> Sure, there are some niches where it's still appropriate.
> My point is that it's used widely in places where it's just not ok,
> for no better reason than "we've always done things this way"*
Very true.
A lot of the problem [if there is one] is "baby duck syndrome".
For example, students will stick with what they learned in school.
BASIC was developed for TEACHING programming; as such, it was a fairly
good choice to build into generations of personal computers.
When Pascal was developed for TEACHING programming, we got a whole
generation who had learned on it, who were never told (or never
accepted) that it was intended for TEACHING, not for real-world
applications, and who proceeded to use it when they got out into the
real world, with mixed results.
When C was the language that was taught, we got generations of C
programmers.
Now, will we be getting generations of SCHEME programmers?
A well-designed language (and compiler) will excel at what it was
designed for, and is not likely to be the best possible choice for
other purposes. Nevertheless, such languages are generally capable of
doing tasks outside their specialty. And, if somebody's experience is
with some specific tool, will forcing a change to other tools, because
of their potentially better suitability, be worth the setback in
experience?
If you employed Tony, and used state-of-the-art diagnostic systems,
would you ask him to give up his voltmeter and logic probe?
Some things, such as runtime error checking, are great for a teaching
language, but lose some of their benefit in the real world.
For example:
Y = 100 / X
could be disastrous if X == 0
Should the compiler add error checking?
IF (X == 0) ...
ELSE Y = 100 / X
But, if you KNOW that X is not 0:
IF (condition) X = 2, ELSE X = 5
Y = 100 / X
do we really want the bloat and slowdown of checking whether X is 0?
Do we expect the COMPILER to have kept track of what possible values X
might have at that time? (Possible in this trivial case, but not in
anything much more complex.) WE might know what X was used for, and
therefore its range of possibilities, but the compiler wasn't a party
to all of the design conferences. The programmer still knows more
about what is intended than the compiler does.
If you DON'T know that X is non-zero, then YOU should insert appropriate
tests. But, too many programmers will write Y = 100 / X without including
the essential safety test. The fault is with the programmer, not the
language.
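For illustration, here is a minimal C sketch of that "essential safety
test" (my example only; the name checked_div is nothing standard):

    #include <stdio.h>

    /* The programmer, not the compiler, supplies the zero test. */
    static int checked_div(int x, int *y)
    {
        if (x == 0)
            return 0;               /* caller decides how to handle it */
        *y = 100 / x;
        return 1;
    }

    int main(void)
    {
        int y;
        if (checked_div(7, &y))
            printf("100 / 7 = %d\n", y);    /* prints 14 */
        if (!checked_div(0, &y))
            printf("refused to divide by zero\n");
        return 0;
    }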
Similarly,
F = C * (9/5) + 32 (note that 9 and 5 are integers!)
or
IF(((float)X/3.0) * 3.0 == X) . . .
or
(int32)X<<32
might be programmer errors. It would be NICE if the compiler would notice
the mistake and WARN us, but it is not the fault of the compiler if we
make the mistake, or even if the compiler author didn't include code to
check for our particular mistake.
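To see how the first of those bites, a small C demonstration (mine,
purely illustrative):

    #include <stdio.h>

    int main(void)
    {
        int c = 100;            /* 100 degrees C should be 212 F */
        /* 9/5 is INTEGER division, so it evaluates to 1 */
        int wrong = c * (9 / 5) + 32;           /* 132 */
        /* multiply first so the fraction survives, or use floats */
        int fixed = c * 9 / 5 + 32;             /* 212 */
        double fixedf = c * 9.0 / 5.0 + 32.0;   /* 212.0 */
        printf("%d %d %.1f\n", wrong, fixed, fixedf);
        return 0;
    }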
The task of the compiler is to do what we ASKED for;
secondarily, it would be nice if it does what we WANT.
(cf. old Yiddish curses)
Likewise,
X << N would be treated as X << (N & 31) by the 80386;
X << 32 would be treated as X << 0 by the 80386.
It is still the programmer's responsibility to learn all such
DOCUMENTED characteristics, and even the UNDOCUMENTED ones that
have already been encountered.
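A quick C demonstration of that 80386 behavior (my sketch; C itself
calls any shift by 32 or more on a 32-bit operand undefined, so the
volatile count is there to force a real runtime shift):

    #include <stdio.h>

    int main(void)
    {
        unsigned int x = 1;             /* assumes 32-bit unsigned int */
        volatile unsigned int n = 32;   /* defeat constant folding */
        /* The 386 masks the shift count to 5 bits, so a count of 32
           acts as a count of 0; on such a machine this prints 1.
           A compiler folding the constant might print 0 instead. */
        printf("1 << 32 at runtime = %u\n", x << n);
        return 0;
    }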
> * - 1973 is now outside living memory for a lot of developers.
Oh, I remember it well.
Aerospace had just collapsed. In some areas,
auto garages were full of EEs working as mechanics.
In 1973, I declared that I would get back into computers when
"tabletop computers" became readily available, and I opened an
auto repair shop working on Honda cars.
For a while, my employees chided me about being the only one who
hadn't been through graduate school and only had a BA.
I was fascinated by S-100,
but the coming of the TRS-80, PET, and Apple brought me back.
Yes, "a lot of". Way too many of my friends have died.
--
Grumpy Ol' Fred                cisin at xenosoft.com