From: Neil Thompson <albiorix at gmail.com>
> I'm convinced that Dijkstra (and anyone else who came out with similar
> comments) was full of horseshit. In my opinion, it's the ability to
> translate a real world "thing" into an algorithm that is the essence of
> programming, and anyone who has managed to learn (particularly on their
> own, as many of us did) that ability has learned something that transcends
> the language (or tool) they use to implement the algorithm.
There's definitely truth to this. The main thing that makes a good
programmer isn't memorization of language features or syntax, it's
good mental organization and thinking habits; the ability and practice
of really *thinking through* the steps involved in solving a problem,
building a solid mental model of the relevant data structures and
algorithms, and then breaking those down into component steps until
one arrives at a suitable representation in native-language
operations. If someone has a good understanding of that, they can
apply it (with varying amounts of blood, sweat, and tears) in any
language; if they don't, there's no language in the world that can
impart it to them (no matter *what* the flavor-of-the-decade Savior Of
All Programming Forever is - "Try Swift! It's the new Pascal!").
*That said,* there are definitely some languages that are more
conducive to building these habits than others (and, within each
group, many that emphasize different aspects more or less strongly.) I
can't speak to COBOL as I've never had cause to get any experience
with it, but I would say that BASIC (as in, the old-school,
unstructured BASICs of the Bad Old Days) really does teach you a bunch
of habits that you end up needing to un-learn as soon as you start
working with better languages (not even *newer* languages - ALGOL and
Lisp both predate it.)
Line-number-and-GOTO programming imposes the same burden of bookkeeping
and space-management on the programmer as direct machine-code monitor
hacking and the most primitive assemblers, but without any rational
explanation as to why. Any novice attempting to create a program of any
real complexity ends up instilled with a superstitious dread over the
ludicrous non-question of where to put things: do I space statements N
numbers apart? What if I need to add more than N-1 intervening
statements later!? Should I place my subroutines on even 1000s for easy
reference? Will the line numbers even go high enough!? Meanwhile, the
lack of scoped/local variables or any parameter-passing mechanism for
GOSUB makes any non-trivial modularization nearly impossible, and the
READ/DATA structure is just flat-out demented.
And all that mental exhaustion *before* the newbie even gets to the
*real* challenges of learning to program!
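To make the GOSUB complaint concrete, here's a minimal Python sketch of how a line-numbered, unstructured BASIC executes (a hypothetical toy interpreter, not any real BASIC dialect): every variable lives in one flat global namespace, GOSUB remembers only a return line, and the *only* way to hand a subroutine an argument is to agree on a shared global. Even the "END" at line 40 illustrates a real hazard - without it, execution falls straight through into the subroutine body.

```python
variables = {}        # one flat, global namespace: no locals, no scopes
return_stack = []     # GOSUB pushes a return position; RETURN pops it

def run(program):
    """Execute a dict of {line_number: statement}, BASIC-style.

    Each statement is a callable taking the variable table; it returns
    None to fall through to the next line, or a tuple like
    ("GOTO", 1000), ("GOSUB", 1000), ("RETURN",), ("END",).
    """
    lines = sorted(program)
    pc = 0  # index into the sorted line numbers
    while pc < len(lines):
        action = program[lines[pc]](variables)
        if action is None:                  # fall through to next line
            pc += 1
        elif action[0] == "GOTO":
            pc = lines.index(action[1])
        elif action[0] == "GOSUB":
            return_stack.append(pc + 1)     # remember where to come back
            pc = lines.index(action[1])
        elif action[0] == "RETURN":
            pc = return_stack.pop()
        elif action[0] == "END":
            break

# Double N via a "subroutine" at line 1000. There is no way to pass it
# an argument: caller and callee must both hard-code the global N.
program = {
    10:   lambda v: v.update(N=21),
    20:   lambda v: ("GOSUB", 1000),
    30:   lambda v: print("RESULT", v["N"]),
    40:   lambda v: ("END",),   # without this we'd fall into line 1000!
    1000: lambda v: v.update(N=v["N"] * 2),
    1010: lambda v: ("RETURN",),
}
run(program)   # prints: RESULT 42
```

Note what's missing: nothing here scopes N to the subroutine, nothing passes it in or returns a result, and renumbering a line means re-threading every GOTO/GOSUB target by hand - which is exactly the bookkeeping the novice is left to sweat over.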
Now, Dijkstra was a self-important ponce given to wild all-or-nothing
proclamations and manifestos (manifestes? Manifesti?), and even if we
give him the benefit of the doubt and assume that his statements
quoted here were meant tongue-in-cheek, they're still pretty
ridiculous. And God knows the Appointed Language Messiah in that great
holy war, Pascal, was its own special breed of Hell for novices and
experts alike (array size as type qualifier? Just kill me now...) And
it's definitely true that plenty of people could and did learn to
program in BASIC and still went on to learn better and do Good Things
down the line. But there absolutely are such things as bad programming
languages.