On 1/9/12 7:55 AM, "Liam Proven" <lproven at gmail.com> wrote:
The thing that you seem unwilling to address is this.
For one language - or one family - to use a non-algebraic notation,
one that is therefore unfamiliar to most of the literate, numerate
human race, i.e. most programmers, is a major issue. The
advantages of that notation would have to be pretty damned stunning to
make it worthwhile, because even Lisp textbooks, of which I have
several, all tend to devote a chapter or two to teaching the notation,
complete with exercises and notes to the effect that "we know this
seems weird, but it gets easier with practice, believe us".
Example:
http://nostoc.stanford.edu/Docs/livetutorials/lispintro.html
Quote:
"This seems complex (or at least verbose!), but when you get used to
it you'll find that it's actually very easy to follow, and incredibly
convenient in many ways. Here are some others to try out. See if you
can figure out what they evaluate to before trying them:"
Why should programmers *have* to get used to it? The existing notation
that everything else uses is not broken. People do not need pages of
exercises to get used to it. They do not have to do exercises in order
to work out what an expression means.
You're close to convincing me that Dijkstra was right. :-)
The sticking point here seems to be that you assume that everything
*should* use algebraic notation, just because "everyone knows it." Such
thinking complicates the use of computers for non-numeric data or
concepts. (Do you use algebraic notation for writing sentences?) The
first electronic computers were intended for numerical processing, and
McCarthy's genius was seeing past that and designing a simple, elegant
programming language that allows the expression of logical operations on
symbolic data without relying on, for instance, mapping Boolean 'false'
to one numeric value and 'true' to all others (!). Not to be
holier-than-thou: I realize now that my earliest attempts to learn Lisp
were doomed by the fact that I expected to perform numeric calculation,
which is only a subset of what we do with computational devices. I think
Dijkstra was wrong that the 'damage' was irrevocable, but otherwise he was
spot on.
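
To make that point concrete, here is a tiny Common Lisp sketch of my own
(illustrative only, not taken from McCarthy or from the tutorial above),
showing symbolic data handled directly, with truth as a symbol rather
than as "any non-zero number":

  ;; Facts are just lists of symbols; no numeric encoding anywhere.
  (defparameter *facts* '((socrates is a man)
                          (every man is mortal)))

  (member '(socrates is a man) *facts* :test #'equal) ; a true (non-NIL) value
  (numberp 'socrates)                                 ; NIL: a symbol, not a number
  (not nil)                                           ; T: truth is the symbol T
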
The *job* of a high-level language is to make life easier for human
beings, or we'd all write everything in machine code. A language which
needs people to learn new, difficult patterns and practices because
they are more convenient for the interpreter or compiler, or are
easier for the program itself to manipulate, is a language biased in
favour of the computer and the implementer and against the user, and
that is a profoundly dangerous direction to take. I
might go so far as to say it is a serious flaw.
But here, we are talking about a language that asks people to learn "new
[...] patterns and practices" (which assumes they've already learned other
patterns and practices) because it clarifies underlying ideas and frees
computing from its artificially imposed numerical roots. It's not about
convenience for the machine, but about empowerment for the human mind.
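
A small sketch of what I mean by "clarifies underlying ideas" (again my
own illustration, nothing more): because programs and data share one
notation, a piece of code is itself a list that the ordinary list
operations can inspect or build:

  (defparameter *expr* '(+ 1 (* 2 3)))  ; data: a list of symbols and numbers
  (first *expr*)                        ; the operator, the symbol +
  (eval *expr*)                         ; the same list treated as code: 7

No second, separate syntax is needed in order to talk about the program
itself.
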
As an aside: I'm seeing a related challenge for many in the social
sciences. They have become convinced by the rhetoric of the positivists
that if they don't use numbers they aren't doing science. So they use
complicated methods to map natural phenomena onto numbers - which is very
similar to what we are often forced to do with algebraically structured
programming languages, unless our problem domain is merely "take these
numbers, massage gently, and produce those numbers." That's fine for
accounting, but often insufficient for understanding the world. If I have
to go through programmatic gyrations to express problems as algebraic
expressions, has the HLL made my life easier? -- Ian