On Thu, Jan 14, 2021 at 07:43:13PM -0800, Chuck Guzis via cctalk wrote:
> APL was difficult for those used to traditional programming languages,
> not primarily because of the character set, but because it's basically
> a vector/matrix programming language.
It is *also* the use of symbols. Firstly, some people are just symbol-blind
and prefer stuff spelled out in words. It's just how brains are wired.
Secondly, beyond BODMAS, the meaning and precedence of random symbols is
unclear to casual readers. Calls to named functions tend to be more
descriptive -- "map" and "filter" mean the same sort of thing across
functional(-inspired) languages -- and the precedence is obvious thanks to
the explicit parentheses around the arguments.
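For instance (a sketch in Python, though "map" and "filter" read much the same in most languages that have them):

```python
# Named functions make intent and grouping explicit: filter keeps
# the even numbers, map squares each survivor. No precedence table
# needed -- the parentheses show exactly what applies to what.
nums = [1, 2, 3, 4, 5, 6]
evens_squared = list(map(lambda n: n * n,
                         filter(lambda n: n % 2 == 0, nums)))
print(evens_squared)  # [4, 16, 36]
```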
At a guess, part of the reason APL does this is so that the programmer
pre-tokenises the input to make it easier for the language to process.
Sinclair BASIC did this too, to much wailing and gnashing of teeth. It may
even have been inspired to do this by APL, given that the manual says
Sinclair BASIC was written by a "Cambridge mathematician".
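The idea of keyword tokenisation can be sketched like this (illustrative Python; the token byte values here are made up for the example, not Sinclair's actual codes):

```python
# Hypothetical keyword-to-token table; the real machines used their
# own byte assignments, this only illustrates the idea.
TOKENS = {"PRINT": 0xF5, "GOTO": 0xEC, "LET": 0xF1}

def tokenise(line: str) -> bytes:
    """Replace each known keyword with a single token byte,
    leaving everything else as plain ASCII."""
    out = bytearray()
    for word in line.split(" "):
        if word in TOKENS:
            out.append(TOKENS[word])
        else:
            out.extend(word.encode("ascii"))
        out.append(ord(" "))
    return bytes(out[:-1])  # drop the trailing space

print(tokenise("PRINT X"))  # b'\xf5 X'
```

The interpreter then never has to re-scan keyword text: a whole keyword arrives as one token, which is exactly what made entry on a Sinclair feel so alien.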
The "modern" vector/matrix programming language most commonly used by
contemporary programmers is probably SQL. It's amazing how many people
use it inefficiently as a key-value store when it's really a
matrix-transformation language, even though it *looks* like an imperative
language. The 1970s has a lot to answer for.
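The contrast looks something like this (a sketch using Python's built-in sqlite3; the table and column names are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("alice", 10.0), ("bob", 5.0), ("alice", 7.5)])

# Key-value style: fetch rows one by one, do the work in the client.
total = 0.0
for (amount,) in conn.execute(
        "SELECT amount FROM orders WHERE customer = 'alice'"):
    total += amount

# Set-based style: one declarative transformation over the whole table,
# which the engine is free to optimise as it likes.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders "
    "GROUP BY customer ORDER BY customer"
).fetchall()

print(total)  # 17.5
print(rows)   # [('alice', 17.5), ('bob', 5.0)]
```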
> It's a different world from BASIC, for sure.
Yes, well, a lot of BASIC programmers have even more fundamental problems
with understanding important programming concepts, such as recursion and
pointers/references. Without those, a lot of important algorithms cannot be
implemented, or even understood.
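Take a recursive tree walk (plain Python, with a made-up tree shape): the call stack carries per-call state, which is exactly what a classic BASIC with global variables and non-re-entrant GOSUB cannot give you.

```python
# A tiny binary tree as nested tuples: (value, left, right).
tree = (5, (3, None, None), (8, (7, None, None), None))

def tree_sum(node):
    """Sum every value in the tree. Each recursive call gets its own
    'node' -- the local state classic BASIC's GOSUB cannot provide."""
    if node is None:
        return 0
    value, left, right = node
    return value + tree_sum(left) + tree_sum(right)

print(tree_sum(tree))  # 23
```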
> Neil maintained that its strength lay in thinking about things in a
> non-scalar way. I'll give him that--programming on STAR, where a scalar
> was treated by the hardware as a vector of length 1 (and thus very slow
> because of startup overhead) certainly led you toward thinking about
> things in vector operations, just like APL.
Modern x86-64 (and ARM etc) also (finally!) has useful vector instructions.
Unfortunately, the popular languages do not make their use terribly simple,
and mostly rely on compilers recognising idiomatic loop patterns over
scalars and transforming them. This works about as well as you might expect.
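NumPy is one of the few mainstream escapes from that: it exposes whole-array operations directly, dispatching to compiled loops rather than hoping a compiler spots the scalar pattern. A sketch of the two styles:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([5.0, 6.0, 7.0, 8.0])

# Scalar idiom: the loop shape an auto-vectoriser must recognise
# and transform before any vector instructions get used.
scalar_dot = 0.0
for i in range(len(a)):
    scalar_dot += a[i] * b[i]

# Explicit whole-array operation: no pattern matching required;
# NumPy runs this in compiled, SIMD-friendly code.
vector_dot = float(np.dot(a, b))

print(scalar_dot, vector_dot)  # 70.0 70.0
```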