On Fri, 29 Jan 2021, Chuck Guzis via cctalk wrote:
Well, part of the confusion lies in the difference between "=" in
mathematics, indicating a property or state, and computer languages using
it as an operator. It's a subtle distinction, but an important one.
D = 4AC in mathematics establishes a property of D, whereas
D = 4*A*C in BASIC, etc., means "multiply 4 by A, then take that result
and multiply it by the value of C, and store the result into D."
APL treats the assignment as what it is--an operation. Why the
right-to-left (RTL) evaluation of APL was chosen by Iverson is a mystery;
I agree. He was, as far as I know, not a native writer of Hebrew. I
suppose we should be grateful that he didn't specify APL as a
boustrophedon.
We already have Forth.
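For anybody who hasn't seen APL, a rough sketch of the notation (the
exact glyphs vary a little by implementation):

    X ← 3        ⍝ assignment is an explicit operation, written with an arrow
    3×2+1        ⍝ no operator precedence; strict right-to-left evaluation:
                 ⍝ 2+1 is done first, so this yields 9, not 7

The arrow at least makes the direction of the assignment explicit.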
Therein lies one of the largest problems for first exposure to
programming.
NO! We are NOT talking about "first exposure" as meaning a few years; we
are talking about the first few DAYS (for most students, but up to
weeks/months for some). By the time that you got HERE, you had forgotten
what you struggled with on your first day(s).
They have been brought up with enough math that they "understand" that
X = 3
means that X has a value of 3.
But why not 3 = X?
As Chuck said, it is the fundamental difference between equality
(property) and assignment (operation).
To a programming language, 3 = X is a request to take the current value of
X, and assign that as the NEW value of the "constant" 3. And the compiler
won't cooperate!
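In BASIC terms (a sketch; the exact error message varies by interpreter):

    X = 3        ' legal: store the value 3 into X
    3 = X        ' rejected ("Syntax error"): the constant 3 is not
                 ' a valid target for an assignment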
Early versions of BASIC had a keyword "LET". LET X = 3 is devoid of
most of the ambiguity, and LET 3 = X is much less likely to be attempted.
'Course, changing the values of constants opens up some strange
possibilities!
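(The canonical example of that, as I remember the lore: some early
FORTRAN compilers passed constants by reference, so a subroutine could
overwrite the stored value of a literal. A hypothetical sketch of the
pitfall:

          SUBROUTINE BUMP(N)
    C     N ARRIVES AS A REFERENCE TO THE CALLER'S ARGUMENT
          N = N + 1
          RETURN
          END

          CALL BUMP(3)
    C     ON SUCH A COMPILER, BUMP HAS JUST INCREMENTED THE STORED
    C     CONSTANT 3, SO LATER USES OF THE LITERAL 3 IN THIS PROGRAM
    C     COULD QUIETLY BEHAVE AS 4.

A modern compiler will refuse, or copy the constant, but the strange
possibilities were quite real.)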
A few other languages have used different symbols for assignment, to
avoid the confusion with "equality" (ALGOL and Pascal use ":=", APL uses
a left-pointing arrow). Even without OTHER changes in parsing arithmetic
expressions, which may or may not be warranted, just replacing the '='
used for assignment with an arrow ELIMINATED that particular confusion.
Well, mostly. You can't use a right-pointing arrow to fix 3 = X.
I spent years introducing students to their first contact with computers.
MY first was FORTRAN. My father handled the statistics for the "CBS
National Driver's Test". IBM provided the computers, the port-a-punch
cards for mail-in responses (printed with a box for where to put a
stamp!), and the programming. One of the early percentages in the LIVE
show didn't add up to 100! If you look over Walter Cronkite's shoulder,
you can see my father frantically manually recalculating the numbers.
The next morning, my father had the dining room table covered with
manuals, and we started to try to learn FORTRAN. IIRC, over a short time,
we settled on the books by McCracken and Decima Anderson.
BTW, I consider BASIC to be very good for "FIRST EXPOSURE", IFF it is
limited to that, and other languages are soon introduced.
I don't agree with Dijkstra:
"It is practically impossible to teach good programming to students that
have had a prior exposure to BASIC: as potential programmers they are
mentally mutilated beyond hope of regeneration." -Edsger Dijkstra
It takes a substantial OVER-exposure to cause the damage.
--
Grumpy Ol' Fred     cisin at xenosoft.com