On 2025-01-13 3:10 p.m., Martin Bishop via cctalk wrote:
> Ben
> I shall simply comment that, for a useless language, it did a lot of
> good signal-processing work for me all but 50 years ago.
I never said it was bad, just that it kept being delayed by politics.
> As an example, you could define matrix operators with parametric
> dimensions and write complex matrix (Riccati) equations "naturally".
> The code was easy to verify, if not efficient on sparse matrices, but
> it provided a gold standard against which to commission the efficient
> code and validate its computations.
> Even now, "languages" that can return dynamically dimensioned results
> on the heap are a bit thin on the ground, and Algol68R took care of the
> garbage collection and had better fault dumps than any I have seen since.
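Something like this in Python is roughly what you are describing, I
think; the class, the names, and the cut-down Riccati step (gain term
left out) are all mine:

    # A toy matrix with run-time (parametric) dimensions and overloaded
    # operators, so a covariance update reads much as it does on paper.
    class Matrix:
        def __init__(self, rows):
            self.rows = [list(r) for r in rows]
            self.m, self.n = len(self.rows), len(self.rows[0])

        def __add__(self, other):
            return Matrix([[a + b for a, b in zip(ra, rb)]
                           for ra, rb in zip(self.rows, other.rows)])

        def __matmul__(self, other):
            return Matrix([[sum(self.rows[i][k] * other.rows[k][j]
                                for k in range(self.n))
                            for j in range(other.n)]
                           for i in range(self.m)])

        def T(self):
            return Matrix([[self.rows[i][j] for i in range(self.m)]
                           for j in range(self.n)])

    # Prediction half of a Kalman/Riccati covariance recursion; every
    # result is a freshly allocated matrix sized by its operands.
    def predict(P, A, Q):
        return A @ P @ A.T() + Q

Each call hands back a brand-new, dynamically sized result on the heap,
which is still the part most compiled languages make awkward.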
But all this stuff still uses the old 8-, 16-, 32+-bit data sizes.
Only ALGOL for the DG NOVA computers (that I know of) let one define
variable-sized data. You would think by now one could define real
abstract data sizes and types: unsigned int: 4091 bits foobar, decimal
float 48,2 big-money.
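Mind you, you can fake both in Python today, even if you cannot declare
them; I am reading "48,2" as 48 digits with 2 after the point, and the
variable names are just stand-ins for the examples above:

    from decimal import Decimal, getcontext

    # "unsigned int: 4091 bits" emulated with Python's unbounded int plus
    # an explicit mask, so values wrap like fixed-width hardware would.
    WIDTH = 4091
    MASK = (1 << WIDTH) - 1
    foobar = ((1 << 4090) + 12345) & MASK

    # "decimal float 48,2": 48 significant decimal digits, quantized to
    # two places after the point, via the standard decimal module.
    getcontext().prec = 48
    big_money = Decimal("12345678901234567890.12").quantize(Decimal("0.01"))

    print(foobar.bit_length(), big_money)

But that is emulation in a library, not a size and type the compiler
checks for you, which I think is the point.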
> And the syntax was defined in BNF - just like Ada and VHDL.
More standards I don't like.
As a code monkey, give me a few bananas for a page of machine code and I
am happy. All this modern stuff is too complex for me to figure out what
is really compiled.
I don't agree with computer science in that most things are too ABSTRACT,
like parsing a program. Take the ';': it is really the end of a statement,
none of this end-of-line stuff, simply because blanks are stripped and you
can't delimit tokens otherwise.
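A toy scanner shows what I mean; the source snippet and the blank-only
token rule below are made up by me:

    # Toy scanner: blanks and newlines are thrown away, so only ';' can
    # mark where a statement ends, and only blanks delimit the tokens.
    def statements(source):
        stmts, tok, cur = [], "", []
        for ch in source:
            if ch.isspace():
                if tok:
                    cur.append(tok)
                    tok = ""
            elif ch == ";":
                if tok:
                    cur.append(tok)
                stmts.append(cur)
                tok, cur = "", []
            else:
                tok += ch
        return stmts

    print(statements("a := 1;\nb :=\n  a + 2;"))
    # [['a', ':=', '1'], ['b', ':=', 'a', '+', '2']]

The newline in the middle of the second statement changes nothing; the
';' is what ends it.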
I have yet to see a recursive operating system, other than LISP in terms
of atomic objects, something not seen in computer science books.
Nor have I been able to afford an ALGOL programming book.
Who needs BNF? We have FORTRAN II :)
> Martin