It was thus said that the Great Liam Proven once stated:
On 26 April 2016 at 16:41, Liam Proven <lproven at gmail.com> wrote:
When I was playing with home micros (mainly Sinclair and Amstrad; the
American stuff was just too expensive for Brits in the early-to-mid
1980s), the culture was that Real Men programmed in assembler and the
main battle was Z80 versus 6502, with a few weirdos saying that 6809
was better than either. BASIC was the language for beginners, and a
few weirdos maintained that Forth was better.
The 6809 *is* better than either the Z80 or the 6502 (yes, I'm one of
*those* 8-)
So now, it's Unix except for the single remaining mainstream proprietary
system: Windows. Unix today means Linux, while the weirdos use FreeBSD.
Everything else seems to be more or less a rounding error.
There are still VxWorks and QNX in embedded systems (I think both are now
flying through space on various probes) so it's not quite a monoculture.
But yes, the desktop does have severe monocultures.
C always was like carrying water in a sieve, so now, we have multiple C
derivatives, trying to patch the holes.
Citation needed. C derivatives? The only one I'm aware of is C++ and
that's a far way from C nowadays (and no, using curly braces does not make
something a C derivative).
C++ has grown up but it's like Ada now: so huge that nobody understands
it all, but actually, a fairly usable tool.
There's the kinda-sorta FOSS "safe C++ in a VM", Java. The proprietary
kinda-sorta "safe C++ in a VM", C#. There's the not-remotely-safe
kinda-sorta C in a web browser, Javascript.
They may be implemented in C, but they're all a far cry from C (unless
you mean they're imperative languages, then yes, they're "like" C in that
regard).
And dozens of others, of course.
Rust is now written in Rust. Go is now written in Go. Same with D.
There are modern alternatives to C. And if the community is anything to
go by, there is a slowly growing contingent of programmers that would
outlaw the use of C (punishable by death).
So they still have C-like holes and there are frequent patches and
updates to try to make them able to retain some water for a short
time, while the "cyber criminals" make hundreds of millions.
I seriously think outlawing C will not fix the problems, but I think I'm
in the minority on that feeling.
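For what it's worth, the kind of hole being argued about here is mostly
the classic unchecked buffer write. A minimal C sketch (these two
functions are purely illustrative, not anything from the thread):

  #include <stdio.h>
  #include <string.h>

  /* Writes past the end of buf whenever input is 16 chars or longer;
     the compiler accepts it without complaint. */
  void unsafe_copy(const char *input)
  {
    char buf[16];
    strcpy(buf, input);                    /* no bounds check at all */
    puts(buf);
  }

  /* The length-checked version the standard library already offers. */
  void safer_copy(const char *input)
  {
    char buf[16];
    strncpy(buf, input, sizeof(buf) - 1);  /* copy at most 15 chars */
    buf[sizeof(buf) - 1] = '\0';           /* strncpy may not terminate */
    puts(buf);
  }

Nothing in the language forces the second form over the first, which is
exactly the sieve being complained about---and which the managed languages
above try to paper over with runtime checks.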
Anything else is "uncommercial" or "not viable for real world use".
Borland totally dropped the ball and lost a nice little earner in
Delphi, but it continues as Free Pascal and so on.
Apple goes its own way, but has forgotten the truly innovative
projects it had pre-NeXT, such as Dylan.
There were real projects that were actually used for real work, like
Oberon the OS, written in Oberon the language. Real pioneering work in
UIs, such as Jef Raskin's machines, the original Mac and Canon Cat --
forgotten. People rhapsodise over the Amiga and forget that the
planned OS, CAOS, to be as radical as the hardware, never made it out
of the lab. Same, on a smaller scale, with the Acorn Archimedes.
While the Canon Cat was innovative, perhaps it was too early. We were
still in the era of general purpose computers and the idea of an
"information appliance" was still in its infancy and perhaps, not an idea
people were willing to deal with at the time. Also, how easy was it to get
data *out* of the Canon Cat? (now that I think about it---it came with a
disk drive, so in theory, possible) You could word process, do some
calculations, simple programming ... but no Solitaire.
As for CAOS, I haven't heard of it (and yes, I did the Amiga thing in the
early 90s). What was unique about it? And as much as I loved the Amiga,
the GUI API (at least 1.x version) was very tied to the hardware and the OS
was very much uniprocessor in design.
Despite that, of course, Lisp never went away. People still use it,
but they keep their heads down and get on with it.
Much the same applies to Smalltalk. Still there, still in use, still
making real money and doing real work, but forgotten all the same.
The Lisp Machines and Smalltalk boxes lost the workstation war. Unix
won, and as history is written by the victors, now the alternatives
are forgotten or dismissed as weird kooky toys of no serious merit.
The senior Apple people didn't understand the essence of what they saw
at PARC: they only saw the chrome.
To be fair, *everybody* missed the essence of what they did at PARC; even
Alan Kay wasn't of much help ("I meant message passing, *not* objects!"
"Then why didn't you say so earlier?" "And Smalltalk was *never* meant
to be
standardized! It was meant to be replaced with something else every six
months!" "Uh, Alan, that's *not* how industry works.").
The problem with the Lisp machines (from what I can see) is not that they
were bad, but they were too expensive and the cheap Unix workstations
overtook them in performance. Had Symbolics and LMI moved their software to
commodity hardware they might have survived (in this, Bill Gates was
right---it was software, not hardware, where the money was).
Smalltalk has other issues. In the 80s, there were not many machines
capable of running Smalltalk (I'm not aware of any implementation on micros,
serious or not) and by the time workstations were becoming cheap enough,
the Smalltalk vendors were charging ridiculous amounts of money for
Smalltalk. The other issue is the image---how do you "distribute" software
written in Smalltalk? The application *is* the Smalltalk image, so you
have to buy into the whole ecosystem (perhaps in the 90s---today, probably
less so).
They copied the chrome, not the essence, and now all that *any* of us
have is the chrome. We have GUIs, but on top of the nasty kludgy hacks
of C and the like. A late-'60s skunkware project now runs the world, and
the real serious research efforts to make something better, both before
and after, are forgotten historical footnotes.
The essence being (LISP|Smalltalk) all the way from top to bottom?
Modern computers are a vast disappointment to me. We have no thinking
machines. The Fifth Generation, Lisp, all that -- gone.
What did we get instead?
Like dinosaurs, the expensive high-end machines of the '70s and '80s
didn't evolve into their successors. They were just replaced. First
little cheapo 8-bits, not real or serious at all, although they were
cheap and people did serious stuff with them because it's all they
could afford. The early 8-bits ran semi-serious OSes such as CP/M, but
when their descendants sold a thousand times more, those descendants
weren't running descendants of that OS -- no, it and its creator died.
I beg to differ. MS-DOS *was* CP/M (or rather, CP/M on the 8086). I've
studied CP/M and have had to implement a portion of MS-DOS for a one-off
project of mine [1] that used the CP/M-era system calls of MS-DOS (MS-DOS
later got more Unix-like system calls, which are actually nicer to use
than the CP/M versions).
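For anyone who hasn't poked at this, here's a rough sketch of the two
flavours side by side, assuming a 16-bit DOS compiler such as Turbo C
(which supplies intdos() and FP_OFF() in <dos.h>); the function numbers
are the standard INT 21h ones:

  #include <dos.h>

  /* CP/M-era call: INT 21h/AH=09h prints a '$'-terminated string,
     a direct carry-over of CP/M BDOS function 9. */
  void print_cpm_style(const char *msg)    /* msg must end with '$' */
  {
    union REGS r;
    r.h.ah = 0x09;
    r.x.dx = FP_OFF(msg);                  /* DS:DX -> string (small model) */
    intdos(&r, &r);
  }

  /* Later, Unix-like call: INT 21h/AH=3Dh opens a file by pathname
     and returns a handle, much like open(2). */
  int open_unix_style(const char *path)
  {
    union REGS r;
    r.h.ah = 0x3D;
    r.h.al = 0x00;                         /* read-only */
    r.x.dx = FP_OFF(path);                 /* DS:DX -> ASCIIZ pathname */
    intdos(&r, &r);
    return r.x.cflag ? -1 : r.x.ax;        /* AX = handle on success */
  }

The FCB-based file calls (AH=0Fh and friends) that the handle calls
eventually displaced map almost one-to-one onto CP/M's BDOS, which is a
big part of why early CP/M-to-DOS ports were nearly mechanical.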
CP/M evolved into a multiuser multitasking 386 OS that could run
multiple MS-DOS apps on terminals, but it died.
No, then the cheapo 8-bits thrived in the form of an 8/16-bit hybrid,
the 8086 and 8088, and a cheapo knock-off of CP/M.
This got a redesign into something grown-up: OS/2.
Predictably, that died.
Because Microsoft and IBM had different goals. Microsoft wanted a 32-bit
version initially; IBM did not want to eat into their mini and mainframe
business and IBM had the clout to push their agenda and thus, OS/2 1.x for
the 80286. The partnership soured in 1990, Microsoft went with NT and IBM
took over OS/2 but didn't have the marketing clout (or savviness) of
Microsoft.
So the hacked-together GUI for DOS got re-invigorated with an injection
of OS/2 code, as Windows 3. That took over the world.
The rivals - the Amiga, ST, etc? 680x0 chips, lots of flat memory,
whizzy graphics and sound? All dead.
The 68k series died because Motorola could not compete with Intel.
Motorola did more than just chips (radios, cell phones) whereas Intel did
nothing *but* chips, and Microsoft pushing computers into every home (a
definite goal of Bill Gates, but of course with all of them running
Microsoft software) gave Intel a significant bankroll to make the x86 line
performant (and no one really wanted the Intel RISC chips; at least, not
to the degree to make continued development profitable).
Then Windows got re-invented with some OS/2 3 ideas and code, and some
from VMS, and we got Windows NT.
But the marketing men got to it and ruined its security and elegance,
to produce the lipstick-and-high-heels Windows XP. That version,
insecure and flakey with its terrible bodged-in browser, that, of
course, was the one that sold.
"Consistent mediocrity, delivered on a large scale, is much more
profitable than anything on a small scale, no matter how efficient
it might be."
Linux got nowhere until it copied the XP model. The days of small
programs, everything's a text file, etc. -- all forgotten. Nope,
lumbering GUI apps, CORBA and RPC and other weird plumbing, huge
complex systems, but it looks and works kinda like Windows and a Mac
now so it looks like them and people use it.
CORBA was dead by the mid-90s and had nothing (that I know of) to do with
Linux. And the lumbering GUI apps, RPC, etc. that you are complaining
about are the userland stuff---nothing to do with the Linux kernel (okay,
perhaps I'm nitpicking here).
Android looks kinda like iOS and people use it in their billions.
Newton? Forgotten. No, people have Unix in their pocket, only it's a
bloated successor of Unix.
The efforts to fix and improve Unix -- Plan 9, Inferno -- forgotten. A
proprietary microkernel Unix-like OS for phones -- Blackberry 10,
based on QNX -- not Androidy enough, and bombed.
We have less and less choice, made from worse parts on worse
foundations -- but it's colourful and shiny and the world loves it.
Look at any industry and the pattern repeats.
That makes me despair.
We have poor-quality tools, built on poorly-designed OSes, running on
poorly-designed chips. Occasionally, fragments of older better ways,
such as functional-programming tools, or Lisp-based development
environments, are layered on top of them, but while they're useful in
their way, they can't fix the real problems underneath.
Occasionally someone comes along and points this out and shows a
better way -- such as Curtis Yarvin's Urbit.
I'm still not convinced Curtis isn't trolling with Urbit. Like Alan Kay,
he's not saying anything, expecting us to figure out what he means (and then
yell at us for failing to successfully read his mind).
-spc (Have dealt with my share of "geniuses" yelling at me for failing to
read their mind ... )
[1]
https://github.com/spc476/NaNoGenMo-2015
https://github.com/dariusk/NaNoGenMo-2015/issues/184