On 26 April 2016 at 16:41, Liam Proven <lproven at gmail.com> wrote:
> Swift, you have provided a superb example of this mockery. And now
> you've been called on it, you are, in natural human fashion, lashing
> out in return.
>
> It's natural, it's human, and it's exactly why we have the stinking
> pile of crap that we do today instead of tools that actually work.
I wish to apologise for this. It was unjustified and unfair, and
unjustly ad-hom as well. I am getting slightly better at controlling
my "FLAME ON" moments, but much more work is required. :-(
I was not saying that Mr Griggs here is the reason for any of this --
that's absurd. I did imply it, though, and I shouldn't have. I'm
sorry.
My contention is that a large part of the reason that we have the
crappy computers that we do today -- lowest-common-denominator boxes,
mostly powered by one of the kludgiest and most inelegant CPU
architectures of the last 40 years -- is not technical, nor even
primarily commercial or due to business pressures, but rather, it's
cultural.
When I was playing with home micros (mainly Sinclair and Amstrad; the
American stuff was just too expensive for Brits in the early-to-mid
1980s), the culture was that Real Men programmed in assembler and the
main battle was Z80 versus 6502, with a few weirdos saying that 6809
was better than either. BASIC was the language for beginners, and a
few weirdos maintained that Forth was better.
At university, I used a VAXcluster and learned to program in
Fortran-77. The labs had Acorn BBC Micros in them -- solid machines,
*the* best 8-bit BASIC ever, and they could interface both with lab
equipment over IEEE-488 and with generic printers and so on over
Centronics parallel, plus an RS-423 serial port that could talk to
RS-232 kit.
As I discovered when I moved into the professional field a few years
later (1988), this wasn't that different from the pro stuff. A lot of
apps were written in various BASICs, and in the old era of proprietary
OSes on proprietary kit, for performance, you used assembler.
But a new wave was coming. MS-DOS was already huge and the Mac was
growing strongly. Windows was on v2 and was a toy, but Unix was coming
to mainstream kit, or at least affordable kit. You could run Unix on
PCs (e.g. SCO Xenix), on Macs (A/UX), and my employers had a demo IBM
RT-6150 running AIX 1.
Unix wasn't only the domain (pun intentional) of expensive kit priced
in the tens of thousands.
A new belief started to spread: that if you used C, you could get
near-assembler performance without the pain, and the code could be
ported between machines. DOS and Mac apps started to be written (or
rewritten) in C, and some were even ported to Xenix. In my world,
nobody used stuff like A/UX or AIX, and Xenix was specialised. I was
aware of Coherent as the only "affordable" Unix, but I never saw a
copy or saw it running.
So this second culture of C code running on non-Unix OSes appeared.
Then the OSes started to scramble to catch up with Unix -- first OS/2,
then Windows 3, then, for a decade, the parallel universe of Windows
NT, until XP became established and Win9x finally died. Meanwhile,
Apple and IBM flailed around until IBM surrendered and Apple merged
with NeXT and switched to NeXTstep.
Now, Windows is evolving to be more and more Unix-like, with GUI-less
versions, clean(ish) separation between GUI and console apps, a new
rich programmable shell, and so on.
The Mac, meanwhile, is now a Unix box, albeit a weird one.
Commercial Unix continues to wither away. OpenVMS might make a modest
comeback. IBM mainframes seem to be thriving; every other kind of big
iron is now emulated on x86 kit, as far as I can tell. IBM has
successfully killed off several efforts to do this for z Series.
So now, it's Unix except for the single remaining mainstream
proprietary system: Windows. Unix today means Linux, while the
weirdos use FreeBSD. Everything else seems to be more or less a
rounding error.
C always was like carrying water in a sieve, so now, we have multiple
C derivatives, trying to patch the holes. C++ has grown up but it's
like Ada now: so huge that nobody understands it all, but actually, a
fairly usable tool.
There's the kinda-sorta FOSS "safe C++ in a VM", Java. The proprietary
kinda-sorta "safe C++ in a VM", C#. There's the not-remotely-safe
kinda-sorta C in a web browser, JavaScript.
And dozens of others, of course.
Even the safer ones run on a basis of C -- so the lovely cuddly
friendly Python, that everyone loves, has C's weird printf-style
formatting semantics to mess up the heads of beginners.
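A tiny sketch of what I mean, in Python 3 (my own illustration, with
made-up example values -- not anything from the Python docs):

    # The format codes below are lifted straight from C's printf().
    print("%d of %d" % (3, 10))       # prints: 3 of 10
    print("%05.2f" % 3.14159)         # prints: 03.14 -- C's flag/width/precision

    # The classic beginner trap: % treats a tuple on its right-hand
    # side as the argument list, so a tuple value needs wrapping.
    point = (1, 2)
    # print("point is %s" % point)    # TypeError: not all arguments converted
    print("point is %s" % (point,))   # prints: point is (1, 2)

That operator, warts and all, is pure C showing through the friendly
surface.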
Perl abandoned its base and planned a move onto a new VM; the VM went
wrong; and now, on a replacement VM, to general amazement and equally
general lack of interest, Perl 6 is finally here.
All the others are still implemented in C, mostly on a Unix base, like
Ruby, or on a JVM base, like Clojure and Scala.
So they still have C-like holes, and there are frequent patches and
updates to try to make them retain some water for a short time, while
the "cyber criminals" make hundreds of millions.
Anything else is "uncommercial" or "not viable for real world use".
Borland totally dropped the ball and lost a nice little earner in
Delphi, but the language lives on as Free Pascal and so on.
Apple goes its own way, but has forgotten the truly innovative
projects it had pre-NeXT, such as Dylan.
There were real projects that were actually used for real work, like
Oberon the OS, written in Oberon the language. Real pioneering work in
UIs, such as Jef Raskin's machines, the original Mac and Canon Cat --
forgotten. People rhapsodise over the Amiga and forget that the
planned OS, CAOS, to be as radical as the hardware, never made it out
of the lab. Same, on a smaller scale, with the Acorn Archimedes.
Despite that, of course, Lisp never went away. People still use it,
but they keep their heads down and get on with it.
Much the same applies to Smalltalk. Still there, still in use, still
making real money and doing real work, but forgotten all the same.
The Lisp Machines and Smalltalk boxes lost the workstation war. Unix
won, and as history is written by the victors, now the alternatives
are forgotten or dismissed as weird kooky toys of no serious merit.
The senior Apple people didn't understand the essence of what they saw
at PARC: they only saw the chrome. They copied the chrome, not the
essence, and now all that *any* of us have is the chrome. We have
GUIs, but on top of the nasty kludgy hacks of C and the like. A
late-'60s skunkworks project now runs the world, and the real serious
research efforts to make something better, both before and after, are
forgotten historical footnotes.
Modern computers are a vast disappointment to me. We have no thinking
machines. The Fifth Generation, Lisp, all that -- gone.
What did we get instead?
Like dinosaurs, the expensive high-end machines of the '70s and '80s
didn't evolve into their successors. They were just replaced. First
little cheapo 8-bits, not real or serious at all, although people did
serious stuff with them because they were all they could afford. The
early 8-bits ran semi-serious OSes such as CP/M, but
when their descendants sold a thousand times more, those descendants
weren't running descendants of that OS -- no, it and its creator died.
CP/M evolved into a multiuser multitasking 386 OS that could run
multiple MS-DOS apps on terminals, but it died.
No, then the cheapo 8-bits thrived in the form of an 8/16-bit hybrid,
the 8086 and 8088, and a cheapo knock-off of CP/M.
This got a redesign into something grown-up: OS/2.
Predictably, that died.
So the hacked-together GUI for DOS got re-invigorated with an
injection of OS/2 code, as Windows 3. That took over the world.
The rivals - the Amiga, ST, etc? 680x0 chips, lots of flat memory,
whizzy graphics and sound? All dead.
Then Windows got re-invented with some OS/2 3 ideas and code, and some
from VMS, and we got Windows NT.
But the marketing men got to it and ruined its security and elegance,
to produce the lipstick-and-high-heels Windows XP. That version,
insecure and flaky, with its terrible bodged-in browser, was of
course the one that sold.
Linux got nowhere until it copied the XP model. The days of small
programs, everything's a text file, etc. -- all forgotten. Nope,
lumbering GUI apps, CORBA and RPC and other weird plumbing, huge
complex systems -- but it looks and works kinda like Windows and the
Mac now, so people use it.
Android looks kinda like iOS and people use it in their billions.
Newton? Forgotten. No, people have Unix in their pocket, only it's a
bloated successor of Unix.
The efforts to fix and improve Unix -- Plan 9, Inferno -- forgotten. A
proprietary microkernel Unix-like OS for phones -- Blackberry 10,
based on QNX -- not Androidy enough, and bombed.
We have less and less choice, made from worse parts on worse
foundations -- but it's colourful and shiny and the world loves it.
That makes me despair.
We have poor-quality tools, built on poorly-designed OSes, running on
poorly-designed chips. Occasionally, fragments of older better ways,
such as functional-programming tools, or Lisp-based development
environments, are layered on top of them, but while they're useful in
their way, they can't fix the real problems underneath.
Occasionally someone comes along and points this out and shows a
better way -- such as Curtis Yarvin's Urbit: Lisp Machines
re-imagined for the 21st century, layered on top of modern hardware.
But nobody gets it, and its programmer has some unpleasant and
unpalatable ideas, so it's doomed.
And the kids who grew up after C won the battle deride the former
glories, the near-forgotten brilliance that we have lost.
And it makes me want to cry sometimes, and I lash out in turn.
I apologise unreservedly for my intemperance. I just wanted to try to
explain why I did it.
--
Liam Proven • Profile: http://lproven.livejournal.com/profile
Email: lproven at cix.co.uk • GMail/G+/Twitter/Flickr/Facebook: lproven
MSN: lproven at hotmail.com • Skype/AIM/Yahoo/LinkedIn: liamproven
Cell/Mobiles: +44 7939-087884 (UK) • +420 702 829 053 (ČR)