strangest systems I've sent email from

Liam Proven lproven at gmail.com
Thu Apr 28 08:59:53 CDT 2016


On 27 April 2016 at 22:13, Sean Conner <spc at conman.org> wrote:
> It was thus said that the Great Liam Proven once stated:
>> On 26 April 2016 at 16:41, Liam Proven <lproven at gmail.com> wrote:
>>
>> When I was playing with home micros (mainly Sinclair and Amstrad; the
>> American stuff was just too expensive for Brits in the early-to-mid
>> 1980s), the culture was that Real Men programmed in assembler and the
>> main battle was Z80 versus 6502, with a few weirdos saying that 6809
>> was better than either. BASIC was the language for beginners, and a
>> few weirdos maintained that Forth was better.
>
>   The 6809 *is* better than either the Z80 or the 6502 (yes, I'm one of
> *those* 8-)

Hurrah! :-D

>> So now, it's Unix except for the single remaining mainstream
>> proprietary system: Windows. Unix today means Linux, while the
>> weirdos use FreeBSD. Everything else seems to be more or less a
>> rounding error.
>
>   There are still VxWorks and QNX in embedded systems (I think both are now
> flying through space on various probes) so it's not quite a monoculture.
> But yes, the desktop does have severe monocultures.

True. I own a Blackberry Passport, a lovely QNX device.

But it's for sale. I have a new cheap Chinese Android phone that does
much, much more. :-(

It's all about the apps...

>> C always was like carrying water in a sieve, so now, we have multiple
>> C derivatives, trying to patch the holes.
>
>   Citation needed.  C derivatives?  The only one I'm aware of is C++ and
> that's a long way from C nowadays (and no, using curly braces does not make
> something a C derivative).

Directly? Objective-C, D.

Indirectly? Well, almost anything written in it. Your curly-braces
point is true, but the influence is, I feel, quite pervasive.

>> C++ has grown up but it's
>> like Ada now: so huge that nobody understands it all, but actually, a
>> fairly usable tool.
>>
>> There's the kinda-sorta FOSS "safe C++ in a VM", Java. The proprietary
>> kinda-sorta "safe C++ in a VM", C#. There's the not-remotely-safe
>> kinda-sorta C in a web browser, Javascript.
>
>   They may be implemented in C, but they're all a far cry from C (unless
> you mean they're imperative languages, then yes, they're "like" C in that
> regard).

This point is getting made to me on FB as well. I think there is a
clear /influence/ and some C-isms -- direct memory allocation, pointer
manipulation and so on -- are widespread /because/ of the C family
influence. And I have a deep suspicion that these are harmful things.
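
To make that concrete, here is the sort of hole I mean. A minimal
sketch, not anyone's production code: C will cheerfully copy past the
end of a buffer without a word of complaint.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char name[8];

        /* strcpy() copies until it finds a NUL terminator, however
           far away that is; nothing checks whether the destination
           can hold it.  This overruns name[] -- undefined behaviour,
           and the raw material of classic buffer-overflow exploits. */
        strcpy(name, "far too long for eight bytes");
        printf("%s\n", name);
        return 0;
    }

Every language that inherits raw pointers and manual memory management
inherits this whole class of bug along with them.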

>> And dozens of others, of course.
>
>   Rust is now written in Rust.  Go is now written in Go.  Same with D.
> There are modern alternatives to C.  And if the community is anything to go
> by, there is a slowly growing contingent of programmers that would outlaw the
> use of C (punishable by death).

Do you really think it's growing? I'd like very much to believe that.
I see little sign of it. I do hope you're right.


>> So they still have C-like holes and there are frequent patches and
>> updates to try to make them able to retain some water for a short
>> time, while the "cyber criminals" make hundreds of millions.
>
>   I seriously think outlawing C will not fix the problems, but I think I'm
> in the minority on that feeling.

We would, of course, merely get different problems instead. ;-)

>> Anything else is "uncommercial" or "not viable for real world use".
>>
>> Borland totally dropped the ball and lost a nice little earner in
>> Delphi, but it continues as Free Pascal and so on.
>>
>> Apple goes its own way, but has forgotten the truly innovative
>> projects it had pre-NeXT, such as Dylan.
>>
>> There were real projects that were actually used for real work, like
>> Oberon the OS, written in Oberon the language. Real pioneering work in
>> UIs, such as Jef Raskin's machines, the original Mac and Canon Cat --
>> forgotten. People rhapsodise over the Amiga and forget that the
>> planned OS, CAOS, to be as radical as the hardware, never made it out
>> of the lab. Same, on a smaller scale, with the Acorn Archimedes.
>
>   While the Canon Cat was innovative, perhaps it was too early.  We were
> still in the era of general purpose computers and the idea of an
> "information appliance" was still in its infancy and perhaps, not an idea
> people were willing to deal with at the time.  Also, how easy was it to get
> data *out* of the Canon Cat?  (now that I think about it---it came with a
> disk drive, so in theory, possible)  You could word process, do some
> calculations, simple programming ... but no Solitaire.

True.

>   As for CAOS, I haven't heard of it (and yes, I did the Amiga thing in the
> early 90s).  What was unique about it?  And as much as I loved the Amiga,
> the GUI API (at least 1.x version) was very tied to the hardware and the OS
> was very much uniprocessor in design.

There's not a lot about it out there, but there's some.

http://amigaworld.net/modules/newbb/viewtopic.php?topic_id=35526&forum=32&14

>> Despite that, of course, Lisp never went away. People still use it,
>> but they keep their heads down and get on with it.
>>
>> Much the same applies to Smalltalk. Still there, still in use, still
>> making real money and doing real work, but forgotten all the same.
>>
>> The Lisp Machines and Smalltalk boxes lost the workstation war. Unix
>> won, and as history is written by the victors, now the alternatives
>> are forgotten or dismissed as weird kooky toys of no serious merit.
>>
>> The senior Apple people didn't understand the essence of what they saw
>> at PARC: they only saw the chrome.
>
>   To be fair, *everybody* missed the essence of what they did at PARC; even
> Alan Kay wasn't of much help ("I meant message passing, *not* objects!"
> "Then why didn't you say so earlier?" "And Smalltalk was *never* meant to be
> standardized!  It was meant to be replaced with something else every six
> months!" "Uh, Alan, that's *not* how industry works.").

True, true. :-(

>   The problem with the Lisp machines (from what I can see) is not that they
> were bad, but they were too expensive and the cheap Unix workstations
> overtook them in performance.  Had Symbolics and LMI moved their software to
> commodity hardware they might have survived (in this, Bill Gates was
> right---it was software, not hardware, where the money was).

Symbolics ultimately did, of course -- the last iterations of Genera
ran on Tru64 on Alphas, but under an emulator.

>   Smalltalk has other issues.  In the 80s, there were not many machines
> capable of running Smalltalk (I'm not aware of any implementation on micros,
> serious or not) and by the time workstations were becoming cheap enough,
> the Smalltalk vendors were charging ridiculous amounts of money for
> Smalltalk.  The other issue is the image---how do you "distribute" software
> written in Smalltalk?  The application *is* the Smalltalk image, so you
> have to buy into the whole ecosystem (perhaps in the 90s---today, probably
> less so).

Both are serious issues, yes.

>> They copied the chrome, not the
>> essence, and now all that *any* of us have is the chrome. We have
>> GUIs, but on top of the nasty kludgy hacks of C and the like. A
>> late-'60s skunkworks project now runs the world, and the real serious
>> research efforts to make something better, both before and after, are
>> forgotten historical footnotes.
>
>   The essence being (LISP|Smalltalk) all the way from top to bottom?

Certainly part of it, yes.

>> Modern computers are a vast disappointment to me. We have no thinking
>> machines. The Fifth Generation, Lisp, all that -- gone.
>>
>> What did we get instead?
>>
>> Like dinosaurs, the expensive high-end machines of the '70s and '80s
>> didn't evolve into their successors. They were just replaced. First
>> little cheapo 8-bits, not real or serious at all, although they were
>> cheap and people did serious stuff with them because it's all they
>> could afford. The early 8-bits ran semi-serious OSes such as CP/M, but
>> when their descendants sold a thousand times more, those descendants
>> weren't running descendants of that OS -- no, it and its creator died.
>
>   I beg to differ.  MS-DOS *was* CP/M (or rather, CP/M on the 8086).  I've
> studied CP/M and have had to implement a portion of MS-DOS for a one-off
> project of mine [1] that used the CP/M era system calls of MS-DOS (MS-DOS
> later got more Unix-like system calls, which are actually nicer to use than
> the CP/M versions).

Well, yes, OK, but I directly addressed that in the next paragraph or
so of the text you quoted!

MS-DOS was QDOS, and QDOS was a CP/M knock-off.
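
You can still see the lineage in the system-call interface. A minimal
sketch -- assuming a 16-bit DOS compiler with the Borland/Microsoft
style <dos.h>, a small data model, and a made-up filename --
contrasting the CP/M-era FCB open with the later Unix-like handle
open:

    #include <dos.h>      /* union REGS, intdos() */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        union REGS r;
        static unsigned char fcb[37];       /* File Control Block */

        /* CP/M-style (INT 21h, AH=0Fh): no paths, no handles.  The
           caller builds an FCB with a drive byte and a space-padded
           8.3 name, and does its own record bookkeeping afterwards. */
        memset(fcb, 0, sizeof fcb);
        fcb[0] = 0;                         /* 0 = default drive */
        memcpy(&fcb[1], "FOO     TXT", 11); /* "FOO.TXT", padded */
        r.h.ah = 0x0F;
        r.x.dx = (unsigned)fcb;             /* DS:DX -> FCB */
        intdos(&r, &r);
        printf("FCB open: %s\n", r.h.al == 0 ? "ok" : "failed");

        /* Unix-like (DOS 2.0+, AH=3Dh): an ASCIIZ path in, a file
           handle out; read/write/close all take the handle -- much
           closer to open()/read()/close(). */
        r.h.ah = 0x3D;
        r.h.al = 0;                         /* read-only */
        r.x.dx = (unsigned)"foo.txt";       /* DS:DX -> path */
        intdos(&r, &r);
        if (!r.x.cflag)
            printf("handle open: fd = %u\n", r.x.ax);
        return 0;
    }

The first half is pure CP/M with the serial numbers filed off; the
second is the later, rather nicer, Unix-flavoured layer you mention.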

>> CP/M evolved into a multiuser multitasking 386 OS that could run
>> multiple MS-DOS apps on terminals, but it died.
>>
>> No, then the cheapo 8-bits thrived in the form of an 8/16-bit hybrid,
>> the 8086 and 8088, and a cheapo knock-off of CP/M.
>>
>> This got a redesign into something grown-up: OS/2.
>>
>> Predictably, that died.
>
>   Because Microsoft and IBM had different goals.  Microsoft wanted a 32-bit
> version initially; IBM did not want to eat into their mini and mainframe
> business, and IBM had the clout to push their agenda and thus, OS/2 1.x for
> the 80286.  The partnership soured in 1990; Microsoft went with NT and IBM
> took over OS/2, but didn't have the marketing clout (or savviness) of
> Microsoft.

Absolutely, yes.

>> So the hacked-together GUI for DOS got re-invigorated with an
>> injection of OS/2 code, as Windows 3. That took over the world.
>>
>> The rivals - the Amiga, ST, etc? 680x0 chips, lots of flat memory,
>> whizzy graphics and sound? All dead.
>
>   The 68k series died because Motorola could not compete with Intel.
> Motorola did more than just chips (radios, cell phones) whereas Intel did
> nothing *but* chips, and Microsoft pushing computers into every home (a
> definite goal of Bill Gates, but of course with all of them running Microsoft
> software) gave Intel a significant bankroll to make the x86 line performant
> (and no one really wanted the Intel RISC chips; at least, not to the degree
> needed to make continued development profitable).

An interesting analysis. Not heard that argument before. Thanks!

>> Then Windows got re-invented with some OS/2 3 ideas and code, and some
>> from VMS, and we got Windows NT.
>>
>> But the marketing men got to it and ruined its security and elegance,
>> to produce the lipstick-and-high-heels Windows XP. That version,
>> insecure and flakey with its terrible bodged-in browser, that, of
>> course, was the one that sold.
>
>         “Consistent mediocrity, delivered on a large scale, is much more
>         profitable than anything on a small scale, no matter how efficient
>         it might be.”

:'(

>> Linux got nowhere until it copied the XP model. The days of small
>> programs, everything's a text file, etc. -- all forgotten. Nope,
>> lumbering GUI apps, CORBA and RPC and other weird plumbing, huge
>> complex systems, but it looks and works kinda like Windows and a Mac
>> now so it looks like them and people use it.
>
>   CORBA was dead by the mid-90s and had nothing (that I know of) to do with
> Linux.  And the lumbering GUI apps, RPC, etc. that you are complaining about
> are userland stuff---nothing to do with the Linux kernel (okay, perhaps
> I'm nitpicking here).

GNOME 1 was heavily based on CORBA. (I believe -- but am not sure --
that later versions discarded much of it.) KDE reinvented that
particular wheel.

>> Android looks kinda like iOS and people use it in their billions.
>> Newton? Forgotten. No, people have Unix in their pocket, only it's a
>> bloated successor of Unix.
>>
>> The efforts to fix and improve Unix -- Plan 9, Inferno -- forgotten. A
>> proprietary microkernel Unix-like OS for phones -- Blackberry 10,
>> based on QNX -- not Androidy enough, and bombed.
>>
>> We have less and less choice, made from worse parts on worse
>> foundations -- but it's colourful and shiny and the world loves it.
>
>   Look at any industry and the pattern repeats.

I daresay you're right, but some seem to be improving.

I remember a comment from my non-techie neighbour back in the UK... I
owned the right-hand half of the semi-detached house for 12 years, he the
left for 24. He commented that he used to manoeuvre his car, with
difficulty, into the too-small garage "back when cars went rusty", but
he no longer bothered.

I don't like cars. I'm more into motorbikes. I didn't know car tech
had moved on like that. Cars don't rust any more? So I did some
digging, and yes, it appears to be true. Yes, they're more closed,
harder for the owner to maintain -- but C21 cars need much less
maintenance and do not corrode any more.

Motorbikes, meanwhile, are stunningly lightweight and yet powerful by
the standards of my own youth in the 1980s, when they'd not moved on
much from the '60s.

Bicycles have also undergone amazing changes this century -- wireless
electronic self-calibrating gear shifting, hydraulic disc brakes,
suspension as standard, and frames so remarkably lightweight that
racing machines need to be ballasted to meet ancient, obsolete weight
requirements.
This even permits power-assist to be concealed entirely within the
standard frame of racing bikes now -- motor, gearing, batteries, the
lot!

Computers? They run cooler now than a decade ago. Use less
electricity. Fancy ones have better 3D. Some have SSDs. That's about
it.

Compared to a decade before that? Better but more restrictive
firmware. Slimmer cabling, faster buses. More cores.

Compared to a decade before that? Now the OSes are more solid and
reliable. They can do video and 3D with less work now, even within a
GUI. The ports are smaller, simpler, more robust. The internal
interconnects have changed and the OSes now have proper 32-bit
kernels.

Actual functionality hasn't vastly changed since the mid-90s; it's
just got better.

The mid-90s PC merely managed to reproduce the GUIs, multitasking and
sound/colour support of mid-80s proprietary systems, on the COTS PC
platform.

I'd argue the last big change was the Mac and GUIs, just over 30 years ago.

And I reiterate:

>> That makes me despair.
>>
>> We have poor-quality tools, built on poorly-designed OSes, running on
>> poorly-designed chips. Occasionally, fragments of older better ways,
>> such as functional-programming tools, or Lisp-based development
>> environments, are layered on top of them, but while they're useful in
>> their way, they can't fix the real problems underneath.
>>
>> Occasionally someone comes along and points this out and shows a
>> better way -- such as Curtis Yarvin's Urbit.
>
>   I'm still not convinced Curtis isn't trolling with Urbit.  Like Alan Kay,
> he's not saying anything, expecting us to figure out what he means (and then
> yell at us for failing to successfully read his mind).

Oh no, he has built something amazing, and better still, he has a plan
and a justification for it. I fear it's just too /different/ for most
people, just like functional programming is.

>   -spc (Have dealt with my share of "geniuses" yelling at me for failing to
>         read their mind ... )

I can relate to that.

> [1]     https://github.com/spc476/NaNoGenMo-2015
>         https://github.com/dariusk/NaNoGenMo-2015/issues/184
>



-- 
Liam Proven • Profile: http://lproven.livejournal.com/profile
Email: lproven at cix.co.uk • GMail/G+/Twitter/Flickr/Facebook: lproven
MSN: lproven at hotmail.com • Skype/AIM/Yahoo/LinkedIn: liamproven
Cell/Mobiles: +44 7939-087884 (UK) • +420 702 829 053 (ČR)

