On Wed, 27 Apr 2016, Liam Proven wrote:
> I wish to apologise for this. It was unjustified and unfair, and
> unjustly ad-hom as well.
Well, that's mighty big of you, Liam. You are clearly a brilliant guy with
a storied career, bristling with skills I only wish I had. As I read
through your post here, I also got a lot more perspective on why folks get
as irritated as they do when I associate my bumbling college profs with
something like LISP. It's silly of me to associate a language with a group
of people. It's human, but still not very bright of me. LISP certainly has
a lot of smart people advocating for it. It seems to represent a lost
ideal or paradigm to them, and I can see it's nasty of me to step on that,
even if hurting them is not my direct design.
> My contention is that a large part of the reason that we have the
> crappy computers that we do today [...] is not technical, nor even
> primarily commercial or due to business pressures, but rather, it's
> cultural.
I share your lament.
> the culture was that Real Men programmed in assembler and the main
> battle was Z80 versus 6502, with a few weirdos saying that 6809 was
> better than either.
One thing that also keeps jumping out at me over and over is how I meet
people with the same kind of experiences you describe and they are often
much more skilled and better critical thinkers than folks I know from my
generation or younger. Don't get me wrong, there are plenty of young
shining stars, but they just don't seem to occur with the same frequency.
I have surmised that I am standing on the shoulders of giants going all
the way back to folks like Grace Hopper, and that it's more and more
difficult to grow in this field in the same way as the "old timers" (which
for me is anyone who worked in the industry before 1989, I realize it's
all relative). All "you guys" seemed to start out with a math or EE
background, and filling in the CS parts seems trivial for you. I look
up to your generation, believe it or not.
> The labs had Acorn BBC Micros in -- solid machines, *the* best 8-bit
> BASIC ever,
I'm a bit sad those never caught on in the States. They are neat machines.
> But a new wave was coming. MS-DOS was already huge and the Mac was
> growing strongly. Windows was on v2 and was a toy, but Unix was coming
For me, as a teen in the 1990s, I associated Unix with scientists,
engineers, and "thinkers" in general. I'd walk into somewhere to fix a
monitor or printer (I was a bench tech for a while) and the Unix guys
wouldn't want me near their stuff. They could fix it themselves and they
didn't want some punk kid who knew MS-DOS to touch their machines. Meanwhile
people I didn't respect (PHBs and other business-aligned folks) used beige
boxes running DOS. I knew I was in the wrong place.
> A new belief started to spread: that if you used C, you could get
> near-assembler performance without the pain, and the code could be
> ported between machines. DOS and Mac apps started to be written (or
> rewritten) in C, and some were even ported to Xenix.
Wasn't that kind of true, though? I've heard it said "C is nothing more
than a macro assembler". I'm a C programmer as you might expect. I'd
heartily agree. However, being a C programmer I also see C's warts. It's
not big on syntactic sugar, but it does get the job done in a
straightforward and pragmatic way. I do plenty of OO in other languages,
but I still prefer procedural & structured coding techniques most of the
time (as long as they effectively solve the problem). However, I'd reach
for an OO language or heavily abuse callbacks in C if I needed to do a
simulation.
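To make that concrete, here's a minimal sketch of what I mean by (ab)using
callbacks in C to get OO-style dispatch. The struct and function names are
just mine for illustration, not from any real simulation code:

    #include <stdio.h>

    /* A "class" is just a struct carrying state plus a function pointer
       that plays the role of a virtual method (the callback). */
    struct particle {
        double x, v;
        void (*step)(struct particle *p, double dt); /* swappable behaviour */
    };

    /* One behaviour: simple constant-velocity motion. */
    static void step_linear(struct particle *p, double dt)
    {
        p->x += p->v * dt;
    }

    /* Another behaviour, same signature: motion with crude damping. */
    static void step_damped(struct particle *p, double dt)
    {
        p->v *= 0.9;
        p->x += p->v * dt;
    }

    int main(void)
    {
        struct particle a = { 0.0, 1.0, step_linear };
        struct particle b = { 0.0, 1.0, step_damped };

        for (int i = 0; i < 5; i++) {
            a.step(&a, 0.1);  /* "method call" through the callback */
            b.step(&b, 0.1);
        }
        printf("a.x = %.3f, b.x = %.3f\n", a.x, b.x);
        return 0;
    }

Same procedural style, no language extensions, but the function pointer
buys you the dispatch you'd otherwise reach for a class hierarchy to get.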
> [...] Apple merged with NeXT and switched to NeXTstep.
I still like to show non-coders how so many OS X library calls and class
names *still* start with "NS" (for NeXTSTEP) -- NSString, NSObject, and
so on.
> Now, Windows is evolving to be more and more Unix-like, with GUI-less
> versions, clean(ish) separation between GUI and console apps, a new
> rich programmable shell, and so on.
It reminds me of Henry Spencer's famous line: "Those who do not
understand Unix are condemned to reinvent it, poorly." Windows 10's
inclusion of bash seems to me to be a white flag on Microsoft's part,
saying "You guys were right" -- especially after pushing PowerShell so
hard. Of course, I'm sure the devil's advocate would say "It's just
greater diversity in a massive constellation of functionality in
Windows."
> While the Mac is now a Unix box, albeit a weird one.
A very weird one. Most of my Unix zealot friends use Macs now. I still use
the console :-) If I must, then I'll use fluxbox in X11 on top of NetBSD.
> Commercial Unix continues to wither away. OpenVMS might make a modest
> comeback.
Go VMS. I hope VSI can pull it off. I also hope they change their
licensing terms to be less draconian. IMHO, that's what hurt them so badly
in the 90s after Ken Olsen left. They wanted an LMF (License Management
Facility) license for every little bit of Tru64 and VMS. There is the
hobbyist program, yes. However, you
don't need to mess with that to download Linux, and that's still an
accessibility gap.
> IBM has successfully killed off several efforts to do this for z Series.
In order to prevent your goose that laid the golden egg from losing her
value (as a dead or emulated goose), you need to kill the goose hunters.
:-)
> So now, it's Unix except for the single remaining mainstream
> proprietary system: Windows. Unix today means Linux, while the weirdoes
> use FreeBSD. Everything else seems to be more or less a rounding error.
Color me weirdo and rounding error since I mainly use NetBSD. It doesn't
change the truth of your statement, though. :-)
> Even the safer ones run on a basis of C
IMHO, C is the most portable language in the world, not Java. IIRC, many
Java runtimes are still written in C or C++. Any time someone makes a new
processor, one of the first things they do is port a C compiler, which
makes a lot more stuff possible.
> Perl has abandoned its base, planned to move onto a VM, then the VM
> went wrong, and now has a new VM and to general amazement and lack of
> interest, Perl 6 is finally here.
I started learning and using Perl until all this weirdness happened and
they bolted on OO, etc. I'm not anti-OO, but it was all just too much for
me. I bolted to Ruby and Lua. No disrespect to the Perl Monks, I just
couldn't hang anymore.
> So they still have C like holes and there are frequent patches and
> updates to try to make them able to retain some water for a short
> time, while the "cyber criminals" make hundreds of millions.
I mostly agree, but I would like to say a few things about security issues
and C.
* Buffer overflows and format-string exploits are C's biggest security
  liabilities. They aren't nearly as common as they were. There is
  greater awareness among programmers now, and better support from
  compilers and scanning tools (Valgrind, RATS, etc.). Plus the exploit
  writers have found more fertile ground in things like SQL injection
  and CGI interfaces.
* They are pretty easy to prevent most of the time, especially if you
  care enough to check, or use something like stack-smashing protection
  in your compiler (see the sketch after this list). Also, things like
  address-space and heap randomization have made exploits much harder.
* Yes, they still happen, and still *can* happen, so I don't dispute that.
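To make the first two bullets concrete, here's a tiny made-up sketch of the
classic mistakes next to the boring fixes (the function names are mine,
purely for illustration):

    #include <stdio.h>
    #include <string.h>

    /* The classic holes: unbounded copy and user input used as a format. */
    void risky(const char *input)
    {
        char buf[32];
        strcpy(buf, input);   /* overflows buf if input is > 31 chars  */
        printf(input);        /* format-string hole: "%x %x %n" etc.   */
    }

    /* The boring fixes: bounded copy and a fixed format string. */
    void safer(const char *input)
    {
        char buf[32];
        snprintf(buf, sizeof(buf), "%s", input);  /* truncated + terminated */
        printf("%s\n", input);                    /* input is data, not format */
    }

    int main(void)
    {
        safer("hello from a well-behaved caller");
        /* risky() is left uncalled on purpose; it only shows the pattern. */
        return 0;
    }

On GCC or Clang, adding something like -Wformat-security and
-fstack-protector-strong to the build catches or blunts a lot of the
rest, which is the "care enough to check" part.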
> Anything else is "uncommercial" or "not viable for real world use".
I think this is a phrase I often throw around a little too cavalierly.
Just because a language isn't popular doesn't mean I can't learn something
from it.
> Borland totally dropped the ball and lost a nice little earner in
> Delphi, but it continues as Free Pascal and so on.
It boggles me, actually. There were some awesome Delphi coders out there.
I thought they'd never be derailed, because they were actually *very*
effective coders I'd seen really powering certain businesses. Did Borland
hork things up or what?
> Apple goes its own way, but has forgotten the truly innovative
> projects it had pre-NeXT, such as Dylan.
And amazing things like Amoeba, Sprite, MOSIX and others have also sort of
dried up and died on the vine. There were some great ideas there.
> The Lisp Machines and Smalltalk boxes lost the workstation war. Unix
> won, and as history is written by the victors, now the alternatives
> are forgotten or dismissed as weird kooky toys of no serious merit.
I also get that things are lost in this type of "war" -- the merits and
interesting sides of LISP machines, etc. It's not a good thing.
> CP/M evolved into a multiuser multitasking 386 OS that could run
> multiple MS-DOS apps on terminals, but it died.
Hmm, was that before or after things like DESQview came along? I'd
probably guess that's why, or it was just the momentum DOS had for a while.
> So the hacked-together GUI for DOS got re-invigorated with an
> injection of OS/2 code, as Windows 3. That took over the world.
Which was hard for me to believe, too. It must have been a cost thing.
Folks could have had an Amiga, ST, Acorn, Mac, OS/2 (depending on how far
back), a low-end Unix box, etc. However, I guess ultimately people wanted
whatever they could go to their local software shop and get software for.
It still is a mystery to me why DOS was so popular.
> But the marketing men got to it and ruined its security and elegance,
> to produce the lipstick-and-high-heels Windows XP. That version,
> insecure and flakey with its terrible bodged-in browser, that, of
> course, was the one that sold.
I know, man, there is no accounting for taste.
> Linux got nowhere until it copied the XP model.
Now, as you also allude to, they have started to copy Windows. Linus
seems to have shifted his attitudes greatly. Check out what he said about
GGI and then what he says about systemd:
From:
http://marc.info/?l=linux-kernel&m=89089527200744&w=2
"I don't see the world in black-and-white. I don't actually like
Linux-only features unless they have a good reason for them, and I
really like Linux to be a "standard" system " -Linus 1998
" I'm distrustful of projects that do not have well-defined goals, and
well-defined interfaces. They tend to bloat and do "everything" over
time. This is what gives us horrors like GNU emacs and Mach: they
don't try to do one thing well, they try to do _everything_ based on
some loose principle [..]" -Linus 1998
Now fast forward to 2015:
"I have to say, I don't really get the hatred of systemd. I think it
improves a lot on the state of init, and no, I don't see myself getting
into that whole area." -Linus 2015
I'll leave it to readers to decide whether that's hyperbole or whether
there is something real there. It'll probably split right down the
fracture line between systemd haters and advocates, I suppose. It still
seems pretty emblematic to me.
> [...] plumbing, huge complex systems, but it looks and works kinda
> like Windows and a Mac now so it looks like them and people use it.
My theory is that once folks woke up to the potential the Internet had
for improving their real lives, they didn't care how much they polluted
the computing world to get access to that power, and they weren't about
to put up with learning anything new if they didn't have to.
> Android looks kinda like iOS and people use it in their billions.
> Newton? Forgotten. No, people have Unix in their pocket, only it's a
> bloated successor of Unix.
My personal opinion is that though those devices might give devs a taste
of *some* of the power of Unix, none of those devices show the *user* any
of the so-called "Unix philosophy" (KISS, everything is a file, etc.).
What's also sad is that those users don't give a hoot. However, I'm not
really surprised. I grew up in an era when computers were NOT cool for
kids. If you liked them or wanted to play games on a computer, that made
you a "nerd" or "geek" when those words were purely pejorative. Now
those
same people can't look up from their phones long enough to keep from
falling down stairs, walking in front of subway trains, or uhh.... living.
I still carry a Symbian phone since I find both Android and iOS so
invasive and annoying. It's a strange shake-up of the world I grew up in.
> The efforts to fix and improve Unix -- Plan 9, Inferno -- forgotten.
Plan 9 is still being (very slowly) developed. Ken is still involved last
I heard.
> We have less and less choice, made from worse parts on worse
> foundations -- but it's colourful and shiny and the world loves it.
> That makes me despair.
Right there with you, Liam. Someone posted today about the good parts of
the Internet that we got with the deal. Massive communication and
documentation are truly positive side effects for the most part. Still,
I suppose I have mixed feelings.
> We have poor-quality tools, built on poorly-designed OSes, running on
> poorly-designed chips.
Yes, and even though there is *more* overall documentation on the
Internet, the docs you get with hardware and tools are nowhere near as
good as they were in the 80s AFAIK. Nobody ships manuals with source code
and schematics. The last I saw of that was BeOS and The Be Book.
> Occasionally someone comes along and points this out and shows a
> better way -- such as Curtis Yarvin's Urbit. Lisp Machines re-imagined
> for the 21st century, based on top of modern machines. But nobody gets
> it, and its programmer has some unpleasant and unpalatable ideas, so
> it's doomed.
Well, when it comes to that front, I've been in this business for about 20
years now, and I don't understand those efforts.
> And the kids who grew up after C won the battle deride the former
> glories, the near-forgotten brilliance that we have lost.
I see that, and I can better appreciate where you are coming from when
you couch it this way. Only the good die young. "Some rise by sin, and
some by virtue fall." -Shakespeare
> I apologise unreservedly for my intemperance. I just wanted to try to
> explain why I did it.
I completely understand. I am sorry for associating LISP with some crappy
experiences I had in school 20 years ago. You and other folks come from a
noble tradition. It was wrong of me to scorn that, even for an unrelated
reason.
-Swift