I think all of us have our own personal view on "when computing had its
golden age". For me, 1984 and the Sinclair QL marked the peak of the
microcomputer (as opposed to the IBM PC & clones). For you, it must have
been around 197<mumble>.
Hard to say... Actually, 1979-1983, my Pr1me Days...
The beginning of the end. I knew it then, and I was proved right.
I dispute that: Computers and computing go from strength to strength.
There's more than just PCs out there; the mighty mainframe still rules the
roost in many places, and there are Apple Macs, VAX minis, Crays, and probably
many others I can't even think of. And, for the soldering-iron fans,
embedded computing is probably stronger than it ever was - *everything's*
got a computer or three in it...
Computing today is nothing if not diverse.
Again, it's nice to have fast, cheap computers, but I for one would have
been just as happy for the next 20 years having fast, cheap TERMINALS
to hook to the mainframes. And the continued high cost of entry would
have kept an entire generation of self-taught (and poorly so) programmers
from coming into existence, programmers who have cranked out, and continue
to crank out, some of the worst software imaginable. In the halcyon days,
most of the bad code was written by the lusers themselves...
That's a bit elitist, isn't it? Besides, most of the self-taught
programmers of whom you speak are not really programmers; they're merely
users with enough knowledge to be dangerous. And if it weren't for the
microprocessor and all that it begat, this list wouldn't even be here...
No; I have worked with these people. Most of them learned how to
program before they had a chance to take a rigorous college course.
I know this is anecdotal, but take one young man I worked with. He'd
learned to program in high school, picking up some fragmentary knowledge
from the math teacher and teaching himself the rest of the way. Then he
got to college, where they tried to teach him structured programming.
He dismissed structured programming completely because "it slows down
both the program and the programmer". While that is potentially true,
it ignores the truth (at the time, less so now) that more labor is spent
maintaining code than initially writing it.
And as to lusers with a little too much knowledge... yeah,
they can be a problem, and a LART's not always at hand...
But as this list is dedicated to hardware that ranges from an
Imlac-1 (I think that was the oldest reported here recently)
to something like a Mac IIci (1991), it might have a smaller
readership sans micros, but I'd bet there'd be sufficient
interest to have the list.
And if you meant we'd not have the Internet if the micro hadn't
come about, I'd have to dispute that. It would simply be a
slower and less saturated Internet...
Easy access to fast, cheap computers drove the genesis of an entire
generation of self-taught programmers who didn't give a whit for
structured programming or anything else that resembles a methodology,
and who single-handedly changed the expectations that managers have
about how quickly things get done. Sure, RAD helped speed programming
along, but not nearly as much as just cutting corners... which the PC
made easier... damn, I feel a song coming on again:
It wasn't the PC that made cutting corners easy; it was the near-universal
use of BASIC - a fundamentally unstructured language - that was responsible
for the bulk of the "bad programmers"; and I say that as a professional
programmer who uses BASIC...!
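For anyone who never had the pleasure, here's a minimal sketch of what the
early line-numbered BASICs pushed you toward (generic old-style BASIC, not
any particular dialect): no WHILE loop, no named procedures, every variable
global, and all control flow via GOTO/GOSUB to bare line numbers. The line
numbers below are BASIC syntax, not annotations:

    10 INPUT "HOW MANY SQUARES"; N
    20 LET I = 1
    30 IF I > N THEN GOTO 70
    40 GOSUB 100
    50 LET I = I + 1
    60 GOTO 30
    70 PRINT "DONE"
    80 END
    100 REM "SUBROUTINE": NO NAME, NO PARAMETERS, READS GLOBAL I
    110 PRINT I; I * I
    120 RETURN

Even this tidy little loop is held together by jumps to magic numbers;
stretch that out to a few thousand lines and you get the spaghetti in
question.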
Well, you won't get much disagreement from me here... but I've seen
COBOL code that was more spaghetti'd out than the worst BASIC I've
seen...
Maybe if PASCAL had been the language du jour, today's self-taught
programmers would be better at it...
Overall, yes, but to paraphrase my Data Structures prof, Pascal sucks
when you limit yourself to the FORTRAN subset...
No, not only will I not celebrate it, but I need to find a black armband
to wear the rest of the month.
IMHO, no. The PC had to happen; it was just a case of who got lucky (or had
the best marketing). At the end of the day, the PC offered unrivalled
expansion possibilities, a comparatively friendly OS (Gates did well to
poach DOS), and good flexibility thanks to the lack of built-in anything.
Actually, the moment I saw the Apple II, my thought was:
"Ten years too soon. We need ten years to figure out what we
can really do with these damn things..." Ten years to develop
real operating systems, job control languages, interfaces, etc.
Personally, I'd have liked to have seen an MC68000-based machine become
today's PC (mainly because I'd already learned assembler on the QL). No
doubt Commodore fans would have preferred the C128 or Amiga to "grow up"
into the PC.
Might have been a marginal improvement... it's a nicely orthogonal
processor...
Well, I'm off to dabble with my CBM PET, or maybe the MZ-80K. They're fun,
but I wouldn't like to have to use them every day, day in, day out...
Cool, don't let my rant affect your fun!
-dq