On Wed, 22 Jun 2016, Chuck Guzis wrote:
You don't seem to realize how expensive these
systems were to operate.
I'm sure I don't (but hey! give me credit for trying). I'm betting that
even "expensive" systems of today (barring ultra-massive HPC SSI rigs) are
cheap by comparison in inflation-adjusted dollars. I'm 41. I was born in
the mid-70's and it was awfully time consuming afterwards learning
prerequisites to my computing skills, such as they are. Things like
talking, correct potty techniques, etc. :-) Time I could have spent
mainframin' if I'd only been taller. :-)
Plus, I have all kinds of different experiences "coming up", as it were.
Somewhere near my moment of conscious singularity, it was the age of the PC
and the micro. Sure, mainframes were (and kinda still are) going strong.
Even the guys who lived it in the 1960s didn't grow up with
computers around the house, I'm betting. So, it's a bit of a shift in our
thinking.
IIRC, the *internal* COMSOURCE price charged for an
hour of block time
was about $500 (that's 1970s dollars, thank you--when $3K would buy you
a nice imported sports car).
Hmm, and you'd want an imported one if it was the 1970s. Ugh. The drugs
might have been good and the women more friendly, but the cars were a drag
vis-a-vis the 1960s, OMG... What happened? The 1980s weren't too much
better (but I have a soft spot for Fieros at least).
There *was* a PP program called o26 (after the
keypunch) but used mostly
by CEs and systems installation people. Using the console of such an
expensive machine by ordinary users would have been viewed as something
akin to feeding $20 bills into a regular keypunch.
Hmm, I totally didn't consider that, but certainly seems like the kind of
way people would act around something so valuable and expensive. I guess
my "altar of the console" phrase wasn't so bad, then (ie.. half true).
Remember, this was in the era of "glass walled
rooms", where only
selected people were allowed to touch the machine.
Well, I started with computing very early, too. So, nowadays when I see
those rooms I'm normally in the priesthood and am _expected_ to go fix
some very expensive systems (I've put hands on production Origin 3k's, a
couple of UNICOS based Cray systems, lots more). So, I kinda forgot about
the outsider perspective, too. With many fewer systems around to work
with, I'm sure the glass-walled sort of exclusivity got that much more
serious.
I once made the mistake of trying to mount my own tape
at a military
base. I thought that I was going to be tackled and led away by MPs.
They have people stationed by each bank of tape drives who do nothing
but that all day. They don't take kindly to someone trying to take
their job.
LOL, there's initiative and there's orders. You had initiative. They
had orders.
You don't understand. Big iron did multitasking
and multiprogramming,
but the I/O media was cards, tape and printed output. Going "online"
was expensive and slow in terms of equipment.
Hmm, but wouldn't being able to multitask and multiprogram lead directly
to wanting multiple terminals, card readers, tape drives, etc.?
Then wouldn't someone have to be responsible for setting process priorities
and managing the scheduling? I told you I was an igmo on this. All I
have to go on are sketchy internet archives that don't give me a sense of
the real way these systems were used by real people. The only other
source about mainframes I had growing up was my grandmother, whom I
worshiped as a child and who worked on IBM gear. However, who knows what
kind of kid-brained notions I have grown into some personal fiction since then.
The rolled-out job didn't lose its files or place
in the running job
queue; it just got represented by a placeholder bit of memory (usually
the exchange package) and then read back in when its turn came up.
Ohhh, okay. That makes sense.
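Just to check my understanding, here's a toy model of roll-out/roll-in I
sketched for myself (all the names are mine; nothing here resembles the real
monitor's data structures): the job image goes out to backing store, a small
placeholder holds its spot in the queue, and rolling back in restores the
image unchanged.

```python
# Toy roll-out/roll-in model, purely my own sketch of the idea above.
backing_store = {}   # stands in for disk/drum

def roll_out(job_id, memory, queue_slot):
    backing_store[job_id] = memory        # whole image out to storage
    queue_slot["placeholder"] = job_id    # keeps its place in the queue
    queue_slot["rolled_out"] = True

def roll_in(queue_slot):
    queue_slot["rolled_out"] = False
    return backing_store.pop(queue_slot["placeholder"])  # image back intact
```

The point being that the job never loses its queue position: only the bulk of
its memory leaves core while it waits.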
Sometimes the non-classified parts were repurposed.
Some were, yes, and you cite some great examples. I'm just saying I'm more
in favor of dumping bazillions of dollars on projects that more or less
default into the public domain, rather than defaulting to top secret, where
someone has to justify having them released or shared. SDI in the 1980s
sticks in my craw, I suppose. Unless of course they were secretly very
successful and, in that case, I say "awesome"! I am strongly in support of
not getting nuked!
Who do you think Seymour Cray sold to as his initial
customers for his
boxes?
Fair point. I'm not anti-military, but the NSA I'd like to see behave more
responsibly.
There was a lesson to be learned from it--give the I/O
to another,
cheaper, machine. You'll see that philosophy in the later Cray
machines.
I've noticed with really large systems they have some very interesting
points of cleavage (just like their sales force, hehe). They split things
up in ways that micros just aren't scaled for. It's one of the things I
find so interesting about them (same goes for the sales force *grin*).
The 6000 had no carry bits; for that matter, it had no
condition codes
at all. If you think about it, that makes instruction scheduling much
easier because you don't have to worry about instruction side-effects.
Carry bits aren't a hack I'm particularly married to anyway, just
something to use when needed, 'cause the hardware wants it that way. That
sort of thing can be done in lots of ways, as this CDC box illustrates.
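For instance, here's one of those "lots of ways" as a toy sketch of my own
(not real CDC code, and the names are made up): a multi-word add on a machine
with no carry flag, where you recover the carry with an ordinary compare.

```python
# Toy sketch: a 120-bit add built from two 60-bit words, with no carry
# flag anywhere. All names are mine; nothing here is real CDC code.
MASK60 = (1 << 60) - 1  # one 60-bit machine word

def add120(a_hi, a_lo, b_hi, b_lo):
    lo = (a_lo + b_lo) & MASK60
    carry = 1 if lo < a_lo else 0        # wrap-around means a carry out
    hi = (a_hi + b_hi + carry) & MASK60
    return hi, lo
```

The compare does the job the carry bit would have done, at the cost of one
extra instruction.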
B-registers can be queried by a single compare and
branch instruction.
That saves some code. I think I've seen other platforms that have similar
features, but I'm only a hack at ASM in general. There are so many clever
things I've seen people do with 68k ASM (the only one I can really get
anywhere with) that sort of blew my mind, to the point that I have no idea
how they ever got the idea for some of these algorithms or "tricks" in the
first place. Perhaps at the foot of a Zen master somewhere? There was a
guy I knew who went by "Sauron" in the 1990s... He was like Master Po.
It also helps that the 6000 is a three-address
architecture, so you
needn't clobber a source register when testing contents.
... and another 2-3 lines saved! It does add up, though.
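The way I picture the saving (hypothetical register names; this is just me
sketching, not actual 6000 code): on a machine whose subtract clobbers a
source operand, testing x < y while keeping x means a save first, whereas a
three-address form writes the result to a scratch register and the sources
survive the test.

```python
# My sketch of the three-address case, with made-up register names.
regs = {"X1": 7, "X2": 9, "X6": 0}

# three-address: X6 = X1 - X2; the sources X1 and X2 are untouched,
# so no save/restore instructions are needed around the test.
regs["X6"] = regs["X1"] - regs["X2"]
branch_taken = regs["X6"] < 0          # "branch if result negative"
```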
To elaborate a bit, you could divide a word into
sixths, halves and
thirds (6, 18 and 12 bit) parcels and operate on each of them. So not
really a "byte".
Ah, okay. They had specific factors, then.
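So if I follow, pulling a parcel out is just shifts and masks. Here's a
little sketch of my own (my function name and layout, nothing official), for
parcel widths that tile a 60-bit word evenly, like the 6-bit and 12-bit
cases:

```python
# My own sketch of slicing fixed-width parcels out of a 60-bit word,
# most-significant parcel first. Only widths that divide 60 evenly
# are handled (6 and 12 do; 18 would leave a remainder).
def parcels(word, width):
    assert 60 % width == 0, "width must divide 60"
    n = 60 // width
    mask = (1 << width) - 1
    return [(word >> (width * (n - 1 - i))) & mask for i in range(n)]
```

So a word yields ten 6-bit parcels or five 12-bit ones, rather than the
8-bit bytes micros scaled me to expect.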
It was a necessary evil with long product lives, many
people, etc.
"Egoless coding". Try doing something else and you wind up with Linux
code eventually.
Agreed (sadly).
Miles of code without a clear comment on what's
happening--or why.
A lack of comments doesn't bother me as much as one-letter variables and
names over-compressed to the point that I can't figure out
what they were even getting at. People often laugh when they see my code
because of how long and descriptive the actual code is. Sometimes they
complain that it makes it slower to slog through typing everything. I will
trade code-clarity for (almost) everything else.
Worse, yet, stupid comments. I once knew a young
programmer who was
quite proud that *every* line of his code had a comment on it--even if
he never explained what he was doing.
That would be entertaining for the first few minutes, though!
-Swift