On 06/04/2013 09:13 PM, Liam Proven wrote:
>> But it has to be BETTER.
> The big question is, does it?
Well I suppose I should've qualified that. Yes, it has to be "better",
but
that's defined in different ways for different people. For most people
today, I say through gnashed teeth, it means "cheaper". For people in upper
management of most companies, it means "whatever will get the salesman to
give me those tickets to the big game".
To me, along with a disappointingly, vanishingly small list of other people,
it means "does the job better". That can be in terms of performance,
reliability, manageability, or (more commonly) a combination of those things.
(you know all this; I just wanted to spell out my thoughts)
> Is Unix better than Lisp Machines were? Is Solaris better than Linux?
> Are cheap x86 Linux servers better than SPARCs with Solaris or
> RS/6000s with AIX? Is Unix better than OpenVMS? Is running your own
> servers better than outsourcing it to the "cloud"?
> I would not answer an unambiguous "yes" to any of these.
An *unambiguous* yes? Of course not. Neither would I.
"Better" is a composite, which comes out of
lots of factors - in this
context, fitness for purpose; performance; power usage; cooling
requirements; expected service life; and of course, the big one and
increasing, cost.
Yes, reference what I typed above.
> At the end of the day, in most things, cost tends to win out.
Yes. *grumble*
> I'd love to run Sun x86 workstations, or Macs, as my PCs, but I can't
> afford to. I have PCs to earn me money, not as a hobby, so I run the
> absolute cheapest kit I can and try to make sure that I always have at
> least one other fallback box to go back to in the event of a system
> failure. A fallback PC desktop, a Mac too, a fallback notebook, etc.
> Cheaper.
This is certainly a valid approach in my view. I don't think it's the
*best* approach, and it's not what *I* do, but I can't see myself faulting
you for it, because it works for you.
My approach is a bit different, and is summed up by the old line, "I'm not
rich enough to use cheap tools."
Yes, I'm an idealistic perfectionist, but I can function that way because
I'm very lucky to have the knack of being able to find amazingly good deals
on top-end hardware, be it HP/Tektronix/Fluke test equipment, Sun/HP/etc
computers, Metcal soldering equipment, etc. It's not impractical to use
top-notch stuff if you can get it without paying list price, and it pays in
the long run. I get lucky, and I appreciate that because I know that not
everyone else does.
>> Do not attempt to "lead" me in this manner again. It is childish and
>> petty, and it is a waste of your time and mine. Up until now I've
>> derived *some* degree of enjoyment from our debates...but when you
>> start acting like your predictions are automatically foregone
>> conclusions, and then you try to "lead" people in this way, it just
>> becomes little more than infuriating.
> I should not have said anything. I have been trying to learn to
> improve my debating style, but I find it unexpectedly difficult. I
> qualified as a TESOL (TEFL) teacher in February and in so doing I
> learned an awful lot about presentation techniques and so on. Some of
> them I find have application in everyday life.
I understand. No worries.
> Your approach to conversation and debate, Dave, is /exceptionally/
> confrontational and quickly and easily turns hostile. You are very
> fast to get personal, to go for the "hey buddy, I've been doing this
> for years, don't you try to tell me" response.
Yes. I acknowledge that this is a failing of mine. I am the most
"trollable" person around. But here, there are good reasons for my reactions.
As you well know, this forum is a common-interest one based on classic
computers. It is not a list of technical professionals, industry-experienced
engineers, or even people who have ever used what we consider to be a classic
computer, or any other kind of computer. It's just people who are interested.
And as it turns out, very few of the people here have actually used these
"classic" computers before this became a recognized hobby. Even fewer have
used them when they were current technology. They weren't THERE, seeing and
using them in "real life".
Also, very few here have actually written software or designed hardware.
Very few have designed a large-scale computer network, or even a small-scale one.
I do not consider this to be a "fault", or anything even approaching a
negative thing. It's just the way it is. One needn't have done any of the
above to have an interest in, or a love for, classic computing, and thus be
able to contribute meaningfully to this community.
I *have* done these things. Now, before you take that as a blatantly
egotistical assertion...I know I'm not the most knowledgeable person here,
and I know I'm FAR from the most experienced. There are a few old-timers
here whose experience and expertise I practically worship (they likely don't
even know it), and I learn something from practically every one of their
posts. (I wish my own S/N ratio were as high! ...something to aspire to)
But a n00b I am not. I've done these things, I do them now, I'm a qualified,
respected professional in this industry and I'm proud of my work.
But this being a forum in which many (most?) people do NOT have that level
of experience, I feel as if some people here (you in particular, when we get
into these things) are talking to me as if I'm one of the ones who wasn't
actually THERE, and who doesn't actually DO this stuff as anything more than
a hobby. Lumping me in, as it were. That bothers me. It probably shouldn't,
but it does.
So my "bludgeoning" reactions (thanks to Ken for that great image ;)) are
somewhere between:
"Yes, thank you, no need to explain further, I already know that."
and
"Don't try to feed me that line of bull, boy, because I've been there and
done that."
...depending on the situation and, more specifically, the presentation.
In particular, toward the second reaction, people have been trying to tell
me that "serious computers" are "going away" for a very long time. We
had a
386 box in the lab where I worked at DoD ~23 years ago, in a roomful of SGIs,
and a couple of the guys just would not stop talking about the "SGI
dinosaurs" that would be "replaced by PCs any day now". Those
"dinosours"
were brand-new, top-of-the-line visualization workstations with $50K price
tags, and that's where all the work got done...because they were the best
tools for the job, being used by scientists who understood why.
Yes, PCs grew up (a bit) and replaced the SGI workstations...but fully
TWENTY YEARS LATER. That's a long time. And SGI eventually became
irrelevant not due to PCs being better in some way, but to their own
management being absolute idiots.
> I was thus trying something different, something I got from Dale
> Carnegie - rather than telling you what my view was, to try to draw
> you out and get you to go to the same place on your own, because as
> far as I can tell, you are the sort of person who will instantly
> rebuff anything you're just told, just by innate reaction, even if it
> is actually something that you personally would normally agree with.
Huh?? Wow, I'm surprised that you see me that way. I don't think that way
(at least not consciously) at all. I generally embrace new things, I love
learning, and I have no issue with being proven wrong. But if someone tries
to tell me something that I know damn well to be patently false, then yes, I
react poorly.
> You are possibly the most in-your-face person on this whole list, and
> I was trying to avoid provoking your very common "I know better than
> you so STFU" style of response. Instead, by admitting what I was
> doing, I've evoked it a different way. That's a shame and I have
> blown it. :-(
As above, no worries, I understand.
> Not sure about that. I am seeing a general overall trend towards
> commoditisation in all aspects of computing, and I do mean all. If
> there are exceptions - and I am not saying that there are not - then
> they are in areas that I don't know about. But there are lots of
> those!
Of course; we've all seen that trend. I think, though, to consider it to
be 100% universal would be a mistake. Lots of people shop at Wal*Mart, but
not everyone puts up with the garbage.
Oh yes, you're a Brit...Wal*Mart is, by far, the largest retail chain here
in the US. They have muscled themselves into every town and forced every
other store out, and they carry the absolute worst, cheap Chinese plastic
garbage you'll ever see. And they sell it the cheapest, so people buy it.
And they don't seem to mind when they have to replace it when it breaks.
They don't understand the false economy. And when Wal*Mart is the only store
left, if they don't have what you need, you're screwed.
And most people just don't get it.
>> That may very well be. But if it does happen, it's a long way off.
>> My new Core i7 Quad is now JUST able to keep up with my
>> eight-year-old Sun V480 that I just decommissioned. At almost three
>> times the clock rate.
> Fascinating. For what kind of workloads? What do you ascribe this to?
High process and thread granularity server workload, mostly network I/O,
effectively zero floating point. I ascribe it to a balanced architecture
that was *designed* rather than slapped together. More specifically, I
ascribe it to excellent memory bandwidth, multiple I/O paths, no significant
bottlenecks, and a kernel that handles threading better than any other OS
I've seen.
Yes, that last item is an OS thing, not a hardware thing, but I treat this
at the "system" level, including the OS.
> ARM is in the ascendant now because performance:Watt is becoming a
> decisive factor, and the humble Raspberry Pi shows that it can still
> score in price:performance too. But in raw performance, it hasn't got
> a hope and I don't think it ever will again.
I'm right there with you on that. It's "better" than some machines
based
on metrics other than performance.
-Dave
--
Dave McGuire, AK4HZ
New Kensington, PA