On 4 June 2013 23:29, Dave McGuire <mcguire at neurotica.com> wrote:
On 06/04/2013 03:40 PM, Liam Proven wrote:
> Of course. But why I said what I said above, out of exasperation, is that
> you call architectures that are currently developed, sold, and used, and have
> been for decades, with no end in sight, "failures". That's insane.
They went up against Intel. They used to come in a wide variety of
machines: low-end to high-end desktop, laptop, small server, big
server.
They don't any more. Now they are high-end or nothing.
...which is where they belong. Sun *workstations* aren't needed anymore
because cheap PeeCee hardware actually has usable graphics now. They didn't
back then. They didn't "go up against Intel" at all...they owned that
market, because of graphics capabilities, and when cheap PC hardware could do
it, it did.
OK, good. That's exactly the direction I wanted you to go.
*sigh* You have more free time than anyone else I know.
Yeah, I spend way too much time online. :(
So, my next question is: what if (or more to the point, when) cheap
PC hardware delivers the same features & performance that SPARC and
POWER do for servers now?
If and when something BETTER comes along, they will be displaced, of
course. This is textbook first-year economics.
OK, right.
But it has to be BETTER.
The big question is, does it?
Is Unix better than Lisp Machines were? Is Solaris better than Linux?
Are cheap x86 Linux servers better than SPARCs with Solaris or
RS/6000s with AIX? Is Unix better than OpenVMS? Is running your own
servers better than outsourcing it to the "cloud"?
I would not answer an unambiguous "yes" to any of these.
"Better" is a composite, which comes out of lots of factors - in this
context, fitness for purpose; performance; power usage; cooling
requirements; expected service life; and of course, the big one, and
an increasingly dominant one: cost.
At the end of the day, in most things, cost tends to win out.
Clueless outfits will buy the cheap crap, just as
they do now, and just as they always have, since it has been available.
Sure, yes.
But where build quality and reliability matter, you won't see any
eMachines (or similar) sitting on rackmount shelves in datacenters
anytime soon.
No. But these days, they're absolutely full of Dells. Dell is not a
byword for quality in my mind; is it to you?
Google, of course, run more servers than anyone anywhere, and they,
famously, use bare generic motherboards in specially-designed trays in
racks and design their systems for very high redundancy and
failure-tolerance. They've run the numbers and found this is the most
economical option.
Few are brave enough to follow them, not yet - but more and more
businesses are finding it economical to just rent capacity on demand,
in the cloud, and they have no real idea what their workloads are
running on. It could be quality kit, it could be bare boards in a rack
drawer. You don't know, but if the price is right, you buy it.
I'd love to run Sun x86 workstations, or Macs, as my PCs, but I can't
afford to. I have PCs to earn me money, not as a hobby, so I run the
absolute cheapest kit I can and try to make sure that I always have at
least one other fallback box to go back to in the event of a system
failure. A fallback PC desktop, a Mac too, a fallback notebook, etc.
Cheaper.
Do not attempt to "lead" me in this manner again. It is childish and
petty, and it is a waste of your time and mine. Up until now I've derived
*some* degree of enjoyment from our debates...but when you start acting like
your predictions are automatically foregone conclusions, and then you try to
"lead" people in this way, it just becomes little more than infuriating.
I should not have said anything. I have been trying to learn to
improve my debating style, but I find it unexpectedly difficult. I
qualified as a TESOL (TEFL) teacher in February and in so doing I
learned an awful lot about presentation techniques and so on. Some of
them I find have application in everyday life.
Your approach to conversation and debate, Dave, is /exceptionally/
confrontational and quickly and easily turns hostile. You are very
fast to get personal, to go for the "hey buddy, I've been doing this
for years, don't you try to tell me" response.
I was thus trying something different, something I got from Dale
Carnegie - rather than telling you what my view was, to try to draw
you out and get you to go to the same place on your own, because as
far as I can tell, you are the sort of person who will instantly
rebuff anything you're just told, just by innate reaction, even if it
is actually something that you personally would normally agree with.
You are possibly the most in-your-face person on this whole list, and
I was trying to avoid provoking your very common "I know better than
you so STFU" style of response. Instead, by admitting what I was
doing, I've evoked it a different way. That's a shame and I have blown
it. :-(
Of course that may be what you're going for. But your writing is good
enough, and your OPINIONS are well-thought-out enough that I'm sure you're
not a complete idiot...so I doubt you're just so bored as to spend your time
riling people up on mailing lists. So I guess I just don't know what to make
of you and your motivations.
I don't really have any single identifiable set of motivations. I'm interested in
classic computing and I find this list to be one of the most
interesting places to read that I am regularly on. I find a lot of
stuff out here. Some of it, occasionally, is even useful. I've also
met a few people IRL, which has been a great extra bonus - they've all
been fascinating chaps & I enjoyed meeting them.
Be careful; that "niche" is where a lot of
heavy lifting gets done.
Yes indeed. It's turning into a commodity market, which tends to mean
lowest-common-denominator kit.
Except where it matters.
Not sure about that. I am seeing a general overall trend towards
commoditisation in all aspects of computing, and I do mean all. If
there are exceptions - and I am not saying that there are not - then
they are in areas that I don't know about. But there are lots of
those!
We'll see what happens in 5Y, and I suspect (and hope) that you and I
will be there to discuss it, but those machines have been there for a
very long time, and I don't see them going anywhere.
Not overnight, no. But I foresee gradual shrinkage. A slow death, just
like Itanium.
That may very well be. But if it does happen, it's a long way off. My new
Core i7 Quad is now JUST able to keep up with my eight-year-old Sun V480 that
I just decommissioned. At almost three times the clock rate.
Fascinating. For what kind of workloads? What do you ascribe this to?
If anything causes "slow death" anytime soon, it's not likely to be
PeeCees. It might be ARM, but that'd be a ways off too, for that level
of performance.
I bought into ARM in about 1989 because at that time the
price:performance was absolutely unrivalled. My £800 2nd-hand
Archimedes A310 was /considerably/ faster than the top-of-the-range
machine my work sold at the time, an IBM PS/2 Model 70 386DX desktop
running at 25MHz with /secondary cache/ (woo!).
The PS/2 cost about £10,500 excluding optional extras such as a
keyboard and a copy of MS-DOS 3.3.
Over 10× the price (OK, new versus used, but still, around that new vs
new) and somewhere around a quarter down to an eighth of the
performance. And the PS/2 didn't have a 387 in, so we're not including
FP, just plain integer performance. An 8MHz ARM2 with slow DRAM ran
*rings* around an 80386DX with cache and expensive quick RAM.
So, yeah, back then, ARM was a performance king. At the time - I did
check, but I am not sure of my prices >25y later - the next box I
could find with integer performance similar to my Archie was a Sun
workstation costing well over £20,000, possibly up to £30K or even
£40K.
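Just as a rough back-of-envelope, purely illustrative and using only
my own figures above (the 4x-8x integer-speed advantage is my estimate
from memory, not a benchmark), here it is in Python:

    # Price:performance of the A310 vs the PS/2 Model 70, using the
    # figures quoted above; the speed advantage is an estimate.
    archimedes_price = 800     # GBP, second-hand A310
    ps2_price = 10_500         # GBP, PS/2 Model 70 386DX/25, base unit

    price_ratio = ps2_price / archimedes_price   # ~13x the price
    for speed_advantage in (4, 8):               # Archie 4x to 8x faster
        print(f"~{price_ratio * speed_advantage:.0f}x better price:performance")
    # prints roughly 52x and 105x in the Archimedes' favour

Call it somewhere between 50x and 100x better value, even before you
argue about the exact speed ratio.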
We sold Apple kit, too. We had a Mac II. It wasn't even in the ring.
The IIx that replaced it was closer and the IIfx got competitive. :-)
It was about £15K, I think.
But the 486 closed the gap, the Pentium equalled contemporary ARMs,
the Pentium Pro edged ahead in some tasks and around the time that the
Pentium II came out, Acorn gave up and closed down its workstation
division. Great shame. The specialist chipset, graphics, sound and OS
were just too expensive to develop and build.
Doubly sad because on price:performance of just the CPU+RAM subsystem,
ARM still kicked Intel's butt. Or in performance:Watt, too, but that
was not an issue back then, of course; notebooks didn't really exist
yet, just clunky low-powered things like the GRiD or Mac Portable.
ARM is in the ascendant now because performance:Watt is becoming a
decisive factor, and the humble Raspberry Pi shows that it can still
score in price:performance too. But in raw performance, it hasn't got
a hope and I don't think it ever will again.
Or we could just wait and see. I'll keep doing it, and you keep
writing about it.
Well, that's the plan!
--
Liam Proven • Profile: http://lproven.livejournal.com/profile
Email: lproven at cix.co.uk • GMail/G+/Twitter/Flickr/Facebook: lproven
MSN: lproven at hotmail.com • Skype/AIM/Yahoo/LinkedIn: liamproven
Tel: +44 20-8685-0498 • Cell: +44 7939-087884