Mike Cheponis wrote:
[stuff deleted]
Wrong; what counts is that the computer can do the job it needs to do.
That is why you'll still find systems such as the PDP-8 and PDP-11 in
service. That is also why most people don't need anything more than a
68k-based Macintosh.
This is dead-wrong. You assume that "the job" remains static. But the
fact of the matter is, unless your computer is embedded in your microwave
or toaster, you'll want to run new and interesting s/w on it. And, in
general, that implies that you'll want a faster computer in 18 months.
*sighs* For a large dollar segment of the hardware marketplace this assertion
is simply not true. Finance applications in particular tend to run one and
only one thing on a machine or a small collection of machines; when you
run out of gas you buy a code-compatible but bigger version of what you
were previously running on and repeat, praying that your vendor won't pull
the plug on your instruction set architecture and thus nuke your only
viable upgrade path.
This may not make much sense at first, but two factors are at work. The first
is that these systems are mission critical and the people who use them are
generally risk averse. Tell a trader that his trade support system will be
down for five minutes and he'll throw you out his window; as far as he (and
his employer) are concerned, the world can change radically in that period of
time. The second factor is the ability of these firms to construct software,
even software that simply replicates existing functionality on a new platform.
That ability largely doesn't exist, which is why individual institutions can
and have spent as much as $1 billion (that's *one* bank) on internal systems
development and gotten *nothing* that worked as the return on their
investment. So against that backdrop a legacy architecture which scales by
installing larger and larger versions (or clusters, if the vendor supports
the notion) is quite common at the higher end of the marketplace.
[snip]
Then there is
the question of which costs less: keeping these systems
running, or converting to something new (this is both a hardware and a
software question). If the current system does everything you need, why
change? In some cases it makes sense to run old apps in an emulator on new
hardware.
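(An aside on the emulator approach, since it keeps coming up in these legacy
discussions: at its core it is nothing more than a fetch-decode-execute loop
that interprets the old instruction set in software on the new machine. The
sketch below uses a made-up three-opcode accumulator machine purely for
illustration; it is not modeled on any real architecture, and in real life
the hard part is faithfully emulating the I/O devices and OS environment,
not the CPU loop itself.)

    /* Minimal sketch of an instruction-set emulator: a hypothetical
       three-opcode accumulator machine, not any real architecture.   */
    #include <stdio.h>
    #include <stdint.h>

    enum { OP_HALT = 0, OP_LOADI = 1, OP_ADD = 2 };  /* made-up opcodes */

    int main(void)
    {
        /* "Legacy" program image: LOADI 2; ADD 40; HALT */
        uint8_t  mem[] = { OP_LOADI, 2, OP_ADD, 40, OP_HALT };
        uint32_t acc   = 0;   /* accumulator of the emulated CPU   */
        size_t   pc    = 0;   /* program counter into legacy code  */

        for (;;) {            /* classic fetch-decode-execute loop */
            uint8_t op = mem[pc++];
            switch (op) {
            case OP_LOADI: acc  = mem[pc++]; break;
            case OP_ADD:   acc += mem[pc++]; break;
            case OP_HALT:  printf("acc = %u\n", (unsigned)acc); return 0;
            default:       fprintf(stderr, "illegal opcode %u\n", op); return 1;
            }
        }
    }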
If you're running some accounting application or airline reservations system,
sure, keep it the same until its lifetime expires. But does it scale?
Can you load it more?
Of course.
[snip]
WAIT!!!!! WAIT!!!! I didn't say -anything- about
Alphas! I just said,
let me repeat a fact:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ F A C T ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
An Intel i486DX2/66 will run Dhrystone 2.1 2 to 3 times faster than a VAX 6500.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ F A C T ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
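(For anyone who hasn't looked at it, Dhrystone is a small synthetic integer
benchmark: record assignments, string copies and compares, and integer
arithmetic in a timed loop, reported as loops per second and often normalized
against the VAX-11/780's roughly 1757 Dhrystones/second to get "DMIPS". The
fragment below is just a toy in the same spirit so you can see the flavor of
the workload; it is not the Dhrystone 2.1 source, and it says nothing about
floating point, memory bandwidth, or I/O.)

    /* Toy benchmark in the spirit of Dhrystone: string and integer work
       in a timed loop.  NOT the real Dhrystone 2.1 code; a real run also
       needs care to keep the compiler from optimizing the loop away.    */
    #include <stdio.h>
    #include <string.h>
    #include <time.h>

    #define ITERATIONS 10000000L

    int main(void)
    {
        char src[32] = "DHRYSTONE PROGRAM, SOME STRING";
        char dst[32];
        long i, checksum = 0;

        clock_t start = clock();
        for (i = 0; i < ITERATIONS; i++) {
            strcpy(dst, src);                  /* string copy        */
            if (strcmp(dst, src) == 0)         /* string compare     */
                checksum += (i % 7) * 3 + 1;   /* integer arithmetic */
        }
        clock_t end = clock();

        double secs = (double)(end - start) / CLOCKS_PER_SEC;
        printf("checksum %ld, %.0f loops/sec\n", checksum, ITERATIONS / secs);
        return 0;
    }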
Actually, what people are reacting to is the following:
On Sun, 24 Oct 1999 14:24:54 -0700 Mike Cheponis wrote:
Hey, I'm not saying the original IBM PC was going
to outperform the VAX 6500;
but a modern PC will crush any VAX in any application, IMHO, with equivalent
h/w attached.
PC isn't synonymous with Intel; when this statement was made it opened the
door to a discussion of machine architecture designed to show that the above
assertion was untrue.
Cheers,
Chris
--
Chris Kennedy
chris(a)mainecoon.com
http://www.mainecoon.com
PGP fingerprint: 4E99 10B6 7253 B048 6685 6CBC 55E1 20A3 108D AB97