It was thus said that the Great Jules Richardson once stated:
> I think the problem there was that the software guys saw that people were
> buying lots of hardware because it was cheap, and bloated their code out to
> match :(
Well, it was in the late 60s/early 70s that the cost of hardware dropped
below the cost of software (or software development), and on top of that, the
speed of computers was doubling about every 18 months or so. There was a
study done several years ago whose conclusion was: if the program you were
writing was expected to take over 18 months or so to run, it was cheaper to
wait a year before running the program and have it finish six months later
(or that was the break-even point).
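The tradeoff can be sketched as a toy calculation. The doubling period and
the runtimes below are illustrative assumptions, not figures from the study:

```python
def total_months(runtime_now, wait, doubling=12.0):
    """Total elapsed months if you wait `wait` months for faster
    hardware, then run a job that takes `runtime_now` months today.
    Assumes machine speed doubles every `doubling` months."""
    speedup = 2 ** (wait / doubling)
    return wait + runtime_now / speedup

# With a 12-month doubling period (an assumption), a 24-month job
# finishes at the same time whether you run now or wait a year:
print(total_months(24, 0))       # 24.0 -- run immediately
print(total_months(24, 12))      # 24.0 -- wait 12, run in 12
# A longer job comes out ahead by waiting:
print(total_months(36, 12))      # 30.0 vs. 36.0 running now
```

The exact break-even point depends entirely on the assumed doubling rate,
which is why the study's figure was an "or so."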
I remember back in 1992 writing a program that took a solid year (running
on an SGI Personal IRIS 4D-35) to finish. Last year I took the code, fixed
it up some to run on a modern Linux system (a 4-core CPU system with gobs of
memory) and found that the same dataset took, I think, only two hours to
run (the code lent itself to parallelization quite nicely).
But yes, there is always pressure to get the software out "now" and
efficiency [1] be damned, because time to market is more important. Not
speed. Not security. Time to market. [2].
-spc (The trend now *is* using multiple computers, for both speed
and redundancy issues ... )
[1] The "efficiency" here is development dollars, not "execution
efficiency" or "memory efficiency".
[2]	"Nobody pays for bug fixes. Everybody pays for features."
	--Microsoft, paraphrased.