It was thus said that the Great dynasoar once stated:
> In fact, there are not many tasks in computing that I have found to
> REQUIRE a fast, modern PC with tons of memory and processor. Keep in mind
> that practically *all* of the tasks done on today's machines are exactly
> those that were done 15 years ago on the simpler machines.
But it depends upon what you are doing. While in theory you could
calculate e to 100,000 digits using an Apple ][, it might take upwards of a
week for the results, and you couldn't use the computer in the meantime,
whereas on modern machines, 100,000 digits could be generated in under an
hour, and with the right OS, you could still work on other things [1].
And if you consider that a pretty bogus example, what about rendering
(which I used to know as ray tracing)? Or photographic (or just graphic)
manipulations?
  Granted, there are plenty of things that can still be done on older
hardware; in fact, I used to write a weekly humor column on a CoCo in the
late '80s.
  What *has* changed IMHO are the skills that the average user now brings
to the interface. Point, click, drag, and drop require a lot less
dexterity, concentration, and skill than learning keyboard commands, or
than having to be competent with an operating system just to get it to do
what you want.
  What has also changed is the speed of computers and the amount of memory.
On the plus side, the increase in speed and memory has the potential to
let one manipulate more data in less time. Unfortunately, programs (and
operating systems) have bloated, which lessens the usefulness of fast CPUs
and vast memory.
  Don't get me wrong: I like the older computers, if only because of what
could be done with 1K, 4K, 16K, or 64K. They are also conceptually simpler
to understand, and possibly even repairable. Not so with today's computers,
full of proprietary VLSI chips that do everything.
[1] Steve Wozniak wrote a program to calculate e to 100,000 digits on an
    Apple ][ in highly optimized assembly that carefully mapped the
    problem onto the 6502, and it still took 4 days. A year or so ago I
    wrote a version in C (based upon the 6502 code and his explanation of
    it [2]), ran it on a 386-33MHz running Linux, and had an answer in
    under an hour; a rough sketch of the idea appears below. My, how time
    progresses 8-)
[2] Byte, December 1981.
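
  For the curious, the series involved is just e = 1 + 1/1! + 1/2! + ...
My C port is long gone, and this is not Woz's algorithm nor my original
code, but a minimal sketch of the same series, written as a simple decimal
spigot in portable C, might look like the following. The digit count and
the guard-term padding are arbitrary choices of mine:

#include <stdio.h>

#define DIGITS 100              /* decimal digits of e to print       */
#define TERMS  (DIGITS + 10)    /* series terms; extras act as guard  */

int main(void)
{
    static int a[TERMS];        /* mixed-radix digits of e - 2;       */
    int i, j, q;                /* a[i] carries weight 1/(i+2)!       */

    for (i = 0; i < TERMS; i++) /* e - 2 = sum of 1/(i+2)!, so the    */
        a[i] = 1;               /* representation is all ones         */

    printf("e = 2.");
    for (j = 0; j < DIGITS; j++)
    {
        q = 0;
        for (i = TERMS - 1; i >= 0; i--)
        {
            a[i] = a[i] * 10 + q;   /* multiply the fraction by 10    */
            q    = a[i] / (i + 2);  /* carry toward the big end       */
            a[i] = a[i] % (i + 2);
        }
        putchar('0' + q);           /* final carry = next digit       */
    }
    putchar('\n');
    return 0;
}

Compile it with any C compiler and it prints e = 2.7182818284... The
inner loop is O(n^2) in the digit count, which is roughly the work the
Apple ][ was grinding through for four days; on anything modern, 100
digits is instantaneous, and even 100,000 is a coffee break at worst.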