It was thus said that the Great Sam Ismail once stated:
On Fri, 16 May 1997, Jeff Kaneko wrote:
Hrrumph!
It's amazing that the _Journal_ still exists at all! Its (original)
philosophy is the complete antithesis of current industry practice.
Used to be, if you could write small, fast code, not only were you
good, you survived!
Nowadays, it's "A couple of meg here, a couple of meg there,
pretty soon we're talking about real memory usage".
It's totally fricken pathetic, isn't it? I remember when whole operating
systems resided in less than 64K. Now, you need 64 megs for all the
pretty GUI shit and sound clips. Totally pathetic.
But you can do more with more memory, in less time. Now, it's easier to
run, for instance, statistical analysis on large amounts of text and use
that to generate interesting, if somewhat bizarre, text. Or edit large
images, or even do sound manipulation.
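The usual trick for that kind of statistical text mangling is a Markov chain, a la Usenet's old "Dissociated Press" hack: tally which words follow each word pair in a corpus, then walk the table at random. A minimal sketch (the function names and the toy corpus are mine, just for illustration):

```python
import random
from collections import defaultdict

def build_chain(words, order=2):
    # Map each run of `order` words to the words seen to follow it.
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20):
    # Start from a random key, then repeatedly pick a random follower.
    key = random.choice(list(chain))
    out = list(key)
    for _ in range(length - len(key)):
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

text = "the cat sat on the mat and the cat ate the rat on the mat"
print(generate(build_chain(text.split())))
```

With a big enough corpus the output is locally grammatical but globally bizarre, which is exactly the charm.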
I remember in 1985 or '86, programming my CoCo (with 64K of RAM - upgraded
myself from 16K) to do a 6-bit sound sample. I could get a recognizable
sample of 30 seconds (each sample took 32K), and only recognizable because I
knew what to listen for. A decent sample lasted only 2 or 3 seconds.
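As a rough sanity check on those numbers (assuming, as was typical with the CoCo's 6-bit DAC, one sample stored per byte - my assumption, not stated above), 32K over 30 seconds works out to about 1.1 kHz, which is why it was barely recognizable:

```python
BUFFER_BYTES = 32 * 1024   # "each sample took 32k"
SECONDS = 30               # length of the barely-recognizable sample

# One 6-bit sample per byte (assumed), so rate = bytes / seconds.
rate_hz = BUFFER_BYTES / SECONDS
print(f"about {rate_hz:.0f} samples/sec")
```

For comparison, telephone-quality audio runs at 8000 samples/sec, so roughly an eighth of that rate would sound muffled at best.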
Still though, that doesn't excuse the bloat of today's operating systems.
I just wish that operating systems could still reside in under 64K
(actually, the kernel for QNX on a Pentium weighs in at under 10K).
-spc (Sigh)