On 19 Mar 2007 at 12:17, Brian L. Stuart wrote:
> True in the way Moore's Law is usually thought of. But when
> he stated his "law," Moore wasn't talking about speed; he
> was talking about the number of transistors.
No argument there, but regardless of how you might want to slice it,
net bandwidth increases still don't correlate with anything like a
Moore's Law rate.
> There is one class of user that's large enough and willing
> enough to fork out dollars that there's still a point in pushing
> performance as much as we can. That's the gamers, the
> guys who keep moving toward a day when their game is
> so immersive and so realistic that they can't always tell
> whether they're in the game or the real world. And it turns
> out that some of the things we need to do in those games
> do parallelize nicely. So I expect we will continue to see
> growing overkill of system performance for uses like e-mail
> and browsing. But then the people who took the web and
> twisted it from being a solid information-distribution world
> into an entertainment medium will find ways to use those
> cycles.
I don't disagree with you there, but observe that the tendency in
gaming has been more toward dedicated-use boxes such as the PS3
and the Xbox 360. It's simply more cost-effective to leave out all
of the extra stuff that folks have come to expect from a
desktop/laptop PC and repackage the result as a game console. That
gaming will push the processor-capability envelope is pretty
obvious, but how much of that will make its way to the desktop or
laptop is a matter of conjecture.
If a major player were to launch a successful effort to get gigabit
Ethernet into everyone's home at an affordable cost, I'd readily
agree that the whole matter of "how much computer is enough" is
subject to drastic revision.
Okay, enough OT rambling!
Cheers,
Chuck