I think the biggest change is that our compute resources stopped getting
faster in terms of raw cycles per second and started going wider in terms
of parallelism. It's now commonplace for me to run workloads that can
actually use many CPU cores, and I'm starting to occasionally run workloads
that are so parallel that a GPU is a more suitable resource. Alongside the
surge in parallelism, there's also a focus on going greener. I think the
last couple of years have been particularly transformative.
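
To illustrate what I mean by "wider": here's a minimal sketch of the kind
of embarrassingly parallel job I'm talking about, the same CPU-bound task
run serially and then fanned out across every core. The summing-squares
workload is just a made-up stand-in, not any particular real application.

    # Sketch: the same CPU-bound work done serially, then spread
    # across all available cores with a process pool.
    import multiprocessing as mp
    import time

    def burn(n):
        # CPU-bound stand-in for a real per-chunk computation.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        # One chunk of work per core.
        chunks = [2_000_000] * mp.cpu_count()

        t0 = time.perf_counter()
        serial = [burn(n) for n in chunks]
        t1 = time.perf_counter()

        with mp.Pool() as pool:  # one worker per core by default
            parallel = pool.map(burn, chunks)
        t2 = time.perf_counter()

        assert serial == parallel
        print(f"serial:   {t1 - t0:.2f}s")
        print(f"parallel: {t2 - t1:.2f}s on {mp.cpu_count()} cores")

On a machine with many cores the second timing drops roughly in
proportion to the core count, which is the whole "going wide" story.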
Scott
On Sun, Jun 9, 2024 at 4:31 PM ben via cctalk <cctalk(a)classiccmp.org> wrote:
On 2024-06-09 11:01 a.m., Chuck Guzis via cctalk wrote:
On 6/9/24 08:40, Murray McCullough via cctalk wrote:
> Intel introduced to the world the x86 processor: the CISC technology
> still with us.
So what has changed other than speed and upward development?
The Internet?
Really, it's always been my view that progress in computing technology
has long been driven by communication. Without communication, the
microprocessor would largely be limited to commercial (e.g., CAD,
finance, accounting) and a few niche applications. Many of those
segments would be just fine with older technology.

To put it another way, what use would most people find for a PC that
was limited to 300 bps modem communication?
--Chuck
More reliable, for one. Can you trust a CLOUD?
How soon will your data be corrupted, or locked behind a paywall?