> From: Dwight Kelvey

> The RIS[C]/CISC is really not even relevant in today's processors since
> the main limiting factor is memory access bandwidth and effective use of
> caches.
Memory bandwidth has often been the limiting factor over the whole history of
CPUs/systems. (It would be interesting to draw up a timeline showing the
periods when it was, and was not.) Yes, caches can help a lot, but inevitably
they will miss (more or less often, depending on the application).
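Here's a rough illustration of that point (my sketch, not anything from the
original discussion): the little C program below touches the same number of
array elements twice, once sequentially and once with a large stride that
defeats the cache; the array size and stride are arbitrary choices, but on
most machines the second walk is several times slower, because nearly every
access has to go all the way out to main memory.

    /*
     * Sketch of the cache-miss effect: both walks touch the same number
     * of array elements, but the large-stride one defeats the cache.
     * Sizes and strides are arbitrary assumptions.
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N (64u * 1024 * 1024)   /* 64M ints = 256 MB, >> any cache */

    static double walk(const int *a, size_t stride)
    {
        volatile long sum = 0;
        clock_t t0 = clock();
        /* touch all N elements, in stride-sized hops */
        for (size_t s = 0; s < stride; s++)
            for (size_t i = s; i < N; i += stride)
                sum += a[i];
        return (double)(clock() - t0) / CLOCKS_PER_SEC;
    }

    int main(void)
    {
        int *a = malloc((size_t)N * sizeof *a);
        if (!a) return 1;
        for (size_t i = 0; i < N; i++)   /* fault the pages in first */
            a[i] = (int)i;
        printf("sequential walk:  %.2f s\n", walk(a, 1));
        printf("stride-4096 walk: %.2f s\n", walk(a, 4096));
        free(a);
        return 0;
    }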
The RISC/CISC thing actually is somewhat relevant to the memory bandwidth
issue, because RISC focuses on making the CPU's cycle time as short as
possible, which more or less implies simpler instructions, and therefore more
instructions to get a particular task done.
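To make that concrete (a made-up example, with generic mnemonics rather than
any real architecture's): the single C statement below can be one
memory-to-memory instruction on a CISC-style machine, but takes several
simple instructions on a load/store (RISC-style) one.

    long total;                        /* a variable that lives in memory */

    void add_to_total(const long *x)
    {
        total += *x;
        /*
         * CISC-style - one instruction, memory operands:
         *     ADD   (x), total
         *
         * RISC-style (load/store) - several simple instructions:
         *     LOAD  r1, (x)
         *     LOAD  r2, total
         *     ADD   r2, r2, r1
         *     STORE r2, total
         */
    }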
That was part of the motivation for microcoding, back when it was invented: at
that point, logic was fast and memories were slow, so more complex
instructions made better use of memory bandwidth - especially since this was
before caches. (It also made binary code 'denser', which was important back
then, with much smaller memories.) However, more complex instruction sets made
the CPU more complicated; microcoding helped deal with that.
The 801's breakthrough, at a very high level, was to look at the whole system,
and try to optimize across the compiler as well as the instruction set, etc,
etc. They also realized that people had been going CISCy for so long that they
had to some degree forgotten why, and that the underlying assumption needed to
be re-examined - especially in light of the then-current logic/memory speed
balance, which had by that point shifted in memory's favor.
Noel