It's interesting, and probably indicative of some mindset, that whenever
the evolution of a given architecture comes up, it's the specific
technical aspects that are most often mentioned, even though most of
those are holdovers from the 1960s, just made smaller.
My take is: why were these advancements necessary? In other words, what
parallel non-CPU, societal developments caused the shift in thinking?
I recall that when the 8086 was announced by Intel, it wasn't the first
16-bit CPU by a long shot, nor was Intel doing a hard sell on it. 8-bit
still reigned supreme in the personal computer, and the prospect of a
64KB 16-bit system costing considerably more than a similarly performing
8-bit system wasn't particularly attractive.
What was the catalyst?
My .02 on the matter, FWIW.
--Chuck