On Mon, 3 Jun 2013, Fred Cisin wrote:
> 80386, oddly, is far from "long dead".
On Mon, 3 Jun 2013, Liam Proven wrote:
It is in any market I *ever* see.
depends on your definition(s) of "long"
Is this Pentium running XP not really a 386 at heart?
Seems like there are quite a few around.
When the 386 first came out, it was treated as a fast 286, without using
its full capabilities.
BUT, eventually, 386 became the standard.
When the 486 and Pentium came out, they were treated as fast 386s, and seemed
to stay that way for a very long time. Isn't that what "i386" means?
Which versions of Windoze will not install on a 386? (never mind issues
of "adequate performance"!) (386-SX is out just due to its 16M RAM limit)
It's not just Linux vs. Microsoft on update philosophies.
Intel tried to maintain 4004 compatibility. Every new chip was just
kludged patches to the previous one. That meant that they could come out
with FREQUENT revisions and "new" chips. The "new" one could replace the
old one. Admittedly, not necessarily plug-in. Some would require
modifications to the old board design, but never relearning from scratch.
Software might require updates to handle new features, but never a
rewrite from scratch. Windoze Blue-Screen is still CP/M, with stuff
wrapped around it to hide it. The BDOS is bloated, and the CCP is . . .
Each chip has "historical" oddities inherited from its predecessors, and
"could have been" a lot better if they had cut the cord.
At IBM, the 5170 was little more than an update to the 5160, and all PCs
since have continued the "minimal change to get maximum compatibility"
philosophy. There are changes. ISA, MCA, EISA, PCI, USB, ATX, but the
heart is unchanged.
That's what bothers me about the x86 architecture... it becomes such a kludgy
mess due to tacking it all on. Intel EM64T: it's amd64, so 64-bit tacked onto
32-bit x86, which is tacked onto a 16-bit architecture, which is tacked onto
an 8-bit architecture, and so on...
OTOH, Motorola took the opposite approach. When they created the 6809,
they designed from scratch, to build "THE BEST 8 bit processor"
(arguably succeeding), and nothing previous was compatible. All board
designs and ALL software, including OS, had to be done from scratch.
It was definitely NOT "too little"!, but it may have been "too late"; the
Z80 was so firmly established, along with CP/M, that they had difficulty
breaking into that market. From what I understand, they handed the CoCo
design to Radio Shack, and that was the only "major" entrant into that
market.
On the 68000, they abandoned all chance of ANY compatibility, to build
"THE BEST 16 bit processor" (arguably succeeding). Not "too little", but
almost "too late". x86 and MS-DOS dominated the market. Fortunately,
Jobs' "clean room" design team for the Lisa had no "real-world"
experience, and no clue to even consider working from old designs, and
using any old software. That was one of Jobs' major goals. After the
fiasco with the Apple /// that put Apple on the rocks, he was determined
that the new design would not be tainted by ANY knowledge of anything
previous. So, they picked a processor from spec sheets, and the 68000 was
the clear winner. They designed from scratch. When it came to software,
they also had to design from scratch; there did not exist any 68000-compatible,
nor even easily portable, code. They didn't even know to
CONSIDER compatibility. Even the drives had to be different, with an
extra slot to make it easier to put thumbprints on the media.
The "Maserati of the mind" was a design team dream. priced accordingly,
and with no place to carry a bag of groceries.
Fortunately, (after Jobs was out), the Mac was designed as a newer version
of a Lisa (cutting the corners necessary to get it into the price
curve, albeit near the peak - does Apple still hold all records for
highest manufacturing mark-up?)
Nope. I think that's Monster Cable and similar. $1 cable sold for $50?
;)
It at least costs /something/ to make an Apple product.
Motorola got smart with the PowerPC and provided superb emulation to
maintain full compatibility. Why did the promised "Intel emulation" never
catch on?
No idea. That could've been neat.
So, . . .
you can be constrained by the past, but never have to start over.
Or, you can start over, and not have the inherited limitations.
Certainly a middle ground is POSSIBLE, but, we tend towards extremes.
--
Grumpy Ol' Fred     cisin at xenosoft.com
--
Cory Smelosky
http://gewt.net/ Personal stuff
http://gimme-sympathy.org Experiments