Hi,
> You forgot the 432, which was very revolutionary in design and
> contemporaneous with the 8086. Which shows you how far dramatic
> innovation will get you.
Didn't forget....just not aware of, LOL.
> No apologist for Intel, I think that context is important here. At
> the time that the 8086 (and 8088) was deployed, there was already a
> large body of x80 software....
Actually, the book I read around 1982 about programming the 8086 (which is
the only assembly language book I've ever disposed of BTW) went to great
lengths to explain why the 8086 is, essentially, a 16-bit 8080.
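From (admittedly hazy) memory, the idea was a near-mechanical register
mapping: A -> AL, BC -> CX, DE -> DX, HL -> BX (BX being the 8086 register
that can address memory), SP -> SP. Something like this, with the exact
syntax illustrative rather than gospel:

    8080 source:        near-mechanical 8086 equivalent:
      LXI  H,1234h        MOV  BX,1234h     ; HL -> BX
      MOV  A,M            MOV  AL,[BX]      ; A -> AL, (HL) -> [BX]
      ADI  10h            ADD  AL,10h       ; the flags map across too
      STA  2000h          MOV  [2000h],AL
      JNZ  again          JNZ  again

ISTR Intel even shipped a source-level translator (CONV86) that did more or
less exactly this, which tells you how deliberate the 8080 compatibility was.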
Whilst I fully agree that the reasons for the way the 8086 was designed make
excellent BUSINESS sense, in terms of real technological progress it was,
IMO, a very definite side-step. :-(
But my point is, all of the Intel processors I've had direct dealings with
(from a software/hardware design point of view) have been very backward in
their overall architecture; though that said, the 8051 family are damn good
at what they were designed for (if somewhat rough around the edges).
> I really was a booster for the 68K--and programmed for it. But no
> one ever represented that there was a simple and straightforward way
> to translate x80 assembly to 68K code....
I also did a *LOT* of 68k assembly language programming back in my ST days;
it was a lot of fun.
With the exception of 8080 -> 8086, I can't offhand think of any examples
where converting source code from one architecture to another would be
simple and straightforward.
> ....nor was it clear if it was going to be simple to use x80
> peripherals with the 68K.
True, but couldn't 6800 family peripheral chips be made to work with the 68k
fairly easily? If you're going to have to re-write the code anyway, driving
different peripheral chips shouldn't be too difficult (unless you're using
Z80 peripherals with vectored interrupts, DMA etc).
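In fact the 68000 has the VPA/VMA/E-clock interface precisely so that
6800-family parts can be bolted on, and from the software side they're just
memory-mapped bytes. A polled-output sketch for an MC6850 ACIA (register
addresses invented for the example; the ST used this very chip for its
keyboard and MIDI ports):

    * Hypothetical MC6850 ACIA on the 68000's 6800-style bus.
    ACIA_CTL   EQU  $FF8000           control/status register
    ACIA_DATA  EQU  $FF8002           data register

    acia_init: MOVE.B  #$03,ACIA_CTL  master reset
               MOVE.B  #$15,ACIA_CTL  /16 clock, 8N1, RTS low
               RTS

    * Transmit the byte in D0, polled.
    acia_putc: BTST    #1,ACIA_CTL    TDRE (status bit 1) set yet?
               BEQ.S   acia_putc      no - transmitter still busy
               MOVE.B  D0,ACIA_DATA   yes - send it
               RTS

Compare that with trying to reproduce Z80 mode 2 vectored interrupts, which
is exactly where it would have got messy.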
> Zilog had trotted out the Z8000 at about the same time, but it
> wasn't clear if they were all that serious.
IKWYM. I don't recall much about the Z8000 now, but I did read the data
sheets for it somewhere around 1984 (1983?).
About all I remember is that it was a strange, illogical design. And that my
impression was that it wouldn't catch on (ironically, mainly because of the
68000).
> So you saw a dichotomy--existing CP/M applications that were run on
> the Kaypros and Osbornes made it into the 8086 world....
Quite. When I built my first PC in 1990, I ran the MS-DOS version of
WordStar until I finally gave in and switched to Win95 around the end of '96.
In fact, ISTR using it regularly on my ST in the late '80s via a PC
emulator.
Fun times.... :-)
> I don't really blame the guys in Boca Raton for choosing the 8088.
> There was software for it, but you keep costs down with an 8-bit bus
> and use commonly-available peripheral chips. Had the 68K been chosen
> (and it was a strong rumor, particularly after the IBM 68K-based lab
> computer was announced before the 5150), it might never have been as
> successful.
Actually, I remember reading an interview in an issue of Byte some years
ago, where the leader of the PC design project was quoted as saying that his
original choice for the processor was the 68000.
However, they were forced to switch to the 8088 because one of Motorola's
two 68k fabrication facilities had gone bad, and Motorola couldn't guarantee
delivery dates for the processors. And since they were also very pushed for
time, and had experience of the Intel architecture from a previous
project....
I'd like to think that if the 68k had been chosen we'd have had far more
advanced PCs far quicker (I'm thinking back to the many 68k implementations
of UNIX amongst other things). But I do take your point about the
availability of software; let's face it, the *ONLY* reason I switched from
my Atari ST to the PC was software.
Although....if we'd had 68k/UNIX-based boxes on our desktops instead of
8086/DOS, who's to say there wouldn't have been a similar explosion of
software?
TTFN - Pete.