My work has been using structural models, at the gate level, in VHDL
(Verilog would be fine, too, of course). Individual components (for
example, a piece of an IBM SMS card or, in my current case, gates made
available to student engineers that were actually individual
gates/chunks of DTL chips) each get a small behavioral model. As I
mentioned, so far what I have done is reproduce and test a 12 bit
computer designed in an electrical engineering course on logic/computer
design. In August I plan on publishing my experience on a website.
I would note that I also see value in the behavioral approach, which
really would be considerably more detailed than what you get from SimH.
The IBM 1410 cycle-level simulator I have written is closer to what one
might get from a behavioral model, but even that is not quite so detailed.
Using the structural / gate-level techniques, one does run into some
issues, most of which have (or will probably have) solutions:
1) R/S latches composed of gates in a combinatorial loop. This causes
several problems, including the latch getting folded into the lookup
tables for gates which use the signal, and issues when one brings such a
signal out to an I/O pin to feed to a logic analyzer, which can cause
problems to appear and disappear. My experience is that one can add a D
flip flop after the R/S latch (see the sketch after this list). This
typically works because at 50 MHz it adds only 20 ns of delay, which is
comparable to the gate delays these old machines typically had.
2) One-shots. I haven't had to address this one yet, but I am sure
that I will. I expect that one can simply use a counter to handle it
(also shown in the sketch below) - no big deal at all.
3) Flip flops which are clocked from combinatorial signals. These tend
to cause timing/glitch issues. For example, in one case the
combinatorial output was a zero-check on a counter. Since the counter
flip flops did not all change at exactly the same time, that signal
could glitch during the simulated machine's master clock edge. These
cases respond well to the same general solution as #1 - stick a D flip
flop between the combinatorial output and the clock input. In the case I
mentioned, that gave the signal an entire 50 MHz clock period to settle
down.
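For what it's worth, here is roughly what the fixes for #1/#3 and #2
look like in VHDL. This is a minimal sketch with invented entity and
signal names and an arbitrary pulse width - not code lifted from my
project - and it assumes the FPGA board's 50 MHz clock:

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity glue_fixes is
  port (clk_50mhz     : in  std_logic;  -- FPGA board clock
        rs_latch_out  : in  std_logic;  -- raw output of a gate-built R/S latch
        rs_latch_sync : out std_logic;  -- what downstream logic actually sees
        trigger       : in  std_logic;  -- would have fired a one-shot originally
        one_shot_out  : out std_logic);
end entity glue_fixes;

architecture rtl of glue_fixes is
  signal count : unsigned(7 downto 0) := (others => '0');
begin

  -- Issues 1 and 3: re-register the combinatorial output so downstream
  -- logic (including anything that uses it as a clock) sees a clean,
  -- settled signal.  At 50 MHz this costs one 20 ns clock period,
  -- comparable to the original machine's gate delays.
  process (clk_50mhz)
  begin
    if rising_edge(clk_50mhz) then
      rs_latch_sync <= rs_latch_out;
    end if;
  end process;

  -- Issue 2: replace a one-shot with a counter that holds its output
  -- for a fixed number of 20 ns clocks (the count of 100, about 2 us,
  -- is arbitrary here).
  process (clk_50mhz)
  begin
    if rising_edge(clk_50mhz) then
      if trigger = '1' and count = 0 then
        count <= to_unsigned(100, count'length);
      elsif count /= 0 then
        count <= count - 1;
      end if;
    end if;
  end process;

  one_shot_out <= '1' when count /= 0 else '0';

end architecture rtl;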
And of course, getting the detailed information one needs to develop
such a model can be a challenge. Fortunately for the older IBM
machines, IBM produced ALDs - Automated Logic Diagrams - which I hope
will generally have enough information.
My experience on FPGA forums during the development of my 12 bit
computer implementation was mixed. I got some helpful comments, but the
majority of folks were not helpful, and instead preferred to bash me for
not redoing the entire machine design using FPGAs the way these
particular folks felt was "the only right way" to use them. Bah.
JRJ
On 7/14/2015 2:58 AM, Dave G4UGM wrote:

-----Original Message-----
From: cctalk [mailto:cctalk-bounces at classiccmp.org] On Behalf Of Paul Koning
Sent: 13 July 2015 17:03
To: General Discussion: On-Topic Posts
Subject: Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

On Jul 13, 2015, at 8:35 AM, Jay Jaeger <cube1 at charter.net> wrote:
Another alternative would be to build a machine up from a Field
Programmable Gate Array (e.g., the Digilent Nexys2 FPGA development
board). I recently completed an effort doing that for a 12 bit
machine we designed and built in a logic/computer design class from
racks of logic interconnected using IBM unit record plug boards in 1972.
I am going to attempt to do the same for IBM's 1410 computer - a
really big effort.
That's been done for all sorts of machines, of course; the PDP-11 comes to
mind.
One question would be what design approach you're using. A behavioral
model is one option; that's roughly SIMH in an FPGA. And just like SIMH, the
model is only as accurate as your knowledge of the obscure details of the
original design. Depending on the quality of available manuals, this accuracy
may be rather low. (For example, building a PDP-11 model if all you have is a
Processor Handbook may not be accurate enough.)
A different approach is to reproduce the actual logic design. FPGAs can be
fed gate level models, though that's not the most common practice as I
understand it. But if you have access to that level of original design data, the
result can be quite accurate.
I've done a partial gate level model of the CDC 6600, working from the wiring
lists and module schematics. It accurately reproduces (and explains) quite
obscure properties of the peripheral processors, things that aren't
documented anywhere I know of other than in programmer lore. It also
yields a large model that simulates very slowly...
paul
I think there are several options for the degree of authenticity with FPGA
re-implementations. At the simplest of levels, my Baby Baby runs at the same speed as the
full sized Baby, but it currently uses 32 bit parallel logic in many places, as I built a
32 bit wide store and it keeps much of the HDL "code" simple. I do intend to
try a full serial machine at some point, but it's low on my list. I have only really used
the Xilinx ISE in anger, but I note it is possible to see the gates generated
from the HDL or simulated HDL from a gate level diagram. I believe you can also mix and
match gates and HDL (I have not tried, too many other things to do.)
My next project is likely to be the Ferranti Pegasus which is several orders of magnitude
more complex than the Baby and will need a proper plan.
Dave