Since when are one-shots a problem? They have
their place.
Thinking back more: most timing was asynchronous and needed
a lot of set/reset flip-flops and delay buffers to keep things in sync.
It is the tricks to save a gate or two that are the problem.
It is not "the IBM way". Back in the 1960s and 1970s, IBM strongly tended
to have everything clocked, sometimes using rather impressive circuitry.
The clock generator of an S/3 is astoundingly complex, and gets around
some of the uncertainty involved with one-shots and delays.
I suspect a hard-coded sequencer.
No, it was a real microcode machine. It could run a set of complicated
diagnostics on a disk, without bothering the channel. Real front panel, too.
William Donzelli
aw288(a)osfn.org