Chuck Guzis wrote:
At one time async logic was a hot topic.
The IAS machine (von Neumann/late 1940s) is listed in various places (under
'clock rate') as being 'async'. (And - annoyingly - those listings then don't
provide an effective instruction rate for the sake of comparison.)
I've been curious about precisely how the timing was accomplished in that
machine (and those like it). Offhand, I suspect you still end up with delay
elements at various points in the design to ensure that some group of
signals/paths is ready/stable by a worst-case time, so you end up with a
more-or-less 'effective clock rate' anyway and don't gain much.
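(To make that "effective clock rate" point concrete, here's a toy Python
sketch - purely my own illustration, not anything from the IAS design
documents - of a bundled-data style stage: the real data path finishes at a
variable time, but the control only sees a "done" from a matched delay
element sized for the worst-case path, so the spacing you can actually count
on is the worst-case figure.)

# Toy model (my assumptions, not the IAS timing chain): a bundled-data
# async stage whose completion is signalled by a matched delay element
# sized for the worst-case logic path.
import random

WORST_CASE_DELAY = 10.0   # matched delay element, arbitrary time units
random.seed(1)

def logic_delay():
    # Actual data-path delay for one operation: usually faster than worst case.
    return random.uniform(3.0, 10.0)

def run(ops=10_000):
    data_time = 0.0   # when the data is really valid (invisible to the control)
    done_time = 0.0   # when the matched-delay 'done' signal fires
    for _ in range(ops):
        data_time += logic_delay()
        done_time += WORST_CASE_DELAY   # control must wait out the delay element
    print(f"average real path delay : {data_time / ops:.2f}")
    print(f"delay-element spacing   : {done_time / ops:.2f}  <- the 'effective clock'")

if __name__ == "__main__":
    run()

A genuinely self-timed design (per-operation completion detection rather than
a fixed delay element) would approach the "average real path delay" figure
instead, which I gather is where the supposed win was meant to come from.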
It all started with ENIAC, didn't it? Based on what I've been able to
find/read, ENIAC could be described as an async design.
Was async still being discussed in the 60's?