On 2011 Jan 23, at 5:50 PM, Evan Koblentz wrote:
> I drew up this years ago when I was reading more about these issues:
> http://www3.telus.net/~bhilpert/tmp/conceptsmachines.gif
> although I don't think I was the first to arrive at such a diagram.
> Of course there are various improvements that could be made. And I
> suppose there may be arguments about the arrow from the ABC to ENIAC.
> I admit that your ABC page is over my head, technical-wise. So, I
> asked Bill M. for his opinion. Here is what he said (with permission
> for me to quote it here):
> ----------------------------------
> The [Atanasoff] machine had several operations, which were initiated
> by pressing the appropriate button. Two of them were decimal input
> and output, and just one that did arithmetic.
This is not correct. The decimal input and output operations did
binary <-> decimal conversion. These involved a loop with repeated
full-register additions and subtractions (arithmetic) and sensing of
the data state (data-sensitive) along the way. This is shown in the
flowchart. The other operation was pair-elimination, which was a
similar loop.
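
To make that loop concrete, here is a minimal Python sketch of the
idea, using nothing but full-register subtraction, a restoring
addition, and a sign test. The function name and structure are my
illustration only, not a model of the ABC's actual drum-and-timing
logic:

    # Binary -> decimal conversion by repeated subtraction, built on
    # the primitives the ABC's conversion loop had: full-register
    # add/subtract and a sign test.
    def binary_to_decimal_digits(value, num_digits):
        digits = []
        for power in range(num_digits - 1, -1, -1):
            place = 10 ** power
            count = 0
            while True:
                value -= place        # full-register subtraction
                if value < 0:         # sign test: overshot this digit
                    value += place    # restoring addition
                    break
                count += 1
            digits.append(count)
        return digits

    print(binary_to_decimal_digits(0b1100100, 3))   # [1, 0, 0], i.e. 100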
> Calling the arithmetic operation "programmable" and assigning it an
> adjective like "data-sensitive" is a stretch; if you use those terms
> so loosely, you will have to say a mechanical adding machine also has
> those properties.
To be accurate, I did not call it or claim it to be programmable, and
I too would be reluctant to do so. I said there was some flexibility
in the internal programs, and I agree it was limited. The jumpers
mentioned below actually select which register the sign and zero
detection would be performed on in the loop.
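
For readers unfamiliar with the elimination loop, here is a rough
Python sketch of what pair-elimination amounts to, under my reading of
the ABC literature: one coefficient is driven to zero between two
equations using only whole-row additions and subtractions of shifted
copies of one row, steered by a sign test. The function is
hypothetical and simplified (floating point, no drum), not the
machine's exact binary procedure:

    # Non-restoring-style pair elimination: overshoot with whole-row
    # add/subtract, then shift down and correct, until the target
    # coefficient is driven to (approximately) zero.
    def eliminate_pair(row_a, row_b, col, steps=48):
        shifted = list(row_a)
        while abs(shifted[col]) < abs(row_b[col]):
            shifted = [2.0 * x for x in shifted]    # scale up to dominate
        for _ in range(steps):
            if (row_b[col] >= 0) == (shifted[col] >= 0):
                row_b = [b - s for b, s in zip(row_b, shifted)]  # subtract
            else:
                row_b = [b + s for b, s in zip(row_b, shifted)]  # add
            shifted = [x / 2.0 for x in shifted]    # shift down one place
        return row_b

    # Eliminate x between 2x + y = 5 and 3x + 4y = 6:
    print(eliminate_pair([2.0, 1.0, 5.0], [3.0, 4.0, 6.0], col=0))
    # -> approximately [0.0, 2.5, -1.5]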
Dismissing the points by equating them with mechanical adding machines
is flippant. No, mechanical adding machines do not have the properties
mentioned; however, mechanical machines with built-in multiplication
and division were complex machines. More below about M & D. Sometimes
one has to look at underlying concepts rather than common
surface-level perceptions.
> There were 30 separate adders and 30 pairs of coefs. The user had to
> select which pair was to be eliminated. This selection happened by
> the human moving a jumper cable from one set of holes to another.
> Selecting one of the 30 for final output happened in a similar way, I
> believe. The I/O for cards happened in groups of 5 and had real
> switches.
>
> But except for selecting an input, there are no "programmable
> options" on the arithmetic. There was a switch you could set, so that
> it would either start with a subtract or start with an add. (Once it
> started it would flip back and forth.) The user has to know which way
> to set this by examining the signs of the coefs. I wouldn't call this
> a programmable option as much as a hazard. If you set it the wrong
> way, the machine runs forever and never finishes. (Well, not forever;
> it will definitely finish in 2^50 seconds. 35M years.)
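
As an aside, the quoted worst case is easy to check (assuming a
365.25-day year):

    # Sanity check on the figure above: 2^50 seconds in years.
    seconds = 2 ** 50
    years = seconds / (365.25 * 24 * 3600)
    print(f"{years / 1e6:.1f} million years")   # ~35.7 million years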
> --
> As to the other part, data sensitivity, this is somewhat true. The
> machine is doing a "long" division by repeated adding and
> subtracting. In the end, the result is just a simple math operation,
> but the length of time it takes is dependent on the actual numbers
> you plug in. It is literally just testing if the sign bit flipped,
> and throwing a relay each time it does. I guess you could say every
> multiplier is data dependent.
This is largely correct, but dismissing the ABC on the above is again
flippant. Multiplication and division have been implemented as small
programs throughout the history of computers, whether in a hardware
state machine, in microcode, or (frequently) as instruction-level
programs, since many machines do not provide those ops as
instructions. Various 'real computers' have gotten by with little or
nothing more than sign- and/or zero-detection for
data-sensitivity/conditional operation, just as the ABC provides.
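
To illustrate the point, here is division written as a small program
against exactly those primitives: add, subtract, and a sign test. A
deliberately naive Python sketch (real routines shift for speed), with
a hypothetical function name; the data-dependence Bill describes falls
straight out of it:

    # Integer division using only add/subtract and a sign test. The
    # number of loop iterations depends on the operands -- the "data
    # sensitivity" discussed above.
    def divide(dividend, divisor):
        assert dividend >= 0 and divisor > 0
        quotient = 0
        while True:
            dividend -= divisor       # subtract
            if dividend < 0:          # sign flipped: overshot
                dividend += divisor   # restore the remainder
                return quotient, dividend
            quotient += 1

    print(divide(35, 4))   # (8, 3) after 9 iterations
    print(divide(35, 1))   # (35, 0) after 36 -- time depends on the data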
One of the first three programs to run on the Manchester Baby (the
first stored-program machine) was Turing's long division routine. The
other two were factoring routines and no more complex. Routines like
this were enough to exercise the machine and show it could do
practical work.
The relative scale of the projects should also be kept in mind: ENIAC
was a large, well-funded project; the ABC was a small two-man project.
Despite its small size, the concepts embodied and implemented in the
ABC were significant. ENIAC was much larger, but which machine
actually embodied more original concepts is not an obvious question.
I don't like to overstate what the ABC did, which is why I prepared
that web article about it: to try to put down concisely what it did
technically. I think some have overstated what it did on occasion, but
it is also about time the ENIAC supporters acknowledged what the ABC
did do, rather than just trying to dismiss it. I understand there are
the familial/emotional issues of having Mauchly's name dragged through
the mud, and I can sympathise with that. And I understand there may
still be differing opinions on how much influence Mauchly obtained
from Atanasoff. But the technical assessments need to be separated
from those matters.
If one were to take the same attitude I perceive from ENIAC supporters
in their dismissal of the ABC, one could just as easily dismiss the
ENIAC: it wasn't a stored-program machine, took hours to change a
program, wasn't a universal machine, could only operate on decimal
numbers, could not do general symbolic logic with any facility, had
next to no memory, couldn't do the things the machines that came after
it could, was an architectural dead end, was just an overblown
calculator; it simply wasn't a 'computer'. The ABC had an architecture
closer to the stored-program proto-design EDVAC than ENIAC did.
Computing history began with the stored-program machines: Turing laid
the theoretical basis for them, the Manchester Baby was the first one
to run, and the EDSAC was the first one to provide real service. And
those were all British developments. So there. (...and I'm not
British.)