> But is it STORED PROGRAM, e.g. can it do self-modifying code, von Neumann?
It's a bit odd. It's got separate machine-code program and data memories,
but user keystroke programs are stored in _data_ memory and interpreted
by the machine-code operating system.
Yes, this occurred to me also; the Parallax BASIC STAMPs do the same
thing. Of course, this stored-program distinction doesn't matter in most
practical realms.
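To make the arrangement concrete, here is a minimal sketch of that kind of
setup: the interpreter itself is ordinary fixed code (on a PIC it would sit
in program memory), while the program it runs is just bytes in a data array,
so it can be loaded or rewritten at run time. The opcode set is hypothetical
and written in C rather than PIC assembly; it is not the actual BASIC Stamp
token format.

/* Minimal bytecode interpreter: the "user program" lives in data memory,
 * the interpreter is fixed code.  Opcodes here are invented for the sketch. */
#include <stdio.h>
#include <stdint.h>

enum { OP_HALT, OP_PUSH, OP_ADD, OP_PRINT };

/* "User program" held in a data array, not in program ROM. */
static uint8_t user_program[] = {
    OP_PUSH, 2,
    OP_PUSH, 3,
    OP_ADD,
    OP_PRINT,
    OP_HALT
};

static void interpret(const uint8_t *code)
{
    int stack[16];
    int sp = 0;          /* stack pointer          */
    size_t pc = 0;       /* program counter, indexes into data memory */

    for (;;) {
        switch (code[pc++]) {
        case OP_PUSH:  stack[sp++] = code[pc++];          break;
        case OP_ADD:   sp--; stack[sp - 1] += stack[sp];  break;
        case OP_PRINT: printf("%d\n", stack[sp - 1]);     break;
        case OP_HALT:  return;
        }
    }
}

int main(void)
{
    interpret(user_program);   /* prints 5 */
    return 0;
}

Because user_program is data, the running system could in principle modify
it, which is exactly the sense in which a Harvard-architecture part with an
interpreter "acts like" a stored-program machine.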
> Strictly speaking, Microchip's PICs are NOT COMPUTERS. Of course I don't
Since when? AFAIK, Harvard architecture machines are computers.
Self-modification of programs is not a requirement.
Well, for rigor, I reserve "computer" for the stored-program machines,
but this is approaching religious behavior, I admit, and clearly a PIC
with an interpreter suuuure acts like one... :-)
Tom's point is well taken. However, I would argue that the historical
usage of the term _computer_ perhaps gives the best justification for
current usage.
The term _computer_ in the middle 19th century referred to "a person
who computes." That is, a person with a pad and pencil, a set of data,
and a specification of the values to determine from those data. The
result was the computation. Prior to the existence of automatic data
processors, companies employed huge staffs, housed in great open
rooms, where desks invited hours of tedious computation.
For some here to refer to _computers_ only in the case of the stored
program is to miss the point. In fact, the unit record equipment of
the early 20th century constituted a computer, its program being
provided by a plug board (a kind of storage, however abstract).
Moreover, there is the _analog computer_, with programming very
similar to that of the unit record equipment, and such machines have
always been known as computers.
The important point for computation is closure, and the Turing machine is
the ideal model. It is not efficient, it is not pretty, but all systems that
exhibit computational closure are Turing-machine equivalents, and
this is the foundation of computer science.
William R. Buckley