The Wikipedia article (if you want to use it as a standard) indicates,
as has been mentioned here, that the term "minicomputer" is somewhat
marketing driven: "In a 1970 survey, the *New York Times*
<http://en.wikipedia.org/wiki/New_York_Times> suggested a consensus
definition of a minicomputer as a machine costing less than 25,000 USD,
with an input-output device such as a teleprinter and at least 4K
<http://en.wikipedia.org/wiki/4000_%28number%29> words of memory, that
is capable of running programs in a higher level language, such as
Fortran <http://en.wikipedia.org/wiki/Fortran> or Basic."
To me, that sounds a bit like the definition of a "3M" workstation: at
least a meg of RAM, a megapixel display, and 1 MIPS of computing power.
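
Just to make the comparison concrete, here is a throwaway sketch of the
two rules of thumb as predicates. The thresholds are only the figures
quoted above; the function names and the example machine are invented
for illustration:

# Two informal tests, using nothing but the numbers quoted above.

def is_1970_minicomputer(price_usd, memory_words, has_io_device, runs_hll):
    # NYT 1970: under $25,000, an I/O device, 4K words, runs a HLL
    return (price_usd < 25_000 and memory_words >= 4096
            and has_io_device and runs_hll)

def is_3m_workstation(ram_bytes, display_pixels, mips):
    # "3M": a meg of RAM, a megapixel display, a MIPS of compute
    return (ram_bytes >= 2**20 and display_pixels >= 10**6
            and mips >= 1.0)

# Hypothetical $10,000 machine: 4K words, a teleprinter, and BASIC
print(is_1970_minicomputer(10_000, 4096, True, True))   # True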
On Fri, Jan 24, 2014 at 3:20 PM, William Donzelli <wdonzelli at gmail.com> wrote:
> > I don't think anyone would or should assume DEC was the first. Whilst
> > I think many would agree that they made more than anyone else and were
> > the most successful dedicated maker,
> Well, except for IBM. Oh wait, they are not "minicomputers", they are
> "midrange"... Wait, what?
>
> See the problem here?
> --
> Will