I understood that the lack of virtual memory was a
conscious design
decision. If you want the best available memory throughput, the last thing
you want to do is add another layer of memory translation and, even worse,
page faults. If you want to process large datasets, either buy more memory
or decompose your problem (neither of which is easy or cheap...).
It was a very conscious choice he made, and for many applications, virtual
memory was never missed. It did, however, limit the applications the
machines could tackle.
The Cray-1's biggest rival at the time, the Cyber 203/205s from CDC, did
have virtual memory, but not quite as we know it. Once a big vector
operation was started from memory, there was no stopping the machine - I do
not think it could page fault in the middle of a vector load or store.
Crays were short-vector machines, keeping their vectors in registers. The
Cybers were long-vector machines, with vectors of up to 64K elements kept
in memory. Depending on the problem, either machine would beat the other.
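To make the contrast concrete, here is a rough sketch in C - purely
illustrative, neither machine was programmed like this, and the 64-element
strip length is just my stand-in for the Cray-1's vector register size:

    /* Cray style: strip-mine the loop into 64-element pieces, so each
       piece fits in a vector register before results go back to memory. */
    void add_strip_mined(const double *a, const double *b, double *c, long n)
    {
        for (long i = 0; i < n; i += 64) {
            long len = (n - i < 64) ? n - i : 64;  /* last partial strip */
            for (long j = 0; j < len; j++)         /* one register-length op */
                c[i + j] = a[i + j] + b[i + j];
        }
    }

    /* Cyber 203/205 style: conceptually a single memory-to-memory vector
       instruction sweeping the whole operand, up to 64K elements. */
    void add_mem_to_mem(const double *a, const double *b, double *c, long n)
    {
        for (long i = 0; i < n; i++)
            c[i] = a[i] + b[i];
    }

The memory-to-memory style pays off when vectors really are long and
contiguous; the register style starts up faster on short vectors and lets
intermediate results stay on chip, which is part of why either machine
could win depending on the problem.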
William Donzelli
william(a)ans.net