Andy Holt wrote:
In the older sense where fixed locations and self-modifying code were
acceptable (e.g. in the CDC 6600 and most pre-PDP-11 minicomputers), a
stack was either slower (because emulated) or required more CPU logic (and
thus cost more). The B5500 was indeed a very potent machine of its era ...
it was also an extremely expensive one.
Yes, and quite ironically, the thing that did in self-modifying code was
the cache. OK, pipelining helped drive that nail into the coffin of
self-modifying code as well. :-) There are some interesting
optimizations that you can do with self-modifying code that we just
don't have today, but overall, the benefits of caches and pipelines
outweigh them.
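To make that concrete, here's a toy sketch of the classic self-modifying-code trick: one instruction rewrites another instruction's operand in place, baking a value directly into the "code" instead of loading it from data on every execution. This is a made-up mini-interpreter, not any real ISA; all opcode names are invented for illustration. On real hardware this is exactly the pattern that caches and pipelines broke: the patched instruction may already sit, stale, in the icache or the pipeline.

```python
def run(program):
    """Execute a list of mutable [opcode, operand] cells; return the accumulator."""
    acc = 0
    pc = 0
    while pc < len(program):
        op, arg = program[pc]
        if op == "ADD":          # acc += immediate operand baked into the instruction
            acc += arg
        elif op == "PATCH":      # rewrite another instruction's operand in place --
            target, value = arg  # the "self-modifying" store
            program[target][1] = value
        elif op == "HALT":
            return acc
        pc += 1
    return acc

# Instruction 0 patches instruction 1's immediate to 5 before it executes,
# so the ADD adds 5 rather than its original 0.
prog = [
    ["PATCH", (1, 5)],
    ["ADD", 0],
    ["HALT", None],
]
print(run(prog))  # 5
```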
Nowadays any "reasonable" amount of CPU logic essentially costs the same
and thus stacks are "free"
Exactly.
Incidentally, old COBOL programs were notable for the almost complete
absence of function calls - and I bet many "dusty decks" from 30 years
ago are still being run daily with few significant modifications.
Sure, even today when you have a specific application in a device that's
very short on memory, let alone stacks, you have no choice but to
write code that can't use stacks. So a stack architecture would be
useless there. Writing that code is also an insane brain-twisting
effort.
I was going to stay out of this
"discussion", but felt that a little light
might moderate some of the heat. (However, I doubt it :-(
This may be shocking, but I don't personally care that much about
stack machines. :-)
No, seriously, the thing that got my goat was the fuzzy thinking around
how one would have a magically fast machine if only you had a 3-address
ISA, how it's supposedly important to do vectors, how a stack machine
can't possibly do vectors, and how unless a supercomputer has a feature,
it's somehow just not important. Stuff like this is just pure superstition.
(I'd normally use stronger language here, but I didn't want to get too
impolite.) :-D
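Just to make the point concrete: a stack ISA expresses the same computations a 3-address ISA does; the difference is encoding, not capability. Here's a minimal stack-machine evaluator (all opcode names invented for illustration) computing a + b*c, the same expression a 3-address machine would do with two instructions like `mul t, b, c` and `add r, a, t`:

```python
def stack_eval(code):
    """Evaluate a list of (opcode, operand...) tuples on an operand stack."""
    stack = []
    for op, *args in code:
        if op == "PUSH":               # push an immediate operand
            stack.append(args[0])
        elif op == "ADD":              # pop two, push sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":              # pop two, push product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# a + b*c with a=2, b=3, c=4 -> 2 + 12 = 14
code = [("PUSH", 2), ("PUSH", 3), ("PUSH", 4), ("MUL",), ("ADD",)]
print(stack_eval(code))  # 14
```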
For my own needs, I buy whatever reasonably fast machine I can afford
from the likes of AMD or Intel and slap either Linux or a BSD on it,
depending on the need. For day to day use, I've zero use for a vector
engine, and zero use for a stack machine.
It might be fun to have a nice fast graphics card, which of course would
make great use of a GPU that did vectors, but, you know what, I almost
don't bother playing modern games at all. I may fire up MAME and play
Pac-Man or Defender until I'm bored, but that's about it. I did go and
buy a PlayStation 2 back in the day, but after about 6 hours of play,
it's been gathering dust for the last 3 years.
GPU-wise, I might make an exception for OpenCroquet, but beyond that,
vector engines or GPUs aren't useful to me at all.
For the stuff that I do, I don't require register windows, stack
machines, vector units, or GPUs. FPUs are nice, but I don't find myself
doing much that requires them. If they're included in the box I buy,
great! Wonderful! But they won't make that much of a difference.
On the other hand, if I'm recommending a machine professionally, I
certainly take into account what their code does, how it works, what
resources it needs, and what can make it run faster, and pick
according to those specs and their budget.
Yes, I do have a very nice set of SPARC and UltraSPARC machines, as well
as an Indy and an HP-PA, and EOs, various Macs, and a bunch of beautiful
classic 8-bit machines, but day to day, stacks don't make a bit of
difference to me. :-)