It was thus said that the Great Josh Dersch once stated:
Dave McGuire wrote:
On Feb 9, 2010, at 1:44 PM, Josh Dersch wrote:
Ok. Show me a processor that has an "object" data type, that has
subtypes like "member" and "method" and such. There aren't any.
Translating from such incredibly high-level constructs to registers,
stacks, and memory locations is not without a lot of overhead. Try to
envision what happens in the instruction stream during things like
virtual function lookups in C++, for example.
Ok, I'm envisioning it. A few instructions to do a vtable lookup and a
jump to the correct virtual function. Wow. So those extra instructions
are what's making every machine in the world (apparently) very very slow?
It's worse than that. Shared libraries also take their toll, at least for
those systems that use the ELF format. The following article is rather
instructive for those not familiar with the dynamics of dynamic linking:
http://www.iecc.com/linker/linker10.html
So it's a tradeoff---dynamic libraries allow a smaller executable (the
executable for Firefox 3.5.7 on my system is only 57k, which surprised the
hell out of me, but then again, it loads about 42M of libraries), less
memory usage overall (about 4M of those shared libraries are also used by
other processes), *and* the ability to fix a library without relinking the
entire program, but at the cost of a slower runtime (because calls through
shared libraries and calls through vtables use pretty much the same
indirection).
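The article's point can be caricatured in a few lines of C++ (a toy model
only---real lazy binding is done by the dynamic linker inside the PLT stub,
not in user code, and the names here are invented):

```cpp
#include <cassert>

// Toy model of ELF lazy binding.  A call to an imported function goes
// through a slot in a table (the GOT); the slot initially points at a
// resolver stub, and the first call patches in the real address.
// Every later call is still an indirect jump through the table.
using Fn = int (*)(const char *);

static int real_len(const char *s) {   // stand-in for a library function
    int n = 0;
    while (s[n]) ++n;
    return n;
}

static int resolver(const char *s);    // first-call stub, declared early
static Fn got_slot = resolver;         // the slot starts at the stub

static int resolver(const char *s) {
    got_slot = real_len;               // "dynamic linker" fills the slot
    return got_slot(s);                // then calls the real function
}

int call_import(const char *s) {
    return got_slot(s);                // indirect jump through the table
}
```

Note that even after binding, every call still pays the indirect jump,
which is exactly the cost the article is on about.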
About the worst case I can see is a virtual method call into a shared
library, in the middle of a critical loop.
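To make the vtable half of that concrete, here's a minimal sketch (the
exact instruction sequence depends on the compiler and ABI):

```cpp
#include <cassert>

// A class with virtual methods carries a hidden vtable pointer.  A call
// through a base reference loads that pointer, indexes the table, and
// jumps indirectly -- a handful of extra instructions (and a likely
// cache miss) per call.
struct Shape {
    virtual ~Shape() {}
    virtual int sides() const { return 0; }
};

struct Triangle : Shape {
    int sides() const override { return 3; }  // fills the vtable slot
};

int count_sides(const Shape &s) {
    return s.sides();  // roughly: load s.vptr, load vptr[slot], call indirect
}
```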
To mangle a quote: a few cycles here, a few cycles there, and pretty soon
you're talking real time.
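One common way to claw those cycles back, for what it's worth, is to hoist
the dispatch out of the loop by making the virtual method take the whole
buffer (a sketch; the class names are invented):

```cpp
#include <cassert>
#include <vector>

// Mitigation for a virtual call in a hot loop: move the loop inside
// the virtual call.  apply() pays the dispatch per element;
// apply_all() pays it once per buffer, and the inner loop in the
// override is a direct, inlinable call.
struct Filter {
    virtual ~Filter() {}
    virtual int apply(int x) const = 0;
    virtual void apply_all(std::vector<int> &v) const {
        for (int &x : v) x = apply(x);  // fallback: dispatch per element
    }
};

struct Doubler : Filter {
    int apply(int x) const override { return 2 * x; }
    void apply_all(std::vector<int> &v) const override {
        for (int &x : v) x = 2 * x;     // tight loop, no dispatch inside
    }
};
```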
And while I could load up RedHat 5.2 (last decent version of RedHat in my
opinion) on a modern machine (and yes, it would fly on today's hardware), I
would end up having to recompile X to use my video card, and I would have to
suffer Netscape 4, unless, of course, I wanted to try to compile Firefox. I
would also lose out on the C99 features of GCC (which I do use---C99 is a
nice progression), which means I would have to recompile GCC, etc. etc.
The other side of the coin is that the 2.6GHz machine sitting below my desk
can do things that my old system (a 150MHz 486 with 32M of RAM) would die
trying to do. I can actually manipulate digital photos, for one thing.
For another, I can solve jumbles *really fast*, burning through half a
million words in 0.18 seconds (heck, I have a version that does it in
0.006 seconds, but to get there it uses 15M of memory while running). I
could probably edit video on this box if I had the software to do so.
Couldn't even *conceive* of that on my CoCo (which I love dearly).
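The trick that makes jumbles cheap is worth sketching (this is the standard
anagram-key approach, not necessarily line-for-line what my solver does):

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <unordered_map>
#include <vector>

// Two words are anagrams exactly when their sorted letters match, so
// index the word list once by that canonical key; each puzzle then
// becomes a single hash lookup instead of a scan of the dictionary.
using Dict = std::unordered_map<std::string, std::vector<std::string>>;

static std::string key_of(std::string w) {
    std::sort(w.begin(), w.end());     // "pots" and "stop" both -> "opst"
    return w;
}

Dict build_index(const std::vector<std::string> &words) {
    Dict d;
    for (const auto &w : words) d[key_of(w)].push_back(w);
    return d;
}

std::vector<std::string> solve(const Dict &d, const std::string &jumble) {
    auto it = d.find(key_of(jumble));
    return it == d.end() ? std::vector<std::string>{} : it->second;
}
```

The 15M version presumably just spends more memory on precomputed tables
to shave the remaining time.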
Now, do I wish Firefox were faster? Heck, I'd take stability (it crashes
on a whim, but I have to use Firefox 3.5 for some of the stuff I do for
work) over speed, but ... yeah, speed would be nice.
Ah, anyway, enough rambling ...
-spc (http://prog21.dadgum.com/ makes for some interesting reading [1])
[1]	http://prog21.dadgum.com/50.html
	http://prog21.dadgum.com/45.html
	http://prog21.dadgum.com/47.html
	http://prog21.dadgum.com/29.html
But really, all the entries are good ...