Chuck Guzis wrote:
A smart interpreter that kept its strings as descriptor-containing
length items could run rings around the aforementioned compiler,
since the actual statement to be interpreted is likely quite small in
comparison to the size of the strings involved.
Cheers,
Chuck
Ah, but that is an implementation difference in the way an interpreter
and a compiler might handle strings.
If the compiler also used length-descriptor strings, then the two
implementations would be equivalent, and the compiler should have the
upper hand?
The topic/question originally started as 'a compiler generally wins,
all other things being equal'.
One of the problems with the discussion is that if you include the
runtime library with the compiler, then you can basically match anything
an interpreter can do at run time. So the whole discussion might be
moot, and might just come down to 'my specific implementation of
such-and-such is better than yours.'
Mike