On Oct 30, 2011, at 10:10 PM, Toby Thain wrote:
> Has the concept of translating the source file blobs
> of your codebase into blobs of object code then linking them all together into a
> monolithic executable -- *every time you change the program* -- kept pace with modern
> concepts of software development? I mean, that's a bit primitive even relative to the
> state of the art in 1985*...
It's interesting you should mention that; years ago, Apple's Xcode developer tools
(basically a nice IDE wrapped around a GCC toolchain) had a "ZeroLink" option
for development builds that didn't bother to link at all and instead treated each
object file as a dynamic library. It certainly saved a lot of time linking large programs.
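To make the idea concrete, here's a rough sketch of the general technique (not
Apple's actual implementation, which hooked into their runtime linker): build each
module as a shared library and resolve its symbols on demand with dlopen/dlsym.
The module name libfoo.so and the function foo() are made up for illustration.

    /* Build the module as a shared library instead of linking it in:  */
    /*   cc -shared -fPIC -o libfoo.so foo.c                           */
    /* Build the host program (add -ldl on Linux):                     */
    /*   cc -o host host.c -ldl                                        */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        /* RTLD_LAZY defers symbol binding until first use -- which is
           what makes the scheme fast, and also what lets a missing
           symbol hide until that code path actually executes. */
        void *mod = dlopen("./libfoo.so", RTLD_LAZY);
        if (mod == NULL) {
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }
        int (*foo)(void) = (int (*)(void))dlsym(mod, "foo");
        if (foo != NULL)
            printf("foo() = %d\n", foo());
        dlclose(mod);
        return 0;
    }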
It was nice for development if you had a program that took forever to link, but I found
it to be a bit of a pain in the ass for debugging, because link errors (mainly undefined
symbols from forgetting to add a .c file to the build, which the compiler can't catch but
the linker would) showed up mysteriously at runtime instead of at link time. I'm sure
that could be solved by adding a separate, fast pass that just resolves all references
without doing a full link, but Apple pulled ZeroLink out a while back anyway,
presumably because machines got fast enough that nobody cared about the long link
times anymore.
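Here's that failure mode in miniature (a contrived example, names made up): the
compiler is perfectly happy with a bare declaration, so nothing complains until
somebody actually has to resolve the symbol.

    /* main.c -- compiles cleanly with "cc -c main.c" even if helper.c,
       which defines helper(), was never added to the build.  A normal
       link fails immediately with "undefined reference to `helper'";
       under a ZeroLink-style scheme the same error surfaces only when
       helper() is first called at runtime. */
    #include <stdio.h>

    int helper(int x);   /* the declaration alone satisfies the compiler */

    int main(void)
    {
        printf("%d\n", helper(41));
        return 0;
    }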
It's certainly a paradigm I wouldn't mind seeing go the way of the dodo, but for
everyday stuff it's a tradeoff in speed between the developer and the
user; if everything has to be linked at runtime (see: Java, which admittedly also has
JIT overhead and manifold problems of its own), startup times for even simple
applications increase by several orders of magnitude, which drives average users up the wall.
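You can see where that cost lives with the same made-up libfoo.so from the sketch
above: eager binding moves the whole symbol-resolution bill to load time, which is
roughly the startup hit a fully runtime-linked system pays on every launch.

    /* Eager variant of the earlier sketch: RTLD_NOW forces every
       undefined symbol in libfoo.so to be resolved before dlopen
       returns, rather than on first call.  Multiply that up-front
       work across a whole application and you get the startup
       penalty described above. */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        void *mod = dlopen("./libfoo.so", RTLD_NOW);
        if (mod == NULL)
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
        else
            dlclose(mod);
        return 0;
    }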
- Dave