On 30 Oct 2011 at 22:10, Toby Thain wrote:
> Has the concept of translating the source file blobs of your codebase
> into blobs of object code, then linking them all together into a
> monolithic executable -- *every time you change the program* -- kept
> pace with modern concepts of software development? I mean, that's a
> bit primitive even relative to the state of the art in 1985*...
No, but it's usually a good idea to recompile the whole thing
occasionally; someone may have screwed up the makefile or even mis-
dated a file (it happens), particularly when a regression is
observed. It would be nice to know whether it was going to take
seconds or hours... If seconds, why not recompile the whole thing
every time? I'd really start to worry if the result of a complete
compilation differed from an incremental one.
I assume that most commercial software operations do a clean
recompile from scratch for new releases. So if you have a billion
lines of code in your codebase (that's 30 solid years of 24x7 coding
at the rate of one line per second), is it going to take 10 minutes
(coffee break) or 10 days (dead from dehydration)...
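The "30 solid years" figure checks out: a billion lines at one line per
second is a billion seconds, which shell arithmetic puts at about 31 years:

```shell
# 1e9 seconds expressed in whole years (365 days/year).
echo $(( 1000000000 / (3600 * 24 * 365) ))   # → 31
```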
--Chuck