On 30/10/11 10:31 PM, Chuck Guzis wrote:
> On 30 Oct 2011 at 22:10, Toby Thain wrote:
>> Has the concept of translating the source file blobs of your
>> codebase into blobs of object code then linking them all together
>> into a monolithic executable -- *every time you change the program* --
>> kept pace with modern concepts of software development? I mean,
>> that's a bit primitive even relative to the state of the art in
>> 1985*...
> No, but it's usually a good idea to recompile the whole thing
> occasionally -- someone may have screwed up the makefile or even
> mis-dated a file (it happens) -- particularly when a regression is
> observed. It would be nice to know if it was going to take seconds
> or hours... If seconds, why not recompile the thing as a whole all
> the time? I'd really start to get worried if the result of a
> complete compilation differed from an incremental one.
So would I. I'd probably try to fix the makefile. :D
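
To make the mis-dated-file case concrete, here is a rough sketch
(hypothetical file names, plain Python rather than any particular
build tool) of the timestamp test that make-style incremental builds
rely on, and of how a backdated source file slips past it:

import os

def needs_rebuild(target, prerequisites):
    """Rebuild only if the target is missing or any prerequisite is newer."""
    if not os.path.exists(target):
        return True
    target_mtime = os.path.getmtime(target)
    return any(os.path.getmtime(p) > target_mtime for p in prerequisites)

# Hypothetical scenario: foo.o was compiled from foo.c this morning.
# Someone then edits foo.c, but its timestamp ends up *older* than
# foo.o (touch with a wrong date, a bad clock, an archive extraction).
#
#   needs_rebuild("foo.o", ["foo.c", "foo.h"])   # -> False
#
# The incremental build keeps the stale foo.o, the clean build does
# not, and the two executables differ. Nothing here is specific to
# make; it just mirrors the mtime comparison such tools perform.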
> I assume that most commercial software operations do a clean
> recompile from scratch for new releases. So if you have a billion
> lines of code in your codebase (that's 30 solid years of 24x7 coding
I don't think much commercial software weighs in at 1,000,000,000 lines
- yet. But you'd need an efficient build system if it did.
But this conversation is about what developers do every day. Release
builds would often not be built on the developer's machine.
--Toby
> at the rate of one line per second), is it going to take 10 minutes
> (coffee break) or 10 days (dead from dehydration)...
> --Chuck
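
For what it's worth, Chuck's back-of-the-envelope figure checks out:
at one line per second around the clock, a billion lines is roughly
thirty years of typing.

lines = 1_000_000_000                  # one billion lines of code
seconds_per_year = 60 * 60 * 24 * 365
print(lines / seconds_per_year)        # ~31.7 years of nonstop 24x7 coding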