On 30/10/11 12:09 PM, arcarlini at iee.org wrote:
> Toby Thain [toby at telegraphics.com.au]:
>> Are there common things that a programmer waits for?
> Compiling a few directories takes seconds, but linking takes
> a little longer. Perhaps 30s on my 24GB system at work but
> more like 2-3m in its previous 8GB incarnation.
> There must be plenty of projects where a top-level build takes
> quite a while.
Sometimes that's due to incompetent build systems (in my experience). I'm
sure you've seen this Makefile more than once:
exe : <about two hundred .c files>
     $(CC) -o $@ $^
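
For contrast, here's a minimal sketch of the usual fix (variable names and
wildcard usage are just illustrative): compile each .c to its own .o, so an
edit to one file rebuilds one object and relinks, instead of recompiling a
couple of hundred files every time:

SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)

# the link is the only step always repeated on an incremental build
exe : $(OBJS)
	$(CC) -o $@ $^

# each source compiles to its own object; only changed files rebuild
%.o : %.c
	$(CC) -c -o $@ $<

(Header dependencies would still need something like gcc's -MMD on top of
that.)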
> In our case giving developers slow machines wouldn't help the
> final product since it runs on specific hardware. That must
> be true of a fair chunk of embedded development today (network
> boxes, phones, consoles ...).
That's a thoroughly valid exception, indeed.
> I agree that *testing* an end-user application on the least capable
> acceptable hardware is a good idea. Even there though, I don't see
> how forcing the developer to use that hardware for development
> achieves anything other than slowing them down.
Then how would *you* solve the perennial problem of software whose
pointless sloth and bloat won't allow it to run well on anything but the
latest hardware?
It is hard to dispute that a) older hardware gives the developer better
visibility into performance issues (it changes their priorities, even
subconsciously), and b) many developers do not need the "latest"
hardware at all. Why would it slow their *thinking* down? They should be
doing more thinking than typing, and a slow machine doesn't slow typing
down either...
The thing is: make something run acceptably on the bulk of machines out
there (which are not cutting edge) and you commensurately improve things
for those who do run the latest hardware. Looks like a win-win to me.
--Toby