Toby Thain [toby at telegraphics.com.au] wrote:
On 30/10/11 12:09 PM, arcarlini at iee.org wrote:
There must be plenty of projects where a top-level build takes quite
a while.
Sometimes due to incompetent build systems (in my experience). I am
sure you've seen this Makefile more than once:

exe : <about two hundred .c files>
	$(CC) -o $@ $^
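For what it's worth, the usual fix - sketched here purely as an
illustration, assuming GNU make and all the sources in one directory -
is per-file object rules, so that make recompiles only what changed:

# Sketch only (GNU make assumed; file layout is hypothetical).
SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)

# Link step: only re-links when some object changed.
exe : $(OBJS)
	$(CC) -o $@ $^

# One object per source; recipe lines must start with a tab.
%.o : %.c
	$(CC) -c -o $@ $<

With that layout, touching one .c file costs one compile plus a
relink, rather than two hundred compiles.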
Ours is just big.
That's a thoroughly valid exception, indeed.
The embedded/dedicated hardware stuff isn't an exception - it's most of
my career :-)
I agree that *testing* an end-user application on the least capable
acceptable hardware is a good idea. Even there, though, I don't see
how forcing the developer to use that hardware for development
achieves anything other than slowing them down.
Then how would *you* solve the perennial problem of software whose
pointless sloth and bloat won't allow it to run well on anything but
the latest hardware?
My solution is in the bit you conveniently quoted above :-)
Despite being the consensus here, it doesn't appear to work that well.
If this were a problem for the majority of customers, then the market
would solve the issue (i.e. someone else would release a competing
product that didn't have this issue but otherwise worked at least
"well enough").

So either the majority of customers do not see this problem, or do
not see it as a problem, or the market is in some way broken. I'm not
sure which of these is responsible, or even whether it is a
combination of these factors. I do know that if a business finds that
application $FOO just won't run on whatever desktops they have, it is
almost certainly cheaper to replace those desktops with new £500
dual-core desktops. (I say that even though the environment I work in
develops using Linux-based systems and we're happy to "roll our own"
whenever it is required - most places just don't work that way,
probably because most places are not development shops.)
Antonio
arcarlini at iee.org