Tony Duell wrote:
Yes, back when engineers actually thought about things
and didn't attempt
to 'solve' problems by throwing computing power at them.
-tony
Tony,
I sat on this for a few days since I try not to take the bait, but I
finally couldn't resist. If it was just this one comment, I'd let it
go, but you frequently post messages dripping with disdain like this one.
As an engineer with more than twenty years of experience, I found what
you said insulting and misinformed.
When I started in 1985, when a 6 MHz 68020 Sun machine was hot stuff, CPU
cycles weren't really all that free (although the old timers on the list
will laugh at that). Now I work at a company with a farm of 5000+ CPUs
running at 3 GHz, each with 4 or 8 GB of DRAM (more for the subset
set aside for large jobs). There were crap engineers when I started and
there are crap engineers now. I'm sure there were crap engineers well
before my time and will be after I retire. The flip side is there are
very many very smart people still in the trenches creating new work.
To use a very broad brush and claim engineers today are inferior for
making rational decisions regarding resource allocation is just
misguided smugness. Can you imagine routinely designing 50 M gate chips
in 12 months without having significant CPU resources to aid in the process?
The heroic engineering of yesteryear that you so admire, and rightly so,
is no longer applicable. The constraints have changed, and so have the
methods. Attempting to lay out a modern chip by hand would produce
something riddled with errors, it would take forever to complete, and in
the end it would be much larger than one designed with CAD tools.
Relying only on hand checking for programs of real complexity is
impractical. I suppose every time I change my Verilog code I could
spend a day or a week contemplating it to foresee the unforeseeable, and
after I got fired for being unproductive, I'd have a lot more time to do
things that way. No, it is far better to do the best you can in a
reasonable amount of time and let a dumb and fast pile of CPUs run
regressions after any changes. At more than one company I've worked for,
it is policy to run a small regression after any change, even if the only
thing that has changed is a comment.
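That policy can be sketched as a tiny driver: run a fixed smoke list after every change, unconditionally, and report any failures. The test names and the per-test command here are hypothetical placeholders, not any real company's flow.

```python
# Minimal sketch of a "run a small regression after any change" policy.
# SMOKE_TESTS and the command invoked per test are made-up placeholders;
# a real flow would dispatch a Verilog simulator job to a farm instead.
import subprocess
import sys

SMOKE_TESTS = ["sanity_reset", "sanity_alu", "sanity_bus"]  # assumed names

def run_smoke_regression(tests):
    """Run every test unconditionally; return the names that failed."""
    failures = []
    for test in tests:
        # Placeholder command that always succeeds; swap in the real
        # simulator invocation for the named test here.
        result = subprocess.run(["python3", "-c", "pass"], check=False)
        if result.returncode != 0:
            failures.append(test)
    return failures

if __name__ == "__main__":
    failed = run_smoke_regression(SMOKE_TESTS)
    print("PASS" if not failed else f"FAIL: {failed}")
    sys.exit(1 if failed else 0)
```

The point is that the decision logic is deliberately dumb: there is no "was this change trivial?" check, because that judgment is exactly what humans get wrong.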
It is important to make a distinction between science and engineering.
Scientists try to compute things as exactly as possible; engineering is
more about having the wisdom and experience to know that although it is
possible to compute something to the 15th digit, for the given
application, 6 digits, say, is enough. Engineering fundamentally is
about cutting corners; engineering is pragmatic.
Here is a thought experiment. If the designers of the <insert name of
some machine you admire> had access to the resources of today with the
constraints of today, do you think they'd still go about it the same way?