On Mon, Jun 03, 2013 at 04:17:07PM -0700, Fred Cisin wrote:
> Boyle's law predicts that software will expand faster than hardware, to
> occupy all available resources, and then some.
:?) Interesting usage of it.
> Alas, writing "efficient" software seems to be rejected on the grounds
> that
> 1) "Modern compilers optimize better than a human possibly could"
> 2) "The upcoming round of machines will be so fast that there is no need
>    to bother with that stuff" (being efficient)
*sigh*
Whoever claims that urgently needs some enlightenment. As in "let's peel
back the foreskin of ignorance and apply the wirebrush of enlightenment".
Yes, _microoptimization_ at the single-instruction level is _usually_
wasted effort, since good modern compilers are mostly much better at it
than programmers. However, large-scale optimization, e.g. choosing a
better algorithm, is a different matter. That is still (and will remain) very
much relevant unless you are only tackling trivial problems. Yes, if you
are only sorting a few hundred items on a current Intel CPU, it doesn't
matter if your algorithm is O(n^2) or O(n^3). If you are sorting reasonably
large datasets (more than a few dozen million items), things look very
different.
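To make that concrete, here is a minimal sketch (Python; the names and the dataset size are my own illustration, not from the thread) pitting a textbook O(n^2) insertion sort against the language's built-in O(n log n) sort. Even at a mere few thousand items the gap is already obvious:

```python
import random
import time

def insertion_sort(items):
    """O(n^2): fine for tiny inputs, hopeless for large ones."""
    out = list(items)
    for i in range(1, len(out)):
        key = out[i]
        j = i - 1
        # shift larger elements right until key's slot is found
        while j >= 0 and out[j] > key:
            out[j + 1] = out[j]
            j -= 1
        out[j + 1] = key
    return out

data = [random.randrange(10**6) for _ in range(2000)]

t0 = time.perf_counter()
quadratic = insertion_sort(data)
t1 = time.perf_counter()
nlogn = sorted(data)          # Timsort, O(n log n)
t2 = time.perf_counter()

assert quadratic == nlogn     # same result, wildly different cost curves
print(f"insertion sort: {t1 - t0:.4f}s, built-in sort: {t2 - t1:.4f}s")
```

Scale n up by a factor of 1000 and the quadratic version's runtime grows by a factor of a million while the O(n log n) one grows by roughly ten thousand; no hardware refresh cycle papers over that.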
> When I taught beginning Data Structures And Algorithms, I always got a few
> students who rejected the idea of being efficient ("throw hardware at
> it"), and specifically objected to spending an entire 3 hour class session
> on how to create algorithms for sorting and searching datasets too big to
> fit into memory ("just get a bigger computer and load it all into a single
> array in memory!")
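For the curious, the classic answer those students were dodging is an external merge sort: sort memory-sized chunks, spill each sorted run to disk, then stream a k-way merge over the runs. A toy sketch in Python (the function names are mine, and chunk_size is absurdly small purely for illustration; a real one would be sized to available RAM):

```python
import heapq
import tempfile

def _spill(sorted_chunk):
    """Write one sorted run of newline-terminated lines to a temp file."""
    f = tempfile.TemporaryFile(mode="w+")
    f.writelines(sorted_chunk)
    f.seek(0)
    return f

def external_sort(lines, chunk_size):
    """Sort an iterable of text lines using bounded memory:
    only chunk_size lines are ever held in RAM at once."""
    runs = []
    chunk = []
    for line in lines:
        chunk.append(line)
        if len(chunk) >= chunk_size:
            runs.append(_spill(sorted(chunk)))
            chunk = []
    if chunk:
        runs.append(_spill(sorted(chunk)))
    # heapq.merge streams the runs; it never materializes the whole result
    return heapq.merge(*runs)

data = [f"{n:04d}\n" for n in [5, 3, 9, 1, 7, 2, 8, 4]]
result = list(external_sort(data, chunk_size=3))
assert result == sorted(data)
```

Memory use is bounded by the chunk size plus one buffered line per run, no matter how large the input is; that is the whole point of the exercise.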
Idiots. Incompetent ones, at that. So when they encounter a dataset of
several terabytes (yes, those _do_ exist in the real world), they just
spend a couple million $ on a supercomputer to sort it? Really?
I've seen enough stupidity like that. "Oh, let's just slurp the entire
log file into memory and then iterate over it line by line" - works fine
until the size of the log file approaches a significant fraction of total
system memory. Switching to _reading_ it line by line dropped memory usage
by more than an order of magnitude and improved speed by a factor of IIRC
30+. Amazing how well things can work when one _understands_ what one is
doing ...
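The fix is exactly as boring as it sounds. A minimal Python sketch (the "ERROR" filter and function names are invented for illustration) of slurping vs. streaming:

```python
def count_errors_slurp(path):
    """Loads the whole file at once: memory use scales with file size."""
    with open(path) as f:
        lines = f.readlines()
    return sum(1 for line in lines if "ERROR" in line)

def count_errors_stream(path):
    """Iterates the open file directly: one buffered line in memory
    at a time, regardless of how big the file is."""
    count = 0
    with open(path) as f:
        for line in f:
            if "ERROR" in line:
                count += 1
    return count
```

Both return the same answer; only the second one keeps working when the log file is bigger than RAM.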
Well, the "just throw hardware at it" answer is a good way to fail a job
interview with me. Hard. I happen to work for a company that routinely
works with ... non-trivial amounts of data[0], so efficient algorithms are
very much in demand.
Kind regards,
Alex.
[0] We have data storage systems that trigger emergency alerts when they
are down to their last few petabytes of remaining free storage.
--
"Opportunity is missed by most people because it is dressed in overalls and
looks like work." -- Thomas A. Edison