On Tue, Jun 22, 2004 at 01:55:33PM -0400, William Donzelli wrote:
> > (1) Reliability is always more important. But memory/CPU cycles
> > cannot be ignored when your customers are running benchmarks, and when
> > you're trying to beat the competition using less expensive hardware
> > than they are.
> Frankly, I don't think too many applications are ever tested for speed
> other than "is it fast enough not to be a pain?". Sure, some applications
> need speed at the forefront, but let's face it, most don't. There are
> probably ten times the number of people writing programs that balance
> checkbooks or are just dressed up interfaces for some other mundane
> programs, to the one guy that is writing some speed needing game. When it
> comes to testing these mundane programs for speed, typically the
> requirement is "just don't make it slow".
There are speed issues in programs where one usually doesn't expect them,
like in MUAs (Mail User Agents). A while ago I was looking for a
suitable graphical MUA under UNIX for my SWMBO (being a loyal mutt user
myself, I don't know any usable graphical one offhand). It turns out that
most programmers these days seem to be in urgent need of a basic CS
education, elementary things like estimating the complexity of
algorithms (Big O notation) apparently being unknown to them: all the
graphical MUAs performed reasonably well when facing small mailboxes
(fewer than 50 small, mostly plain-text mails), but most failed miserably
when set to work on serious amounts of data. One of my mailboxes, filled
from a mailing list, contains some 40000 mails and is around 250 MB in
size. Good old mutt copes nicely with this, needing about a minute to
read it, the limiting factor being I/O (NFS). And that was with full
threading ...
Most of the graphical MUAs I killed after half an hour of showing no
progress while trying to load this mailbox.
Net result: all the micro-optimizing of inner loops won't help you if
your algorithm is some O(n^2) crap, the nasty thing about those being
that they tend to work well on the small datasets most people use during
development, while failing spectacularly when fed serious amounts of
data.
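To put a number on it: with n = 40000 messages an O(n^2) pass is on the
order of 800 million pairwise operations, while sorting first costs
roughly n*log2(n), around 600 thousand comparisons plus one linear scan.
A contrived little C sketch (nothing to do with any real MUA's internals,
the Message-IDs are made up) showing both shapes on a duplicate check:

/*
 * Counting duplicate Message-IDs in a fake mailbox, once the obvious
 * O(n^2) way and once by sorting first.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define N 40000

static char ids[N][32];

/* O(n^2): for each message, scan all earlier ones for the same ID.
 * With N = 40000 that is on the order of 8e8 string comparisons. */
static int count_dups_quadratic(void)
{
    int dups = 0;
    for (int i = 1; i < N; i++)
        for (int j = 0; j < i; j++)
            if (strcmp(ids[j], ids[i]) == 0) {
                dups++;
                break;
            }
    return dups;
}

static int cmp_id(const void *a, const void *b)
{
    return strcmp((const char *)a, (const char *)b);
}

/* O(n log n): sort once, then duplicates sit next to each other. */
static int count_dups_sorted(void)
{
    qsort(ids, N, sizeof ids[0], cmp_id);
    int dups = 0;
    for (int i = 1; i < N; i++)
        if (strcmp(ids[i - 1], ids[i]) == 0)
            dups++;
    return dups;
}

int main(void)
{
    /* Fake mailbox: every 1000th message reuses the ID of message 0. */
    for (int i = 0; i < N; i++)
        snprintf(ids[i], sizeof ids[i], "<%d@example.invalid>",
                 i % 1000 ? i : 0);

    printf("quadratic scan: %d duplicates\n", count_dups_quadratic());
    printf("sort and scan:  %d duplicates\n", count_dups_sorted());
    return 0;
}

Whether the job is duplicate detection or threading by References:
headers, the curve has the same shape: harmless at 50 mails, hopeless
at 40000.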
> > (2) Yes indeed -- but being skilled at assembly language programming
> > imposes a useful discipline that carries over into other languages.
> Yes, but how useful? I don't think the industry thinks it is worth it.
> As a side (and to give the hornet's nest another whack), shouldn't good
> programming discipline be formed using Pascal? That is why it was
> invented. You break the rules, it yells at you.
Learning the ropes of programming with Pascal (or, even better, Ada),
i.e. with what is usually called a "discipline and bondage" programming
language, is _good_. It teaches one right from the start that sloppily
cobbling code together is not acceptable. Having formed good habits, one
might then be allowed to proceed to more dangerous programming languages
like C.
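To make "it yells at you" concrete (an entirely contrived example of my
own): both slips below go through a plain cc invocation without a word
of protest, while Pascal or Ada would reject them at compile time:

#include <stdio.h>

int main(void)
{
    int balance = 100;
    int paid = 0;

    /* Assignment where a comparison was meant. C accepts it and happily
     * tests the assigned value; in Pascal the condition must be a
     * Boolean and := is not an expression, so this is a hard
     * compile-time error. */
    if (paid = 1)
        balance = balance - 50;

    /* Silent narrowing: an int stuffed into a char loses bits without
     * complaint; Ada would demand an explicit conversion. */
    char small_change = balance * 1000;

    printf("balance = %d, small_change = %d\n", balance, (int)small_change);
    return 0;
}

The stricter language doesn't make you a better programmer by itself,
but it refuses to let this kind of sloppiness into the build at all.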
> Of course, Pascal is falling from grace as well...
Delphi and Kylix seem to have quite a crowd of users ...
Regards,
Alex.
--
"Opportunity is missed by most people because it is dressed in overalls and
looks like work." -- Thomas A. Edison