It was thus said that the Great Michael Sokolov once stated:
Brad Parker <brad at heeltoe.com> wrote:
As you know, the size of your design dictates
what tools you'll need to use.
I don't see what size has to do with it. If a tool can compile a
traffic controller, it can compile a full CISC CPU.
But how long would it take?
Just this past week, I came across an old program I wrote in the early 90s
that would calculate maps of a particular pair of functions my boss [1] was
interested in [3]. This particular program took about a year to finish the
data sets on an SGI Personal Iris 4D-35 (33Mhz MIPS system)---each data
set took around 10 to 11 hours to generate (and there were *a lot* of data
sets).
Just on a lark, I took the code and tweaked it a bit (to run only 1/500th
of one dataset) to see how long it would take now, over a decade later.
On my 500MHz station at work, each dataset would take 2 hours, and an
entire run (which took the better part of a year when I did it lo' these
many years ago) would take 41 days to complete. I then did it on a
quad-Pentium 2.4GHz machine we have at work and the results were ... um ...
humbling: 45 *hours* to do an entire run.
Two days.
For something that took about a year a decade ago [4] (and the entire run
covered only about 1/4 of the potential space the equations cover).
-spc (And that's on a single machine mind you ... )
[1] Ph.D. in psychology, working out of the Math Department at the
university I attended [2]. I worked for him for four to five years,
managing his computers (all two of them) and writing the occasional
program. Sweet gig, and so far the favorite job I've had.
[2] Florida Atlantic University
[3] The equations are:
x' = ((A * y) + B) * x * (1.0 - x)
y' = ((C * x) + D) * y * (1.0 - y)
Repeat for N times and plot the results. It's a chaotic system and
would either settle down into a strange attractor or never settle
down at all. The values A, B, C and D had a certain range, and if
tweaked, would produce a different picture.
I forgot what the equations actually stood for (something to do with
neurophysiology or something like that), but then again, I never
understood about half of what my boss was saying anyway.
[4] Once I got all the datasets, I then spent the better part of a week
making an image from each data set, then recording each image to a
VCR hooked up to the SGI (the machine had an $8k board that could
control a VCR, only we lacked editing software), then taking the
tape and re-editing it on actual video editing equipment to make a
smooth running film. Interesting for about the first 40 frames,
then a dull monotony for the rest of the week.