> Your argument, Eric, was that the microcode compiler generated code
> that is equally as efficient as that you, or someone else, could have
> constructed by hand. Megan in no way implies the use of assembly code.
> The microcode compiler would generate an object file, which by your
> own admission above, generated more code than could fit in the
> memory space available. You accepted her argument that the human
> was required to generate code more efficient than that produced by
> the microcode compiler. You protest _too loudly_ my friend.
No, I accepted her argument that for conventional machine code compiled from
a conventional high-level language, a human can fairly easily generate
better code. But if you had read my posting *carefully*, I specifically
protested that this is *not* the same problem as compiling horizontal
microcode from a specialty source language.
I *still* stand by my statement. The compiler produced better code
in minutes than I could have produced in three months. Your argument seems
to be that a compiler can't produce better code than a human with an infinite
amount of time could. I'll concede you that point. Or maybe I won't. A
compiler with an infinite amount of time could have simply tried every possible
combination of control store bits (for the 512*72 example, 2^36864
possibilities), and run a set of test vectors against each candidate to
determine which ones meet the specifications, and of those which yields
the highest overall performance. And by applying some relatively simple
heuristics, the search space could be reduced from 2^36864 candidates down to
a number that, while still huge, could at least be searched during the
remaining life of the universe. But this is irrelevant, because neither the
human nor the computer has an infinite amount of time available.
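The brute-force scheme above can be sketched at toy scale. Everything here is invented for illustration: a 4-bit control word instead of 512*72 bits, a made-up hardware model, and made-up test vectors. The real search is of course infeasible; the point is only the shape of the loop: enumerate candidates, keep those that pass the test vectors, pick the fastest.

```python
# Toy sketch of exhaustive microcode search: try every control word,
# keep the ones that meet the spec, choose the highest-performing one.
WIDTH = 4  # control-word width; the real machine had 72 bits per word

def simulate(word, test_input):
    """Invented hardware model: each set control bit adds its index
    to the input value."""
    return test_input + sum(i for i in range(WIDTH) if word & (1 << i))

def cycle_cost(word):
    """Pretend each active control bit costs one cycle."""
    return bin(word).count("1")

# Test vectors: (input, required output).
test_vectors = [(0, 5), (10, 15)]

# Keep every candidate word that satisfies all test vectors...
candidates = [
    w for w in range(2 ** WIDTH)
    if all(simulate(w, x) == y for x, y in test_vectors)
]
# ...and of those, pick the one with the best performance.
best = min(candidates, key=cycle_cost)
```

Even at this scale the structure shows why heuristics matter: the candidate list is built by filtering the full 2^WIDTH space, and only the filtering step can be pruned.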
If my job had depended on finishing the project in question without using
the compiler, the only way to do it would have been to expand the control
store to 768 or 1024 words, because even after spending a lot of time
writing the microcode by hand, the result would probably have been larger
than 512 words.
It was the use of the compiler that allowed me the luxury of shrinking it to
fit in the 512 words available. Without using the compiler, there is no way
in hell that I would have had time to do such a thing.
It is instructive to note that when I was trying to squeeze the 514 words
down to 512, I discovered that the compiler had succeeded in combining
several things that I wouldn't have easily found, because the compiler is
actually *better* at doing data flow analysis than I am. That's not because
the compiler is inherently more clever than I am, but because it is not
subject to the Hrair (sp?) limit as I am. It's not more clever, but it's
more tolerant of doing tedious recordkeeping and matching. Of course, if I
had the time to meticulously do the same thing, I obviously could do at least
as good a job of data flow analysis as the compiler. But in practice that's
simply not going to happen. Life's too short.
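The tedious recordkeeping in question is essentially conflict checking: two micro-operations can be packed into one horizontal control word only if they use disjoint hardware resources and neither reads or clobbers a register the other writes. A minimal sketch of that test, with an invented micro-op representation (the field names and example ops are not from any real compiler):

```python
# Minimal sketch of the bookkeeping behind microcode compaction:
# decide whether two micro-ops may share one horizontal control word.
def can_merge(a, b):
    """a, b: dicts with 'reads'/'writes' (register-name sets) and
    'units' (functional-unit sets)."""
    no_resource_conflict = not (a["units"] & b["units"])
    no_data_hazard = (
        not (a["writes"] & b["reads"])   # b must not read a's result
        and not (b["writes"] & a["reads"])  # and vice versa
        and not (a["writes"] & b["writes"])  # no write-write clash
    )
    return no_resource_conflict and no_data_hazard

op1 = {"reads": {"r1"}, "writes": {"r2"}, "units": {"alu"}}
op2 = {"reads": {"r3"}, "writes": {"r4"}, "units": {"shifter"}}
op3 = {"reads": {"r2"}, "writes": {"r5"}, "units": {"alu"}}

# op1 and op2 touch disjoint registers and units: they can share a word.
# op3 reads r2 (written by op1) and also needs the ALU: it cannot.
```

A human doing this by hand must run exactly this check across every pair of candidate operations and keep the results straight; the compiler just never gets bored of it.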
Almost everyone in this discussion is just parroting the conventional wisdom
that compilers don't generate code as compact or efficient as humans can,
without considering the possibility that for specific problems and under
specific constraints, they actually can be *better*. I'm absolutely willing
(and eager) to concede that in the general and unconstrained case, the
conventional wisdom holds true.