It was thus said that the Great Richard Erlacher once stated:
If you were thumping Billy Gates and his friends by hinting that they simply
bought something and filed off the rough edges and nibbled at the edges
where it didn't fit, well, that's legitimate business. If you don't see
that as innovation (just to bring this discussion back to where we left the
original thread), then keep in mind that those of us who use the software
every day care a lot more about whether it works than whether it is totally
new. It would be truly innovative to have software weenies get used to
writing the code just as it was specified and leaving out every extra "hook"
or whatever that wasn't in the spec and hence won't be accounted for in the
docs.
Just for the record, and speaking as a programmer, I would find it truly
innovative if software were actually DESIGNED for a change, instead of hacked
on. I count myself lucky that I don't have to work with other programmers
or their code, since I hear way too many horror stories from my buddies who
actually have to (and it's almost always worse when it involves anything
dealing with Microsoft [1]). The last time I had to work with someone
else's code, it was so horribly written [2] and so obviously not designed
that I scrapped it and rewrote it.
I even talked to the original programmer and his attitude was basically
``Hey, the customer paid for it and it works. I'm not working on it again.
Screw you dude!'' It's rare that I want someone shot on sight, but he
was one of the exceptions.
Spec? Spec? That's a good one.
It would be truly innovative to have it tested systematically with a wide
range of inputs, both random and systematic, and to great depth, to make sure
that there's no way to make it go down the toilet, especially not by
inadvertent input at the "wrong" time or place. I'd say that at a minimum
it ought to be tested with every possible single-character input at times
when input is not expected, and with every combination of 2^32 characters at
every point at which input is anticipated. The software should behave in an
entirely predictable way and give messages that are totally
self-explanatory. Do I really think American programmers will ever take
their work that seriously? . . . No, but they should at least think about
it, don't you think? It could be done automatically, though, couldn't it?
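To a degree, yes. Throwing random garbage at a program takes only a few
lines of C. Here's a rough sketch (parse_input() is purely a hypothetical
stand-in for whatever routine is under test):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Hypothetical stand-in for the routine under test; swap in the real
   thing.  A negative return means the input was rejected.  */
static int parse_input(const char *buf, size_t len)
{
  return (len > 0 && buf[0] == '\0') ? -1 : 0;
}

int main(void)
{
  char buf[256];
  size_t i, len;

  srand((unsigned)time(NULL));
  for (;;)                          /* run until something breaks */
  {
    len = (size_t)rand() % sizeof buf;
    for (i = 0; i < len; i++)
      buf[i] = (char)(rand() % 256);
    parse_input(buf, len);          /* a crash or hang here is the bug */
  }
  return 0;
}

Even something this dumb finds crashes embarrassingly often; exhaustively
covering every combination of 2^32 characters is another matter entirely.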
First we should hope that the pros learn not to write memory leaks, eh?
First off, European programmers are no better. At least when we write
unmaintainable piles of crap, it's because we don't know any better.
Europeans seem to love writing complex code just for complexity's
sake. A friend of mine (B.S. in Comp Sci from an American university) had to
explain to three German programmers (each with a Ph.D. from a European
university) why their code was not re-entrant, and to show them an easier
way to do
what they wanted to do in the first place. They refused to make the change
(which only impacted five places in the code) since their code was
``tested'' and obviously worked. Uh, no. It wasn't re-entrant. In that
particular case, that was a bug waiting to happen.
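For the curious, the canonical way to write non-reentrant C is a static
buffer. This is only an illustration of the bug class; their actual code
was more involved:

#include <stdio.h>

/* Non-reentrant: every call clobbers the same static buffer.  If a
   second call comes in before the caller is done with the result
   (from an interrupt, a signal handler, another thread), the first
   result is silently corrupted.  */
char *fmt_bad(int n)
{
  static char buf[16];
  sprintf(buf, "<%d>", n);
  return buf;
}

/* Re-entrant: the caller supplies the buffer, so nested and
   concurrent calls can't step on each other.  */
char *fmt_good(int n, char *buf)
{
  sprintf(buf, "<%d>", n);
  return buf;
}

The fix is mechanical, which is exactly why refusing to change five call
sites was so galling.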
Although I might be more inclined to remove the redundant calls (the ones
that differ only in device), that's a step in the Unix direction.
That's a feature that came about when one fellow wrote one version and then,
having become the boss, expected someone else to pick up where he left off,
but not damage any of his now-sacred work.
Well, it might also be a conscious design decision, now that I think about
it. On a register-starved 8080 it might not have been possible to specify a
device number, or doing so might have increased the code size to load an
extra register. It's hard to say.
But what you describe is the current Microsoft way, from the stories I've heard.
Not outright. And it's not like MS-DOS used that as the ONLY way to call
the operating system---the preferred method was still INT 21h. I suspect it
was only put in to help migrate software from CP/M to MS-DOS.
Now there's an interesting notion! A programmer actually did something to
help out the struggling programmers who were going to have to live with his
work. How thoughtful! You don't suppose those not-so-glaring similarities
between the two OSes might actually exist because the authors thought it
wise to make the environment familiar, or at least familiar-looking?
That's entirely possible.
This whole conversation started because you stated you didn't see the
similarities between CP/M and MS-DOS. I came along and said that the
similarities were lower down, below the user stuff. I never said that was
good or bad. Other people seem to think Microsoft (or Tim Paterson) was
wrong for doing what they did. I don't.
-spc
(Amature computer historian ... )
that's AMATEUR, you HISTORIAN
Thank you.
With few exceptions, this tracks the PBS "revenge of the nerds" account of
these events pretty closely. The thinly veiled truth must be in there
somewhere.
Well, I never saw the PBS ``Revenge of the Nerds,'' but I have read several
books (and countless magazine articles and reference materials), and I'm
interested to hear what you think the truth is.
-spc (AmaTEUR computer historian then ... )
[1] For instance, a friend of mine is writing a program to automatically
restart programs if they stop running. Under Unix, this is already
provided by the init program, yet it's something that's lacking
under Windows NT (which can run as a server, mind you). So he's
basically writing a clone of init for NT. The problem is that he has
to restart not only programs but NT services as well, and that requires
an entirely different set of API calls than starting and stopping
programs (not to mention that you can get a signal when a program
dies, yet you have to poll to see if a service is dead).
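To give you an idea, the service half of his program boils down to a
polling loop like this sketch (SVC_NAME is hypothetical, and all error
handling is omitted):

#include <windows.h>

#define SVC_NAME "SomeService"     /* hypothetical service name */

int main(void)
{
  SC_HANDLE scm = OpenSCManager(NULL, NULL, SC_MANAGER_CONNECT);
  SC_HANDLE svc = OpenService(scm, SVC_NAME,
                              SERVICE_QUERY_STATUS | SERVICE_START);
  SERVICE_STATUS status;

  for (;;)
  {
    /* No notification when a service dies, so poll it. */
    if (QueryServiceStatus(svc, &status)
        && status.dwCurrentState == SERVICE_STOPPED)
      StartService(svc, 0, NULL);  /* bring it back up */
    Sleep(5000);
  }
}

Compare that to just catching SIGCHLD under Unix and you can see why he
grumbles.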
The project he's on is a complete disaster as the manager went for a
Microsoft solution using slews of programs communicating via COM,
DCOM, OLE, and the rest of the Microsoft alphabet soup. A year
later it still doesn't work, and my friend has basically told the
manager it has to be scrapped and done from scratch, preferably
using something other than Microsoft (although my friend might have
a slight bias).
[2] In C, with random indentation, C++-style comments (which his C
compiler happened to accept), and race conditions that would cause it
to fail under certain circumstances.
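The races were of the classic check-then-use variety. In spirit (this
is not his actual code):

#include <stdio.h>
#include <unistd.h>

/* Check-then-use race: between access() and fopen() another process
   can remove the file or yank its permissions, so the check proves
   nothing by the time the file is opened.  */
void log_line(const char *path, const char *msg)
{
  if (access(path, W_OK) == 0)      /* the window opens here...   */
  {
    FILE *fp = fopen(path, "a");    /* ...and may already be gone */
    fprintf(fp, "%s\n", msg);       /* fp can be NULL: boom       */
    fclose(fp);
  }
}

The fix is to drop the access() call and test fopen()'s return value
instead.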