Eric J Korpela wrote:
A transistor is a
device that you buy because you can't make one yourself without
spending way too much money. A video card is essentially the same
thing. You can make one yourself. Some day we will be forced to in
order to keep our machines running. The main difference is that the
theory of operation of a transistor can be expressed in a one-page
document. The theory of operation of a video card is a small book
that requires references to other books.
The fundamental problem is that since the mid-80s you have rarely been
able to buy complete enough information to duplicate (or, in some cases,
even program) the hardware you are buying.
Once the entire computer became a field-replaceable unit, manufacturers had
no need to release that information any more. All it did was give the competition
an easy way to reverse-engineer the design.
Unfortunately, that's how people like me learned how computers worked: by reading
the field service manuals.
You don't think at that level any more; putting together a gigahertz
computer is done with a screwdriver, a couple of cables, and some screws.
On the other hand, now that people have a handful of common platforms, much more
sophisticated software is being written.