I don't think it's fair (except for the not-washing-the-boards-properly part) to say this was badly engineered. There might have been cost issues that we can only guess about. I worked at a company that made terminals. I asked how come there were only a few bypass caps instead of the traditional one per chip, and the chief engineer (Don Martin of Martin Research, who wrote the famous Microcomputer Design book and definitely knew how to build things) told me that the board would start with them all in. Then they would be removed until the board stopped working. Then they'd put that one back in! Seriously, they would look at the power supply and use just enough, with a little extra margin. Costs add up; same with poly caps, crystals (which used to cost a lot more than they do now), etc. Maybe an engineer would have liked to use a part, but just couldn't get it simply because of supply considerations. Skilled assembly-line labor used to be cheaper, too. Everyone on our assembly line knew how to use a scope, and often using one was part of the assembly process. So at that time it was cheaper to have a tech or assembly-line person tweak a pot than it was to put in a crystal.
It's my understanding that electromagnetically deflected vector displays take very, very high-power deflection coils and drivers, and this is where the real money is in these units. I don't know if the Imlac uses electrostatic or electromagnetic deflection (electromagnetic, I suspect).
I am working on a TTL-based vector display (128x128 dots only) from a 1975-ish BYTE magazine article. I'm using a Tektronix 620 display, which I can get for $15 in "clean, paint, and use" condition from a local surplus place.
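
For anyone curious what driving a display like that looks like from the processor side, here's a minimal sketch in C. The port addresses, register widths, and the unblank strobe are assumptions of mine for illustration, not the actual BYTE design:

    /* Hypothetical memory-mapped interface for a 128x128 point-plot
     * X/Y display: two 7-bit DAC latches plus an unblank strobe.
     * Addresses and layout are invented for this sketch. */
    #include <stdint.h>

    #define XDAC    ((volatile uint8_t *)0xC000)  /* X deflection DAC latch */
    #define YDAC    ((volatile uint8_t *)0xC001)  /* Y deflection DAC latch */
    #define UNBLANK ((volatile uint8_t *)0xC002)  /* any write flashes the dot */

    static void plot_point(uint8_t x, uint8_t y)
    {
        *XDAC = x & 0x7F;   /* let the deflection amps settle on X and Y... */
        *YDAC = y & 0x7F;
        *UNBLANK = 1;       /* ...then brighten the beam briefly */
    }

Everything else (dot timing, settling delays, the refresh loop) lives in the TTL.
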
--- Bob Shannon <bshannon(a)tiac.net> wrote:
Ben Franchuk wrote:
<snip>
> It is not that they did not know better; that is all they had to work with.
Clearly you haven't looked at the Imlac schematics! Engineers designed lots of hardware of the same era without resorting to the kinds of tricks used in the Imlac.
Did you know that each Imlac had to have the RC networks that control its clocks and timing signals tuned by hand, for each unit? The components used were garden-variety ceramic caps with very loose tolerances, not poly or other higher-quality components. The issue here is the quality of the hardware engineering. HP machines of the same period are vastly better engineered, as are many other machines from the late '60s and early '70s.
To suggest that nothing better was available to the engineering team at Imlac is laughable at best.
> Mind you, cost cutting often did not help any computer product. The real problem is the Imlac is a VECTOR display. Finding a new CRT would be a problem with the original design. A raster-scan display design could be used, but then you need to buffer the display correctly to have unrefreshed data fade off the screen.
The vector display has nothing to do with the design quality whatsoever! What makes you say something like this, Ben? For what the Imlac did, when it did it, VECTOR was FAR SUPERIOR to raster displays.
Please note, the Imlac had a 1024 by 1024 addressable display, prior to 1970. This greatly exceeds what was possible with raster graphics at the time, and the Imlac was designed for calligraphic applications, where its short vectors made for a much higher-quality display than a pixelated raster display of the same resolution would have. There is also the fact that manipulating a raster display is far more computationally intensive than manipulating a vector display list. The Imlac CPU would not be well suited for raster graphics at all, but it's more than sufficient for its intended use.
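
To make the display-list point concrete, here's a rough sketch in C; the structures are simplified stand-ins of my own, not the Imlac's actual display-processor instruction format:

    /* Simplified display list: a start position plus a run of short
     * vectors.  Layout is invented for illustration, not the Imlac's
     * real display-processor format. */
    #include <stddef.h>

    struct vec { int dx, dy; int beam_on; };   /* one short-vector step */

    struct display_list {
        int x0, y0;                 /* starting beam position */
        size_t count;
        struct vec v[256];
    };

    /* Moving a whole drawn object is a two-word change to the list; the
     * display processor simply redraws it in the new place on the next
     * refresh pass.  The CPU never touches a single pixel. */
    static void move_object(struct display_list *dl, int new_x, int new_y)
    {
        dl->x0 = new_x;
        dl->y0 = new_y;
    }

On a framebuffer, the same move means erasing and re-rasterizing every pixel the object covers, which is exactly the kind of work an Imlac-class CPU could not spare cycles for.
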
The problems with the Imlac are issues of engineering quality, like the total lack of decoupling capacitors, poor grounding, and poor logic design. This is also reflected in the manufacturing of early units: the failure to wash the etchant off the boards (many Imlac boards now have fuzzy green etches, or no remaining etches at all), poor metal preparation prior to painting, and the fact that the design was very quickly repackaged as the PDS-1D.
The CRT used in the Imlac was common enough in its day, and that same tube was also used in much higher-quality products.
The last comment about the display refresh is hard to understand. The Imlac does not use a storage tube, and it must keep the display refreshed in the same way as any raster graphics display does. I'm not at all sure I understand your point here; can you expand on it?
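
In case a sketch helps, here's a toy model of that refresh: the display list gets walked over and over so the phosphor never fades out. The names and structures are mine, purely for illustration:

    /* Toy model of vector-display refresh.  Stop walking the list and
     * the picture fades, just as a raster screen goes dark if scanout
     * stops.  Structures and names are invented for this sketch. */
    struct seg { int x0, y0, x1, y1; };

    /* Stand-in for the analog vector generator sweeping the beam. */
    static void draw_vector(struct seg s) { (void)s; }

    static void refresh_forever(const struct seg *list, int n)
    {
        for (;;) {                      /* runs as long as the display is up */
            for (int i = 0; i < n; i++)
                draw_vector(list[i]);
            /* a real unit paces each pass against a frame clock; omitted here */
        }
    }
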
If you're suggesting re-implementing the Imlac with a raster display, that is totally impractical. My ReImlac project will use a real vector display, as that is the only way to duplicate the capabilities of the original machine. Remember, the Imlac does not have jagged vectors, even when drawing at any angle. To try to do this on a raster display would require a resolution far, far greater than 1024 by 1024. The anti-aliasing alone would take more logic than the full original machine does.
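
To see why, here's roughly what anti-aliasing a single vector costs on a raster display: a simplified Wu-style line in C (endpoint handling omitted; this is my own sketch, not anything from the Imlac). Every step needs fractional arithmetic and two weighted pixel writes, which is a lot to ask of late-'60s TTL:

    #include <math.h>
    #include <stdint.h>

    #define W 1024
    #define H 1024
    static uint8_t fb[H][W];                 /* 8-bit intensity framebuffer */

    static void plot(int x, int y, double cov)   /* cov = pixel coverage, 0..1 */
    {
        if (x < 0 || x >= W || y < 0 || y >= H) return;
        int v = fb[y][x] + (int)(cov * 255.0 + 0.5);
        fb[y][x] = (uint8_t)(v > 255 ? 255 : v);
    }

    /* Simplified Wu-style anti-aliased line: each step splits the beam's
     * intensity between the two pixels nearest the ideal line. */
    static void aa_line(double x0, double y0, double x1, double y1)
    {
        int steep = fabs(y1 - y0) > fabs(x1 - x0);
        double t;
        if (steep)   { t = x0; x0 = y0; y0 = t;  t = x1; x1 = y1; y1 = t; }
        if (x0 > x1) { t = x0; x0 = x1; x1 = t;  t = y0; y0 = y1; y1 = t; }

        double grad = (x1 == x0) ? 1.0 : (y1 - y0) / (x1 - x0);
        double y = y0;

        for (int x = (int)x0; x <= (int)x1; x++) {
            int yi = (int)floor(y);
            double f = y - yi;                   /* fractional position */
            if (steep) { plot(yi, x, 1.0 - f); plot(yi + 1, x, f); }
            else       { plot(x, yi, 1.0 - f); plot(x, yi + 1, f); }
            y += grad;
        }
    }

And that's one line; a display list is full of them, redrawn every refresh.
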
On the other hand, a Wells Gardner vector monitor is more than able to display the vector video from an Imlac exactly the way a real Imlac does. So will most small oscilloscopes, or even a modified TV monitor. Small vector display monitors are fairly common on eBay at very affordable prices. So what's the problem with a vector display?
To be true to the original, I'm sticking with a true vector display. After all, I'm quite addicted to vector (and point-plot) displays, and this was my main attraction to the Imlac. If this were replaced by a raster display, you might as well run a software emulator and not bother re-implementing the Imlac in hardware at all. The vector display of the Imlac is a thing of beauty and a huge part of what makes an Imlac so unique. To call this "...the real problem..." is heresy!