That's nuts.. so not only is the monitor relying on the computer to feed it
a sane horizontal pulse, but that pulse rate isn't fixed or even safely bounded in hardware? Just POKEing the wrong memory location
or writing bad data to some software register can set a catastrophically
destructive video mode?
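Purely to get it straight in my own head, here's roughly what I imagine that failure path looks like on the MDA side. This is only a sketch based on the standard MC6845 register map (index port 0x3B4, data port 0x3B5, R0 = Horizontal Total), and it assumes a Microsoft/Watcom-style DOS compiler with outp() in <conio.h>; needless to say, not something to actually run against a live 5151:

  #include <conio.h>          /* outp() on Microsoft/Watcom-style DOS compilers */

  #define CRTC_INDEX 0x3B4    /* MDA 6845 index register */
  #define CRTC_DATA  0x3B5    /* MDA 6845 data register  */

  /* R0 (Horizontal Total) is the line length in character clocks, minus 1.
   * With the dot clock fixed, Hfreq = dot clock / (char width * (R0 + 1)),
   * so a stock MDA (16.257 MHz, 9-dot characters, R0 = 0x61) lands at
   * roughly 18.4 kHz. Halve R0 and the 5151's flyback is asked to run at
   * roughly twice that. */
  void set_horizontal_total(unsigned char chars_minus_1)
  {
      outp(CRTC_INDEX, 0x00);           /* select R0: Horizontal Total */
      outp(CRTC_DATA,  chars_minus_1);  /* one OUT, new line rate      */
  }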
Ok, yeah, I think I see the merits of having a fail-safe system a la the oscillator in the Zenith units we've elsewhere discussed. But of course, this does raise yet another obvious question.. why feed the monitor an H-sync from the computer hardware in the first place?
In the Zenith / Osborne situation, the only possible answer would seem to
be "No need to ever adjust (and as such, no need to even implement) a
manual horizontal hold control". But for the IBM PC / 5151, it's a little
less clear.. was there a perceived or real need to be able to
software-switch the horizontal (and/or vertical) scan rates? Some kind of
text / framebuffer vs. hi-res mode-timing situation as we see in more
modern (SVGA etc) monitors?
FWIW, way back in the bad old days of manually editing /etc/XF86Config, I do recall various warnings that tinkering with modelines, or feeding a bad one to the wrong type of monitor, could result in major damage, and so on. So I guess this sort of thing went on for a very long time, then.. same basic situation?
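For anyone who never had that particular pleasure, a modeline was just a row of raw timing numbers; if memory serves, the standard 640x480@60 entry looked like this:

  Modeline "640x480"  25.175  640 656 752 800  480 490 492 525

The horizontal rate is the pixel clock divided by the horizontal total (25.175 MHz / 800 = ~31.5 kHz), and the vertical rate is that divided by the vertical total (~31.5 kHz / 525 = ~60 Hz), so fat-finger a couple of those numbers and a fixed-frequency monitor gets asked to scan well outside its design range.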
On Sun, Nov 9, 2014 at 10:43 AM, Chuck Guzis <cclist at sydex.com> wrote:
I'll add this to the 5151 issue. Because the 5151 responds to *every* horizontal sync pulse, it's quite possible to toast the flyback transformer with the wrong frequency. I know--I've done it.
Contrast with a simple astable multivibrator which would simply lose synch.
--Chuck