> I'm puzzled as to how one could drive both interlaced and non-interlaced
> monitors off the same video signal - wouldn't the interlaced one need a
> video signal which has 'odd lines, then a vertical retrace, then even
> lines, then a vertical retrace'?
So to sort of answer my own question, interlaced and non-interlaced video
signals are indeed different.
It turns out that 1024x768 was defined by IBM as XGA, and it was originally an
interlaced format - although a non-interlaced version was done later. So my
laptop quite possibly really is producing interlaced video...
Although how a monitor is supposed to tell whether a signal is interlaced, or
non-interlaced, is not clear - there's certainly no pin on the VGA connector
which says so! :-)
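As far as I can tell, the distinction lives entirely in the sync timing rather
than on any connector pin. Purely as an illustration - none of this is from
the manuals, and the 768-line figure below ignores the blanking lines - here's
a little Python sketch of how one frame gets split into two fields:

VISIBLE_LINES = 768   # visible lines per full frame (blanking lines ignored)

def split_into_fields(frame_lines):
    """Return the line numbers carried by each field of an interlaced frame."""
    odd_field = list(range(1, frame_lines + 1, 2))    # lines 1, 3, 5, ...
    even_field = list(range(2, frame_lines + 1, 2))   # lines 2, 4, 6, ...
    return odd_field, even_field

odd, even = split_into_fields(VISIBLE_LINES)
print("odd field :", len(odd), "lines, e.g.", odd[:3])
print("even field:", len(even), "lines, e.g.", even[:3])

# Each field ends with its own vertical retrace, so the sequence really is
# 'odd lines, retrace, even lines, retrace'.  On an analogue CRT the even
# field's vertical sync arrives half a scanline later than the odd field's,
# which shifts that field down by half a line so the beam fills in the gaps -
# there's no separate "interlaced" pin needed to signal it.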
Anyway, which of those numbers is the one to look at when considering whether
the refresh rate is so high that it might be dangerous to an old CRT monitor?
E.g. my HP M50 manual says "Setting the screen resolution/refresh rate
combination higher than 1024x768 at 60 Hz can damage the display."
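For what it's worth, my understanding is that the figure an old fixed-frequency
CRT really cares about is the horizontal scan rate, which is roughly the
vertical rate times the total line count (visible plus blanking) - and
interlacing halves it, since each field only carries half the lines. (The
quoted "refresh rate" for an interlaced mode is normally the field rate; the
full frame only repeats at half that.) A rough Python sketch, using line
totals that are approximate standard figures rather than anything I've
measured:

def horizontal_khz(vertical_hz, total_lines, interlaced=False):
    """Estimate the horizontal scan rate in kHz for a given video mode."""
    lines_per_vertical_sweep = total_lines / 2 if interlaced else total_lines
    return vertical_hz * lines_per_vertical_sweep / 1000.0

# 1024x768 non-interlaced at 60 Hz, assuming ~806 total lines: ~48 kHz
print(horizontal_khz(60, 806))

# The old 87 Hz (field rate) interlaced 1024x768 mode, assuming ~817 total
# lines per frame: ~35.5 kHz, i.e. gentler on the monitor than 60 Hz
# non-interlaced despite the bigger-looking number.
print(horizontal_khz(87, 817, interlaced=True))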
Since the monitor I'm using is called an "Ultra VGA 1024", I'm going to assume
it can handle 1024x768, and just stop worrying about it... If it melts down
the monitor, it melts down the monitor! :-)
> From: Jochen Kunz
>
> Sounds like an interlaced video mode. No surprise that the LCD can't do
> this.
Yes, as soon as I realized it probably really was interlaced video, it became
obvious why none of my LCD monitors would display it.
Noel