On Mon, Mar 16, 2015 at 8:20 AM, John Foust <jfoust at threedee.com> wrote:
> I'm trying to understand at a low level how some early computers
> and game consoles generated a non-standard form of NTSC.
There were several non-standard aspects of the Apple II "NTSC" video
signal which made it not actually NTSC-compliant. It was mostly a
matter of timing. The most significant deviations were the lack of
interlace (60 non-interlaced frames per second, rather than 30 frames
per second, each composed of two interlaced fields), and that each
horizontal scan
line was 228 color carrier cycles long rather than 227.5, which was
done so that the color carrier phase relative to horizontal timing was
the same on all scan lines. Early revisions of the Apple II were also
missing the serrations in the vertical interval.
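The half-cycle difference is easy to check with a bit of arithmetic.
The subcarrier value below is the published NTSC constant (315/88 MHz);
the 228-cycle figure is the Apple II one stated above:

```python
# NTSC color subcarrier is defined as 315/88 MHz (~3.579545 MHz).
F_SC = 315e6 / 88

# Standard NTSC: 227.5 subcarrier cycles per scan line.
ntsc_line_rate = F_SC / 227.5    # ~15734.27 Hz
# Apple II: 228 subcarrier cycles per scan line.
apple_line_rate = F_SC / 228     # ~15699.76 Hz

print(f"NTSC line rate:     {ntsc_line_rate:.2f} Hz")
print(f"Apple II line rate: {apple_line_rate:.2f} Hz")

# The extra half cycle per line means the subcarrier phase is identical
# on every Apple II scan line, instead of alternating by 180 degrees
# from line to line as standard NTSC requires.
```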
Some people claim that the way color was generated in the Apple II was
somehow "fake" and refer to it as "artifacts" that somehow
"trick" the
television or monitor, but in actuality it was just a clever way of
having the hardware produce various signal amplitudes and phases, just
as "real" color does. It was clever enough to be patented, but it's
not in any way "wrong" or "fake".
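One way to see that this is ordinary color encoding rather than a trick
is to treat a repeating pixel bit pattern as a signal and measure its
component at the subcarrier frequency. The sketch below is illustrative
only: it assumes four pixel samples per subcarrier cycle and is a
textbook single-bin DFT, not the Apple II circuitry:

```python
import cmath
import math

def chroma_of_pattern(bits):
    """Single-bin DFT at the subcarrier frequency for a repeating bit
    pattern, assuming len(bits) samples span exactly one subcarrier
    cycle. Returns (amplitude, phase_in_degrees)."""
    n = len(bits)
    # Correlate the pattern with one cycle of a complex subcarrier.
    acc = sum(b * cmath.exp(-2j * math.pi * k / n)
              for k, b in enumerate(bits))
    acc *= 2 / n
    return abs(acc), math.degrees(cmath.phase(acc)) % 360

# Four hypothetical 4-bit patterns, one subcarrier cycle each. Shifting
# the pattern by one sample leaves the amplitude alone and rotates the
# phase in 90-degree steps -- i.e. same saturation, different hue,
# exactly what a color demodulator sees.
for pat in [(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)]:
    amp, ph = chroma_of_pattern(pat)
    print(pat, f"amplitude={amp:.3f} phase={ph:6.1f} deg")
```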
When other devices have an issue with Apple II video as a source, and
don't reproduce the colors properly, in my experience it is usually
the 228 vs. 227.5 color carrier cycles per scan line that is at fault.
In the early 1980s, Video Associate Labs sold a product for the Apple
II called the VB3 Microkeyer. This consisted of a long slot 7 card,
and a larger card that installed atop the power supply, connected by a
ribbon cable. You had to pull about a dozen chips out of the Apple II
motherboard (thus it would only work with the II and II+, but not the
IIe or IIgs), and install in their place ribbon cables to the VB3
boards. The results were:
1) The Apple II would generate fully-compliant broadcast-quality NTSC
output. (IIRC, there was a way to selectably disable interlace, which
would break NTSC compliance.)
2) Proc amp and genlock: The Apple II video could be overlaid on an
external video input. This could be done selectively (keying), and it
was also possible to do software-controlled "wipes" between two video
inputs, using the Apple video bitmap as the input selector.
3) A special "linear" high-res graphics mode was added, in which the
addressing of the frame buffer was linear rather than the normal
Apple II interleaved addressing. This made it easier to program
keying and wipes, but it could be used for other purposes as well.
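For reference, the standard interleaved hi-res row mapping that the
VB3's linear mode bypassed is well documented for the Apple II; a
minimal sketch (the linear mapping shown alongside it is hypothetical,
since the VB3's actual layout isn't documented in this message):

```python
def hires_row_addr(y, base=0x2000):
    """Base address of Apple II hi-res scan line y (0-191) in the
    standard interleaved layout: consecutive lines within a group of
    8 step by 0x400, groups of 8 step by 0x80, and each third of the
    screen steps by 0x28."""
    return base + 0x400 * (y % 8) + 0x80 * ((y // 8) % 8) + 0x28 * (y // 64)

def linear_row_addr(y, base=0x2000, stride=0x28):
    """Hypothetical linear mapping of the kind the VB3 provided:
    each scan line simply follows the previous one."""
    return base + stride * y

# Line 1 is 0x400 above line 0 in the interleaved map, not adjacent.
print(hex(hires_row_addr(0)), hex(hires_row_addr(1)), hex(hires_row_addr(64)))
```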
I used and programmed one of these in the instructional television
studio of a community college. I wrote some "wipe" programs for it,
and licensed them to Video Associate Labs in exchange for a VB3 of my
own. Alas, I no longer have it. :-(
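The idea behind those wipe programs can be sketched in a few lines:
each bit of the Apple video bitmap selects one of the two video inputs,
and a wipe is just an animation that sweeps the bitmap from all-clear
to all-set. The names and frame representation below are illustrative,
not the VB3's actual interface:

```python
def key_frame(select, src_a, src_b):
    """Per-pixel keying: a set bit in the select bitmap shows src_a,
    a clear bit shows src_b."""
    return [[a if bit else b for bit, a, b in zip(sel, row_a, row_b)]
            for sel, row_a, row_b in zip(select, src_a, src_b)]

def horizontal_wipe_bitmap(width, height, edge):
    """Select bitmap for a left-to-right wipe: columns left of `edge`
    show source A. Sweeping `edge` from 0 to `width` over successive
    frames performs the wipe."""
    return [[1 if x < edge else 0 for x in range(width)]
            for _ in range(height)]

# One frame of a wipe, halfway across a tiny 4x2 "screen".
bm = horizontal_wipe_bitmap(4, 2, 2)
frame = key_frame(bm, [["A"] * 4] * 2, [["B"] * 4] * 2)
print(frame)   # two rows of A, A, B, B
```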
Years later Apple introduced a Video Overlay card, which did produce
broadcast-quality NTSC output, and would work in the II, II+, IIe, and
IIgs. It supported all of the IIgs graphics modes, even when used in
an earlier model computer.