Eric Smith wrote:
There was an amazing hack for the original II/II+ to generate
broadcast-quality NTSC video.
Jim wrote:
I read this as an oxymoron -- since the color generated by the Apple II
was due to composite artifacting, and "broadcast quality" video
attempts to remove as much artifacting as possible...
You make it sound like there's some "magic" to artifacts, and that the
computer "tricks" the monitor. That's not how it works. The reason
people call color encoding on the Apple "artifacting" is that it
was cleverly designed to take advantage of digital signals at the
color burst frequency. That doesn't make the resulting waveform have
any less "colorness" to it; the monitor is correctly interpreting it
per the NTSC spec.
For instance, people talk about how when you put a green pixel next
to a purple pixel in an HGR mode, you get white as an "artifact". But
that's because the NTSC signal that you get out for that pixel pair
is essentially DC, so there is no color component. Naturally the
monitor must interpret this as white. It's not the monitor being fooled,
it's the naive programmer.
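
Here's a quick toy sketch of that in Python (a deliberate simplification,
with assumed timing: one subcarrier period modeled as 4 samples of the
14.318 MHz master clock, each HGR pixel 2 samples wide at the ~7.16 MHz
dot clock, and "purple"/"green" taken as the two opposite pixel phases):

import cmath

def chroma(samples):
    """Complex amplitude at the color subcarrier (DFT bin 1) over one period."""
    n = len(samples)
    return sum(s * cmath.exp(-2j * cmath.pi * k / n)
               for k, s in enumerate(samples))

patterns = {
    "purple pixel only": [1, 1, 0, 0],  # pixel on during one half of the cycle
    "green pixel only":  [0, 0, 1, 1],  # pixel on during the opposite half
    "green + purple":    [1, 1, 1, 1],  # both on -> flat level across the cycle
    "both off":          [0, 0, 0, 0],  # flat level again, just lower luma
}

for name, bits in patterns.items():
    c = chroma(bits)
    if abs(c) > 1e-9:
        print(f"{name:18s} chroma {abs(c):.2f} at "
              f"{cmath.phase(c) * 180 / cmath.pi:6.1f} deg")
    else:
        print(f"{name:18s} no chroma -> decodes as plain luma (white or black)")

The two single-pixel cases come out with equal chroma magnitude and phases
180 degrees apart (two saturated hues); the side-by-side pair has no
component at the subcarrier frequency, so the monitor quite correctly
shows white.
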
The main part of the NTSC spec that the Apple II doesn't meet is the
relationship of color carrier timing to horizontal timing, which is
supposed to be exactly 227.5:1, but on the Apple II was deliberately
made 228:1 instead, so that the color carrier phase is the same on
each line rather than alternating by 180 degrees. The Apple
has some other minor timing deviations, and the main oscillator usually
isn't adjusted accurately enough to meet all the other specs (e.g., the FCC
spec on the color carrier of +/- 10 Hz, a tolerance tighter than 3 ppm). But
no other consumer gear normally meets that spec either.
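
For the curious, the arithmetic looks like this (a rough sketch using the
usual Apple II figures of 65 CPU cycles per scan line, with one "long"
cycle stretched by two 14M clocks):

# NTSC color subcarrier, exactly 315/88 MHz
FSC = 315e6 / 88

# Broadcast NTSC: the line rate is defined as the subcarrier divided by 227.5
ntsc_line = FSC / 227.5
print(f"NTSC line rate {ntsc_line:.2f} Hz, ratio {FSC / ntsc_line:.1f} : 1")

# Apple II: 65 CPU cycles per line, 64 of 14 master clocks plus one of 16,
# i.e. 912 clocks of 14.318 MHz per line; the subcarrier is that clock / 4.
master_per_line = 65 * 14 + 2
apple_ratio = master_per_line / 4
print(f"Apple II line rate {FSC / apple_ratio:.2f} Hz, ratio {apple_ratio:.1f} : 1")

# The FCC's +/- 10 Hz tolerance on the subcarrier, expressed in ppm
print(f"+/- 10 Hz on {FSC / 1e6:.6f} MHz = {10 / FSC * 1e6:.1f} ppm")
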
Anyhow, the color encoding on the Apple II works fine. That's not
the problem.
What did the graphics look like?
They looked exactly like they would on a normal Apple II. They just
had correct timing.
What exactly did the board do to the color graphics output?
It rendered it as NTSC-compliant output.
Eric