On 2015-Mar-16, at 7:20 AM, John Foust wrote:
I'm trying to understand, at a low level, how some early computers and
game consoles generated a non-standard form of NTSC. The Wikipedia page
http://en.wikipedia.org/wiki/Low-definition_television says:
"Older video game consoles and home computers generated a nonstandard NTSC
or PAL signal which sent a single field type which prevented fields from
interlacing. This is equivalent to 240p and 288p respectively, and was
used due to requiring less resources and producing a progressive
and stable signal."
Another source says this was true for the "NTSC Atari 2600, Apple II
family, Commodore 64, Nintendo Entertainment System, Sega Master
System, and the vast majority of games for NTSC Genesis, Super NES,
PlayStation, and Nintendo 64."
This page http://www.hdretrovision.com/240p/ calls it a "special timing
signal" and gives examples of how contemporary flat-panel TVs can
misinterpret the old signal. The issue has spawned the creation of
dozens of devices to recreate the retro look on new TVs.
If you're just referring to the interlaced vs. 'progressive' scan issue, it
comes down to the relationship between the vertical and horizontal scan rates.
The standard RS-170/NTSC sync-pulse frequencies of V=60Hz and H=15,750Hz [*] give an H/V
ratio of 262.5, which of course is the number of lines scanned during the
vertical scan period. The half fraction results in the vertical sync
'interrupting' the last line of half the fields halfway through the line, and the
first line of the other half of the fields starting halfway through the line, so the
fields are vertically offset slightly and thus interlaced.
(Multiplying 262.5 by 2 gives the proper 525 NTSC lines per frame (and of course notice
it's an odd number)).
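
To make the arithmetic concrete, here's a trivial Python sketch (mine, not
from any spec document):

    def lines_per_field(h_hz, v_hz):
        # Lines traced during one vertical period = H rate / V rate.
        return h_hz / v_hz

    print(lines_per_field(15_750, 60))   # 262.5 lines per field
    print(262.5 * 2)                     # 525.0 lines per frame (an odd number)
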
When one looks at the V and H sync pulses in relation to each other, they
alternate in relative phase from field to field.
In the properly-implemented standard, that alternating phase relationship necessitated
equalizing pulses in the signal at twice the H rate around the V sync/retrace period to
keep the old analog-implemented sync-separators happy.
If the H/V ratio is adjusted slightly off the standard to an integer relation, for
example by setting the H scan rate to 15,720Hz to give exactly 262 lines, the H & V sync
pulses are always in the same phase relationship, and there's no interlacing.
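
Here's a small Python illustration of that phase behaviour (again just a
sketch; it models only where V sync lands within a scan line, not the full
waveform):

    def vsync_phases(lines_per_field, fields=4):
        # Fractional position within the current scan line at which
        # each successive vertical sync occurs.
        return [(f * lines_per_field) % 1.0 for f in range(fields)]

    print(vsync_phases(262.5))  # [0.0, 0.5, 0.0, 0.5] alternating -> interlace
    print(vsync_phases(262.0))  # [0.0, 0.0, 0.0, 0.0] constant -> no interlace
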
The altered scan rates were still within the sync-lock range of the analog TVs of the day,
perhaps needing a slight tweaking of the horizontal-hold control.
I was trying to find what the -exact- frequencies are for some of those early
computer/game systems but nobody seems to want to readily present them on the interwebs.
[*] In NTSC, the rates were adjusted slightly from the original B&W spec with the
introduction of colour, to 59.94Hz & 15,734Hz; IIRC, to keep the colour information
in the signal from producing visible artifacts on existing B&W TVs.
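
For what it's worth, the exact colour figures fall out of the 4.5MHz sound
intercarrier; a quick Python check (the standard derivation, not specific to
any particular machine):

    SOUND_INTERCARRIER = 4_500_000       # Hz

    f_h = SOUND_INTERCARRIER / 286       # ~15,734.27 Hz horizontal rate
    f_v = f_h / 262.5                    # ~59.94 Hz vertical (field) rate
    f_sc = f_h * 455 / 2                 # ~3.579545 MHz colour subcarrier

    print(f_h, f_v, f_sc)
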