Old NTSC tricks: 240p?
Eric Smith
spacewar at gmail.com
Tue Mar 17 16:28:43 CDT 2015
On Tue, Mar 17, 2015 at 1:06 PM, Brent Hilpert <hilpert at cs.ubc.ca> wrote:
> [*] In NTSC, the rates were adjusted slightly from the original B&W spec with the introduction of colour, to, 59.94 & 15,734; IIRC, to deal with the colour information in the signal displaying artifacts when displayed on existing B&W TVs.
No, the 1000/1001 change doesn't in any way prevent that. Instead,
televisions use filters to remove the color carrier from the
luminance. Originally this was done with low-pass filters, which
significantly reduced the available luminance bandwidth, and thus
horizontal luminance resolution. Starting in the 1980s (or maybe late
1970s?) comb filters were used that could better separate the luma and
chroma with less degradation of the horizontal luma resolution, though
the separation is imperfect, so there are still some artifacts such as dot crawl.
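To illustrate the principle (a minimal sketch, not the circuitry of any
particular set): a simple 1H comb filter relies on the fact that the NTSC
color subcarrier is an odd multiple of half the line rate (455/2), so the
chroma inverts phase from one scan line to the next. Averaging two adjacent
lines cancels the chroma and leaves luma; differencing them leaves chroma.
The function name and array handling below are just illustrative.

    import numpy as np

    def comb_filter_1h(prev_line, cur_line):
        """Separate luma and chroma from two adjacent scan lines of
        composite video. Works because NTSC chroma inverts phase on
        successive lines; separation degrades where the picture content
        differs between lines (hence residual artifacts like dot crawl)."""
        prev_line = np.asarray(prev_line, dtype=float)
        cur_line = np.asarray(cur_line, dtype=float)
        luma = (cur_line + prev_line) / 2.0    # chroma cancels
        chroma = (cur_line - prev_line) / 2.0  # luma cancels where lines match
        return luma, chroma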
What the 1000/1001 sync rate change did was reduce the interference
between the color information and the 4.5 MHz sound subcarrier used on
broadcast. The interference actually was not a problem for the transmitting
and receiving equipment of the day, but became an issue years later
with comb filters. The NTSC committee requested that the FCC change
the audio subcarrier frequency by 1001/1000, which would not have
caused any problem whatsoever, but for whatever inscrutable
bureaucratic reason the FCC didn't want to do that, so they changed
EVERYTHING ELSE by 1000/1001 instead, causing all manner of annoying
issues, such as a frame rate error when converting film by telecine,
and the abomination of drop-frame time code.
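To make the arithmetic concrete (a sketch I've added using the standard NTSC
numbers, not figures from the original post): the design goal was to make the
4.5 MHz sound intercarrier an exact multiple (286) of the line rate, with the
color subcarrier at 455/2 times the line rate. The committee's request would
have raised the sound carrier by 1001/1000 to 4.5045 MHz and left the 15,750 Hz
/ 60 Hz sync rates alone; the FCC's choice instead scaled the sync rates by
1000/1001, which works out to exactly the same ratios.

    from fractions import Fraction

    LINE_RATE_BW  = Fraction(15750)      # original B&W line rate, Hz
    FIELD_RATE_BW = Fraction(60)         # original B&W field rate, Hz
    SOUND_CARRIER = Fraction(4_500_000)  # broadcast sound intercarrier, Hz

    # Option requested of the FCC: move the sound carrier up by 1001/1000.
    sound_scaled = SOUND_CARRIER * Fraction(1001, 1000)
    assert sound_scaled == 286 * LINE_RATE_BW   # 4.5045 MHz = 286 * 15,750 Hz

    # What the FCC actually did: keep 4.5 MHz, scale the sync rates instead.
    line_rate_color   = SOUND_CARRIER / 286                 # ~15,734.266 Hz
    field_rate_color  = line_rate_color / Fraction(525, 2)  # 262.5 lines/field
    chroma_subcarrier = line_rate_color * Fraction(455, 2)  # ~3.579545 MHz

    assert line_rate_color  == LINE_RATE_BW  * Fraction(1000, 1001)
    assert field_rate_color == FIELD_RATE_BW * Fraction(1000, 1001)

    print(float(line_rate_color))    # 15734.2657... Hz
    print(float(field_rate_color))   # 59.9400599... Hz
    print(float(chroma_subcarrier))  # 3579545.4545... Hz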