It used to be the case that TVs synced to the mains
frequency but I don't
Was it?
A long time ago...
OK, I am sceptical. What do you mean by a 'long time ago'?
Until the coming of the National Grid, different areas would be supplied
with mains at slightly different frequencies (even assuming it was
supposed to be nominally 50Hz), and there would be no phase relationship
at all. So using that for any form of TV synchronisation is a non-starter.
So that really limits us to post-WW2 TVs. The oldest book of schematics I
have is dated 1953, but it contains models going back to 1948. Not one
attempts to derive the vertical sync signal from the mains as far as I
can see.
I'm not sure why I'm getting into this, seeing as I know little about it.
However...
Perhaps studio sync sources were derived from mains power at one time purely
as a method of picking a reference to use so that there would be minimal
picture roll when transmission output was switched between different sources?
Maybe this was found to be counterproductive when large changes in load caused
momentary variations in frequency, even though the electricity producers worked
to average these out in the longer term?
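(To put a rough number on that roll, using my own illustrative figures rather
than anything from the period: a source locked to a sagging mains supply slips
against a nominal 50Hz source at a rate equal to the frequency difference.)

```python
# Sketch with assumed numbers: how fast a picture would roll if the field
# rate were locked to a mains supply drifting from the nominal 50 Hz.
NOMINAL_FIELD_RATE_HZ = 50.0

def roll_rate_fields_per_second(actual_mains_hz: float) -> float:
    """Fields of vertical slip per second against a nominal 50 Hz reference."""
    return abs(actual_mains_hz - NOMINAL_FIELD_RATE_HZ)

# A 0.1 Hz sag under heavy load slips one full field every 10 seconds,
# i.e. a visible slow roll between mains-locked and nominal-locked sources.
slip = roll_rate_fields_per_second(49.9)
seconds_per_field_of_roll = 1.0 / slip
```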
(I don't have any guesses as to what might have been used for a line sync
source.)
If a three-phase grid system is in use, with individual receivers normally
connected to single-phase supplies derived from it and having no choice as to
which phase they end up being powered by, it would seem rather more difficult
to use the power supply waveform as a sync source for reception than to use
the sync pulses present in the transmission.
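(A back-of-the-envelope sketch of the phase problem, with my own figures: two
houses on adjacent phases of a three-phase 50Hz supply see their mains
waveforms offset by 120 degrees, which is a third of the 20ms cycle.)

```python
# Sketch with assumed numbers: zero-crossing offset between receivers
# powered from adjacent phases of a three-phase 50 Hz supply.
MAINS_HZ = 50.0
CYCLE_S = 1.0 / MAINS_HZ  # 20 ms mains cycle (= field period on a 50 Hz system)

# Adjacent phases are 120 degrees apart, so their waveforms are offset
# by a third of a cycle.
phase_offset_s = (120.0 / 360.0) * CYCLE_S  # about 6.67 ms

# As a fraction of the field period: a mains-locked vertical sync would
# land a third of a field apart from house to house.
fraction_of_field = phase_offset_s / CYCLE_S
```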
Regards,
Peter Coghlan.