Arfon Gryffydd <arfonrg(a)texas.net> wrote:
> Well, within a cable signal, it's analog (you can therefore relate that
> to a parallel digital signal), and it's also multiplexed using different
> frequencies... So, I assume that the answer is, gigabit ethernet uses
> some local oscillators and modulates a butt-ton of frequencies using
> many parallel bits. That would be the closest correlation to how cable
> TV works.
No. What you describe is a broadband network. AFAIK, gigabit ethernet over
copper still uses baseband techniques.
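If it helps to see the distinction concretely, here's a rough Python sketch
(purely illustrative, not how any real PHY is built) of the same bit pattern
sent as a simple baseband line code versus modulated onto an RF carrier the
way a cable-TV channel is. The levels, sample counts, and carrier frequency
are all made-up numbers:

# Illustrative only: contrast a baseband signal, where the bits set the
# line voltage directly, with a broadband signal, where the same bits
# modulate an RF carrier so many channels could share the cable at
# different carrier frequencies.  All numbers are arbitrary.

import numpy as np

bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])

samples_per_bit = 100
t = np.arange(len(bits) * samples_per_bit) / samples_per_bit  # time in bit periods

# Baseband: a two-level line code, +1 V for a 1, -1 V for a 0.
# The signal's energy sits down near DC, hence "baseband".
baseband = np.repeat(2 * bits - 1, samples_per_bit).astype(float)

# Broadband: the same bits switch a carrier (10 cycles per bit period)
# on and off, so the energy sits in a band around the carrier frequency
# (frequency-division multiplexing is then a matter of picking different
# carriers for different channels, as cable TV does).
carrier_cycles_per_bit = 10
carrier = np.cos(2 * np.pi * carrier_cycles_per_bit * t)
broadband = np.repeat(bits, samples_per_bit) * carrier

print("baseband levels used:", sorted(set(baseband)))
print("broadband peak amplitude:", broadband.max())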
However, it's not accurate to characterize the difference between baseband and
broadband as digital vs. analog. A high-frequency signal on a cable is
*always* an analog signal. Cables won't pass digital signals. If you put in
a square wave at one end, you don't get a square wave at the other. For
sufficiently low frequencies on suitably terminated cable, you may get a
good enough approximation of a square wave. But this doesn't scale up
to very high frequencies. So usually the waveform driven into the cable
(and expected at the other end) is a carefully engineered analog signal.
It is the job of the transceiver to generate that analog signal at the
transmitting end, and recover the digital data at the receiving end.
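To put a number on the square-wave point, here's a back-of-the-envelope
Python simulation that treats the cable as a first-order low-pass channel.
That is a crude stand-in for real cable behavior, and the sample rate,
cutoff, and test frequencies are arbitrary choices for illustration:

# Drive square waves at two frequencies through a simple first-order
# low-pass "cable" model and compare how much of the swing survives.

import numpy as np

fs = 1_000_000.0          # sample rate, Hz
cutoff = 10_000.0         # channel "bandwidth", Hz (arbitrary)
t = np.arange(0, 0.01, 1 / fs)

def square_wave(freq_hz):
    return np.sign(np.sin(2 * np.pi * freq_hz * t))

def lowpass(x, cutoff_hz):
    # Discrete first-order RC low-pass filter.
    rc = 1 / (2 * np.pi * cutoff_hz)
    alpha = (1 / fs) / (rc + 1 / fs)
    y = np.zeros_like(x)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
    return y

for f in (500.0, 50_000.0):   # well below vs. well above the cutoff
    out = lowpass(square_wave(f), cutoff)
    # The input swings the full +/-1; heavy filtering shrinks the swing
    # and rounds the edges until the output is no longer square-ish.
    print(f"{f:>8.0f} Hz square wave: output swing = {out.max() - out.min():.2f}")

Well below the cutoff the output still looks roughly square; well above it,
the edges smear into each other, which is exactly why the transceiver
engineers the waveform rather than just slamming square edges onto the wire.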