Let me push
the question a bit more....
-How do they squeeze that much data down the line? Fiber-optic cabling
only? Data multiplexing?
What about trying to think? How do they squeeze the 160+ MBit needed
for a simple TV channel into a cable (~512x625x25x16) - and they put not
only one, but rather some dozens of _uncompressed_ channels into one
cable, one simple copper cable as installed in hundreds of millions of
places, and they have been doing it for 40+ years.... "Much data"? Come
on - if you asked for 1,000 GBit connections, maybe :)
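
Just to put numbers on that product (the line count and frame rate are
PAL; the 512 samples and 16 bits are my reading of the figure above):

samples_per_line = 512   # horizontal samples per line (assumed)
lines_per_frame  = 625   # PAL line count
frames_per_sec   = 25    # PAL frame rate
bits_per_sample  = 16    # assumed sample depth

bitrate = samples_per_line * lines_per_frame * frames_per_sec * bits_per_sample
print(f"{bitrate / 1e6:.0f} Mbit/s per uncompressed channel")  # -> 128 Mbit/s

# Note: the product as written gives 128 Mbit/s; the quoted 160+ MBit
# would correspond to ~640 samples per line (640x625x25x16 = 160 Mbit/s).
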
Well, a cable signal is analog (you can therefore relate it to a
parallel digital signal), and it's also multiplexed using different
frequencies...
So, I assume that the answer is: gigabit ethernet uses some local
oscillators and modulates a butt-ton of frequencies carrying many
parallel bits. That would be the closest analogy to how cable TV works.
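
For what it's worth, 1000BASE-T is usually described less like cable TV
carriers and more like parallel symbols: all four twisted pairs at a
modest symbol rate, with several signal levels per symbol. The
arithmetic (PAM-5 carrying an effective 2 data bits per symbol is the
textbook figure; the rest is just multiplication):

pairs           = 4      # all four twisted pairs, used in both directions
symbol_rate     = 125e6  # baud per pair, same symbol rate as 100BASE-TX
bits_per_symbol = 2      # PAM-5: 5 levels, 2 data bits plus coding redundancy

throughput = pairs * symbol_rate * bits_per_symbol
print(f"{throughput / 1e6:.0f} Mbit/s")  # -> 1000 Mbit/s
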
> -How do they discriminate the beginning of one packet from the end of
> another when running at such high speeds?
Well, I was assuming that gigabit ethernet was a single serial connection
and that the packet header detector has to operate at frequencies several
multiples faster than the incoming data stream to detect new packets, and
then send a complete packet to a processor for address resolution. But I
take it this is not the case. So, how do they do it?
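
As far as I know, there is no header detector running faster than the
line at all: the receiver recovers the bit clock from the incoming
signal itself, and each frame announces itself with a fixed preamble
(seven 0x55 bytes) followed by a start-frame delimiter (0xD5), with an
enforced idle gap between frames. A toy sketch of that delineation at
the byte level - the function name and test stream are mine, not from
any standard:

PREAMBLE_BYTE = 0x55
SFD           = 0xD5  # start-frame delimiter

def find_frame_starts(stream: bytes, min_preamble: int = 2) -> list[int]:
    """Return offsets of the first frame byte after each preamble+SFD."""
    starts = []
    run = 0  # length of the current run of preamble bytes
    for i, b in enumerate(stream):
        if b == PREAMBLE_BYTE:
            run += 1
        elif b == SFD and run >= min_preamble:
            starts.append(i + 1)  # the frame begins right after the SFD
            run = 0
        else:
            run = 0
    return starts

stream = bytes([0x55] * 7 + [0xD5]) + b"payload..."
print(find_frame_starts(stream))  # -> [8]
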
How can they separate even single bytes from each other at the ridiculous
speed of 300 Bd - even more, how do they separate two bits?
Come on - it's the same water as ever - and it still boils at the same
pressure/temperature combination.
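
And the boring answer at 300 Bd: both ends agree on the bit clock in
advance, so the receiver only has to spot the start edge and then sample
the middle of each bit cell. A toy of that timing arithmetic (pure
illustration, not a real UART):

baud = 300
bit_time = 1.0 / baud  # ~3.33 ms per bit cell at 300 Bd

def sample_times(start_edge: float, nbits: int = 8) -> list[float]:
    # the first data bit starts one bit time after the start bit's edge;
    # sample each cell at its center (offsets 1.5, 2.5, ... bit times)
    return [start_edge + bit_time * (1.5 + n) for n in range(nbits)]

print([f"{t * 1000:.2f} ms" for t in sample_times(0.0)])
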
Regards
H.
--
I think, therefore I am, so all is good
HRK