On 01/23/2014 09:47 PM, Mouse wrote:
> It carries just as much information as the start bit.  Bits do not
> necessarily carry information in the sense of anti-redundancy; for
> example, the high bits in the octets making up this email carry
> little-to-no information, since they're all the same, but they're still
> real bits just the same.
No, stop bits are nothing more than a break in the bit stream that
allows the "start bit" to be synchronized upon--in fact, while there may
be a minimum length for them, there's no practical maximum length.
Synchronous transmission has neither start nor stop bits--after
synchronization with the transmitter is achieved, any gaps in the
transmission are filled with "idle" characters, whose sole purpose is to
mark time (just like the "stop bit")--they're discarded by the receiver.
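To make the point concrete, here's a minimal sketch (Python, my own
hypothetical helper names, not anything from a real UART) of 8-N-1
framing: the line idles at mark, a frame is one start bit at space,
eight data bits LSB-first, then mark again.  Any extra mark time between
frames is indistinguishable from a longer "stop bit", which is why stop
bits have a minimum length but no practical maximum:

```python
MARK, SPACE = 1, 0

def frame(byte, stop_bits=1):
    """One async character as a list of line levels, one per bit time."""
    bits = [SPACE]                                  # start bit
    bits += [(byte >> i) & 1 for i in range(8)]     # data, LSB first
    bits += [MARK] * stop_bits                      # stop bit(s)
    return bits

def deframe(levels):
    """Recover bytes; all mark time between frames is simply discarded."""
    out, i = [], 0
    while i < len(levels):
        if levels[i] == MARK:                       # idle/stop: skip it
            i += 1
            continue
        data = levels[i + 1:i + 9]                  # space found: start bit
        out.append(sum(b << k for k, b in enumerate(data)))
        i += 9                                      # past the data bits
    return out

# Back-to-back and widely spaced characters decode identically:
wire = frame(0x41) + frame(0x42, stop_bits=7) + [MARK] * 20 + frame(0x43)
assert deframe(wire) == [0x41, 0x42, 0x43]
```

The receiver here never counts stop bits at all; it only hunts for the
next mark-to-space transition, which is the whole job the stop bit (or
any longer idle gap) exists to make possible.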
> I'm not sure what's "5-level" about it, but I think "just plain
> insanity" is overstating the case; it seems fair to me to consider it
> a kind of shorthand for "1.5 bit times".
1.5 stop bits were used with older Baudot/Murray teletype gear. But
nobody (read your datasheets) ever calls them anything but "stop bits",
never "stop bit times".
> For reliable operation, receiving devices must accommodate a slightly
> short stop bit; otherwise, if the sender's clock is slightly faster
> than the receiver's, streaming characters back-to-back will produce
> phase drift and eventually a framing error.
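That tolerance can be put in rough numbers (my back-of-envelope
arithmetic, not anything from the thread): with mid-bit sampling of an
8-N-1 frame, the receiver's last (stop-bit) sample falls 9.5 of its own
bit times after the start edge, so the cumulative sender/receiver
mismatch over a frame must stay under half a bit time:

```python
FRAME_BITS = 10                   # start + 8 data + stop (8-N-1)
LAST_SAMPLE = FRAME_BITS - 0.5    # stop sampled 9.5 bit times after the edge

# Maximum fractional clock mismatch before the stop-bit sample drifts
# out of the stop bit and the receiver declares a framing error:
max_mismatch = 0.5 / LAST_SAMPLE
print(f"{max_mismatch:.1%}")      # roughly 5% total mismatch
```

A sender running fast eats into that budget from the other end: its stop
bit is already shorter than one receiver bit time, so a receiver that
insisted on a full-length stop bit before re-arming would fall steadily
behind on back-to-back characters.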
A history question--was asynchronous comm ever intended at the outset to
be used for long (i.e. multi-kilobyte) messages? I seem to recall that
most high-speed leased lines used a synchronous link.
> I don't know enough about most UARTs to comment on them in this
> regard, but I think I once saw documentation on one which implied that
> it was willing to tolerate receiving a character that was only (what
> the receiver's clock says is) a little over 9.5 bit times long,
> because it samples in the middle of (its idea of) each bit time,
> starting half a bit time from the beginning of the start bit, and,
> provided the stop-bit's sample shows the correct level, it is
> immediately ready to accept a new start bit's beginning after that
> sample.
But is there any point to that in modern gear? In fact, why not a PLL
to synchronize the receiving clock? Return to mark level by the center
of the bit cell should be more than adequate for most applications.
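The mid-bit sampling scheme described above can be sketched as a small
continuous-time simulation (Python; the clock periods and data bytes are
invented for illustration).  The sender runs about 4% fast, so each
character occupies only 9.6 receiver bit times, yet everything decodes
because the receiver resynchronizes on every start edge and re-arms
immediately after its 9.5-bit-time stop sample:

```python
T_RX = 1.0     # receiver bit time (hypothetical units)
T_TX = 0.96    # sender ~4% fast: a character is only 9.6 rx bit times

DATA = [0x55, 0xA3, 0x0F]

def line(t):
    """Line level at time t: back-to-back 8-N-1 frames from t=0, then mark."""
    frame = int(t // (10 * T_TX))
    if frame >= len(DATA):
        return 1                                  # idle mark
    k = int((t - frame * 10 * T_TX) // T_TX)      # bit index within frame
    if k == 0:
        return 0                                  # start bit
    if k <= 8:
        return (DATA[frame] >> (k - 1)) & 1       # data, LSB first
    return 1                                      # stop bit

received, t, dt = [], 0.0, 0.01
while t < len(DATA) * 10 * T_TX + 2:
    if line(t) == 1:                              # hunt for the start edge
        t += dt
        continue
    t0 = t                                        # start edge found
    samples = [line(t0 + (k + 0.5) * T_RX) for k in range(10)]
    if samples[0] == 0 and samples[9] == 1:       # start and stop check out
        received.append(sum(b << i for i, b in enumerate(samples[1:9])))
    t = t0 + 9.5 * T_RX                           # re-arm right after the stop sample
assert received == DATA
```

Per-edge resynchronization is effectively a very coarse PLL that locks
once per character, which is why plain async gear gets away without a
real one.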
--Chuck