On Jan 28, 2014, at 7:43 PM, Chuck Guzis <cclist at sydex.com> wrote:
On 01/28/2014 03:30 PM, John Wilson wrote:
On Tue, Jan 28, 2014 at 01:47:18PM -0800, Chuck Guzis wrote:
When did using async parity go out of style?
I certainly don't mind, because it seemed as if it was annoying (inadvertent
mismatch => systems stubbornly refuse to communicate) more often than it was
helpful (tells you about a transmission error so minor that you wouldn't have
noticed otherwise). Protocols already do their own error detection, and for
plain text the problems are usually obvious.
Seems to me that E71 was pretty much a given in the ASR 33 heyday. Getting a telegram
with garbled characters was probably a very big deal to WU.
I don't remember ever seeing a Model 33 with parity. And weren't telegrams sent in 5
level code? If not at the end, then surely before ASCII appeared. And of course before
either, there was Morse code, which doesn't come with parity either.
All parity can do is convert garbled characters into missing characters. Neither is good.
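To make that concrete, here's a small sketch (my own illustration, not from the thread) of "E71"-style even parity: 7 data bits plus a parity bit chosen so the total count of 1 bits is even. A failed check tells the receiver only *that* a bit flipped, not *which* one, so a garbled character becomes a dropped one.

```python
def add_even_parity(ch: int) -> int:
    """Attach an even-parity bit (bit 7) to a 7-bit character."""
    assert 0 <= ch < 128
    parity = bin(ch).count("1") % 2      # 1 if the 7 data bits have odd weight
    return ch | (parity << 7)            # set bit 7 so the total weight is even

def check_even_parity(byte: int):
    """Return the 7-bit character, or None on a parity failure."""
    if bin(byte).count("1") % 2 != 0:
        return None                      # error detected, but not correctable
    return byte & 0x7F

sent = add_even_parity(ord("A"))         # 0x41 has two 1 bits, so parity bit = 0
received = sent ^ 0x02                   # simulate a single-bit error in transit
print(check_even_parity(sent))           # 65: valid character
print(check_even_parity(received))       # None: garbled char becomes a missing one
```

(Real async framing also involves start/stop bits; this only shows the parity arithmetic.)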
Telegraph operators probably relied on having good signal quality, ensuring an adequate
bit error rate, at which point parity is not particularly needed. And with Morse,
you're probably relying on skilled operators: ECC performed by trained brain cells.
paul