On Jan 22, 2014 6:58 PM, "Chuck Guzis" <cclist at sydex.com> wrote:
On 01/22/2014 05:04 PM, Eric Smith wrote:
> The NEC uPD7201(A) and Intel 8274 have a misfeature that makes them
> almost unusable with modems that employ synchronous modulation (which
> is almost all common PSTN modems at bit rates of 1200 bps or higher,
> except Bell 202 and V.23), except when an error control protocol
> (e.g. MNP or V.42) is used between the modems and flow control is
> used between the modem and the 7201/8274.
I've put Z80 SIO/DART and 8274 chips in real production products. They're
similar, but obviously not identical. However, if you've programmed one,
you'd be right at home with the other.
That being said, I've never used one in synchronous mode. For that, I've
always used 8251A or 2651/2661 chips, with a preference for the latter.
The described problem occurs in async mode, not sync, but only when
using a modem that does synchronous modulation (i.e., all modems with
modulations fancier than FSK: voice-band POTS modems at 2400 bps and
up, plus full-duplex 1200 bps), and when NOT using MNP or V.42 error
control implemented in the modem. This means it was a common
occurrence when using the uPD7201/8274 with early 212, V.22, and V.22bis
modems. It was mostly not seen with V.32 and later modems, because those
almost always use V.42 error control.
I'm not sure what other common microcomputer products used the
uPD7201/8274, but this problem was very well known in the mid-1980s among
AT&T 7300 and 3B1 users. It was especially aggravating if you had spent
the money on a modem with MNP and/or V.42, but the modem bank you called
into did not support it.
The 8251A, 2651, 2661, 2681, 6551, 6850, 68681, Z80-SIO, Z8030/8530 SCC,
and almost every other UART and USART in the known universe did not have
this problem. It was a poor design choice (at least in hindsight) by the
uPD7201 designers not to let their chip accept a slightly short stop
bit. I think everyone else tolerated short stop bits not because they
planned for V.14, but simply because they copied the sampling method
used by the earliest UARTs, which just happened by chance not to be
affected by slightly short stop bits.
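To make the sampling point concrete, here's a toy sketch (not any real
chip's logic; the oversampling ratio, frame layout, and 1/8-bit shave
are illustrative assumptions) showing why a receiver that checks the
stop bit only at mid-cell never notices a V.14-style shave, while one
that insists on a mark for the full stop-bit cell sees a framing error:

```python
# Toy model of two async stop-bit checks at 16x oversampling.
# Frame: 1 start bit, 8 data bits LSB-first, 1 stop bit (mark = 1).

OVERSAMPLE = 16

def make_frame(data_byte, stop_len=1.0):
    """Build a list of line samples, with the stop bit possibly
    shortened to a fraction of its nominal length."""
    samples = [0] * OVERSAMPLE                   # start bit (space)
    for i in range(8):
        bit = (data_byte >> i) & 1
        samples += [bit] * OVERSAMPLE            # data bits
    samples += [1] * int(stop_len * OVERSAMPLE)  # stop bit (mark)
    return samples

def mid_bit_stop_ok(samples):
    """Check the stop bit the way a center-sampling UART would:
    look only at the middle of the tenth bit cell."""
    stop_center = 9 * OVERSAMPLE + OVERSAMPLE // 2
    return stop_center < len(samples) and samples[stop_center] == 1

def full_length_stop_ok(samples):
    """A stricter receiver that requires mark for the entire
    stop-bit cell (the uPD7201-style behavior described above,
    as I understand it)."""
    cell = samples[9 * OVERSAMPLE : 10 * OVERSAMPLE]
    return len(cell) == OVERSAMPLE and all(cell)

full = make_frame(0x55, stop_len=1.0)      # normal stop bit
shaved = make_frame(0x55, stop_len=7 / 8)  # shaved by 1/8 bit

print(mid_bit_stop_ok(full), mid_bit_stop_ok(shaved))        # True True
print(full_length_stop_ok(full), full_length_stop_ok(shaved))  # True False
```

The center-sampler passes both frames; the full-cell checker flags the
shaved frame as a framing error, which is exactly the failure users saw.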
Of course, if UARTs that couldn't tolerate short stop bits had been
commonplace, the CCITT would probably not have promulgated stop bit shaving
in V.14, and would have had to come up with a different method of rate
matching.
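As a back-of-the-envelope illustration of why rate matching is needed at
all (the 10-bit frame and 1% overspeed figure are assumptions for the
arithmetic, not quotes from V.14): if the sending terminal's clock runs
slightly fast relative to the modem's synchronous clock, bits arrive
faster than the sync channel can carry them, and the only bit that can
safely go missing is the stop bit.

```python
# Rough arithmetic: how often a stop bit must be sacrificed to absorb
# a given clock overspeed. Figures are illustrative, not from the spec.

frame_bits = 10    # assumed: start + 8 data + 1 stop
overspeed = 0.01   # assumed: sender clock 1% fast

# Dropping one stop bit removes 1 bit from one frame, i.e. absorbs
# 1/frame_bits of a frame's worth of excess time. So at full overspeed,
# a stop bit must go missing about once every N frames:
frames_per_deletion = 1 / (frame_bits * overspeed)
print(frames_per_deletion)  # 10.0 -> roughly one affected stop bit
                            # per 10 characters at 1% overspeed
```

At that rate, a receiver intolerant of shortened stop bits would see
framing errors on a significant fraction of all characters, which
matches how badly the 7201/8274 behaved in practice.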