> I don't know where you get "~7MHz 16550 serial card" from. The 16550 (and
> predecessors) sample the incoming serial signal with a clock 16x the baud
> rate. For 115,200 baud, that's 1.8432MHz, and it is no coincidence that
> this is a standard crystal frequency and also the maximum speed of a lot
> of UARTs.
I was thinking of something like the later-model SIIG serial cards that used
a 7.3728 MHz clock (4x the 1.8432 MHz "original"). I have one of these that
is combined with a 16550 UART.
Even before the IBM PC of c. 1981, I've come across PDP-11 systems with a
3.6864 MHz clocked UART (and some S-100 based systems at these ~3 MHz and
~5 MHz clocks). Another PDP card that I have has a 0.9216 MHz clock (so a
57600 bps cap). And yes, the PC market settled on that 1.8432 MHz tempo.
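The divide-by-16 relationship behind all these crystal picks can be sketched
in a couple of lines (the helper function here is just illustrative):

```python
# With 16x oversampling and a divisor latch of 1, a UART's maximum
# baud rate is simply its input clock divided by 16.
def max_baud(clock_hz: int, oversample: int = 16) -> int:
    return clock_hz // oversample

# The crystal frequencies mentioned above:
for clock in (921_600, 1_843_200, 3_686_400, 7_372_800):
    print(f"{clock / 1e6:.4f} MHz -> {max_baud(clock)} baud")
```

which reproduces the 57600 cap for the 0.9216 MHz PDP card and 115200 for
the standard 1.8432 MHz crystal.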
I just recently realized that the earlier POLY-88 system (8080-based) had a
"native" clock rate of exactly 1.8432 MHz, about 5 years earlier than the
IBM PC. So maybe it's not really fair to pin that pick on IBM.
In my experience, a 12 MHz '286 couldn't fully utilize the 1.8432 MHz 8250
(i.e. in my testing, I was able to receive at 57600 over a null modem cable
on said '286 in a sustained ZModem stream without any errors; but at 115200
the system got too many errors to make the transfer worthwhile). That's what
I mean by a crossing point, where the combined CPU, memory, and hard drive
speed of the system is "underpowered" for the full potential of its UART.
But it was a number of years before "modulation-tech" modems blasted past
even 9600 baud (at least at the consumer level), so that 1.8432 MHz pick was
fine for most of the '80s.
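A rough back-of-envelope suggests why the FIFO-less 8250 hits a wall on a
'286 (this assumes 8N1 framing, i.e. 10 line bits per byte, and one receive
interrupt per byte, which is how the 8250 behaves):

```python
# Back-of-envelope: with no FIFO, the CPU must service one IRQ per
# received byte before the next byte finishes arriving on the wire.
def irq_budget(baud: int, cpu_hz: float, bits_per_byte: int = 10):
    bytes_per_sec = baud / bits_per_byte      # also the IRQ rate
    us_per_byte = 1e6 / bytes_per_sec         # deadline per byte
    cycles_per_byte = cpu_hz / bytes_per_sec  # CPU cycles available
    return bytes_per_sec, us_per_byte, cycles_per_byte

for baud in (57600, 115200):
    bps, us, cyc = irq_budget(baud, 12e6)     # 12 MHz '286
    print(f"{baud}: {bps:.0f} IRQs/s, {us:.0f} us/byte, ~{cyc:.0f} cycles/byte")
```

At 115200 that's only about a thousand CPU cycles per byte for the ISR plus
everything else the machine is doing, so dropped bytes are no surprise.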
Bit of a tangent:
I noticed that those modern-era FTDI USB/serial adapters list 128 byte FIFOs
in their specs (some 64 byte). Maybe those "parallel modem" Microm devices
just had larger onboard buffers, kind of leap-frogging the 16 byte buffer
that many 16550 UARTs had?
Anyway, if your multi-tasking 386 or 486 was struggling to support a higher
baud rate, I've read you could desolder the 8250 and plop a 16550 directly
into its place, gaining at least the 16 byte buffer. If that still doesn't
help, then try one of those newer 7.3728 MHz equipped 16550s? In the README
docs included with the A00 FOSSIL driver c. 1990, I recall the author (who
ran a multi-line FIDO BBS) mentioning options like this, and how the rise of
multi-tasking operating systems was causing stability issues with older
serial cards.
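What that 16 byte buffer buys you is mostly a lower interrupt rate: with the
16550's receive trigger level set to 14 (the FCR options on the datasheet
are 1, 4, 8, and 14 bytes), one interrupt services a whole batch instead of
a single byte. A quick sketch, again assuming 8N1 framing:

```python
# IRQs per second as a function of the receive FIFO trigger level.
# trigger=1 is effectively the 8250's one-interrupt-per-byte behavior.
def irq_rate(baud: int, trigger: int = 1, bits_per_byte: int = 10) -> float:
    return baud / bits_per_byte / trigger

for trig in (1, 4, 8, 14):
    print(f"115200 baud, trigger={trig:2}: {irq_rate(115200, trig):8.0f} IRQs/s")
```

Going from ~11,500 to ~800 interrupts per second is the difference between
a multi-tasker keeping up and silently dropping bytes.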
Anyway, the "parallel modem" is interesting in that it makes clear that
RS-232 isn't essential for modem communication. RS-232 was just a "means to
an end" for delivering data quickly to/from a modem device. I guess using
RS-232 gave the benefit that your modem didn't need to be physically nearby
(I recall ISP offices where the modems were in a closet down the hall),
thanks to "screaming" at +/-12V? That might be relevant in situations where
you didn't have a phone line in every room (not just in homes, but also old
offices).
-Steve
On Sun, Feb 9, 2025 at 6:18 PM Peter Corlett via cctalk <
cctalk(a)classiccmp.org> wrote:
> On Sun, Feb 09, 2025 at 12:08:47PM -0600, Steve Lewis via cctalk wrote:
> [...]
> > So.. If you had a slow system that couldn't really take advantage of a
> > ~7MHz 16550 serial card (or I guess like a laptop that was stuck with an
> > older UART) That might be the use-case where this parallel v.fast might
> > help (by being able to "feed the modem" fast enough to actually take
> > advantage of the faster modem speed?) Or is there some other scenario
>
> Note that all but the dumbest modems reclock the data before transmission
> and the XON/XOFF or CTS/RTS flow control is handled locally, buffering in
> the modem as necessary. At faster speeds there's no longer a 1:1
> relationship between the signal level on the RS-232 cable and the screeches
> going down the phone line. Start and stop bits are not transmitted,
> giving a 25% speed boost from that alone.
>
> A parallel-connected modem is a bit pointless except in weird environments
> where one's serial port is broken or otherwise unusable. Information theory
> tells us that you can't get more than 64kb/s out of a dialup link, because
> that's the speed of the underlying digital channel used by the PSTN. Due to
> the reclocking, you actually need a serial port capable of 80kbaud to not
> drop data, and the next-highest standard baud rate is 115,200 baud, which
> any half-decent PC serial port can handle.
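[Interjecting here: the 80 kbaud figure checks out if you re-add the 8N1
framing the modem stripped; a quick sketch:]

```python
# A 64 kb/s digital channel carries 8,000 bytes/s of payload; re-adding
# start and stop bits on the RS-232 side (10 line bits per byte, 8N1)
# requires 80,000 baud, so the next standard rate up is needed.
payload_bps = 64_000
bytes_per_sec = payload_bps / 8      # 8 data bits per byte
serial_baud = bytes_per_sec * 10     # 10 line bits per byte (8N1)
print(serial_baud)                   # 80000.0
needed = next(r for r in (57600, 115200) if r >= serial_baud)
print(needed)                        # 115200
```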
>
> To pre-empt the obvious retorts from the peanut gallery, sure, one may well
> have an original IBM XT or BBC Micro or whatever whose serial port drops
> bytes when driven faster than 9600 baud. Guess what: the machine's so slow
> that it can't handle the firehose of data even if its serial port wasn't a
> basket case.
> I don't know where you get "~7MHz 16550 serial card" from. The 16550 (and
> predecessors) sample the incoming serial signal with a clock 16x the baud
> rate. For 115,200 baud, that's 1.8432MHz, and it is no coincidence that
> this is a standard crystal frequency and also the maximum speed of a lot
> of UARTs.
>
> [...]
> > Is there any "natural rate" (Hz) of a modem? Meaning is 1200/2400
> > baud-equivalent modem an accelerated-by-enhanced-encoding version of 300
> > bps? and 9600 likewise an accelerated-by-encoding version of 2400? is
> > 300bps itself some kind of special accelerated-by-encoding? I see 1200
> > baud was also still sub 3KHz
>
> These easy-to-ask questions have very long answers. But mostly, it was *not*
> a case of merely increasing the baud rate or the number of bits per baud
> with a different modulation scheme, but multiple concurrent advances which
> did those *plus* some other techniques which would maintain signal
> integrity despite the reduced SNR.
>
> If you want the full gory detail, the relevant ITU standards are
> freely-available. Bring a good signal-processing textbook.
>
> > (did any modem protocol go above 3KHz?).
>
> V.90 used the full 4kHz analogue bandwidth for the downstream. Yes, even
> the frequency extremes which were heavily-attenuated by the line filters.
> It'd just listen much harder.
>
>