I wasn't aware of current loop; I've been reading about it today. I also came
across an article on how RS232 is reasonable in non-noisy environments, but on
a busy industrial floor interference can become an issue.
I read through some old Google Groups archives: 1997 discussions of the buggy
8250 support in the IBM BIOS (or rather, the tragic situation of the ROM BIOS
being adjusted to work around the bug, and then not being able to fully use
the fixed hardware that came later).
I tried using the X00 FOSSIL driver on the '286 to see if it sped up
performance, and it actually did not (it did seem to work, but I had to step
down to 38400 instead of 57600 to get a reliable data exchange). But if I
understand the intent of the FOSSIL driver, it essentially adds a user-defined
receive/send buffer beyond whatever the UART might provide. That is
convenient, since every program that uses the serial port can benefit from it
(rather than each having to implement its own).
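If I have the idea right, the core of it is just an interrupt-fed ring buffer
sitting between the UART and the application. Below is a minimal sketch of
that buffering idea in C; the names, the 1 KB size, and the stand-in for the
receive interrupt are my own illustration, not anything taken from X00 itself.

    /* Minimal sketch of the receive-side ring buffer a FOSSIL-style driver
     * keeps on top of the UART.  Names and the 1 KB size are illustrative,
     * not taken from X00.  The real driver fills the buffer from the UART's
     * receive interrupt; here a stub stands in for that ISR. */
    #include <stdio.h>

    #define RX_BUF_SIZE 1024               /* assumed size, not X00's */

    static unsigned char rx_buf[RX_BUF_SIZE];
    static unsigned int  rx_head = 0;      /* next slot the ISR writes */
    static unsigned int  rx_tail = 0;      /* next slot the app reads  */

    /* Called from the UART receive interrupt: stash the byte, drop it if
     * the application has fallen too far behind (buffer full). */
    static void rx_isr_put(unsigned char c)
    {
        unsigned int next = (rx_head + 1) % RX_BUF_SIZE;
        if (next != rx_tail) {             /* room left? */
            rx_buf[rx_head] = c;
            rx_head = next;
        }                                  /* else: overrun, byte lost */
    }

    /* Called by the application (e.g. via the driver's "read char" call):
     * returns -1 if nothing is waiting, otherwise the next byte. */
    static int rx_get(void)
    {
        int c;
        if (rx_tail == rx_head)
            return -1;                     /* buffer empty */
        c = rx_buf[rx_tail];
        rx_tail = (rx_tail + 1) % RX_BUF_SIZE;
        return c;
    }

    int main(void)
    {
        int c;
        /* Pretend three bytes arrived while the program was busy. */
        rx_isr_put('A');
        rx_isr_put('B');
        rx_isr_put('C');
        while ((c = rx_get()) != -1)
            putchar(c);
        putchar('\n');
        return 0;
    }

Presumably the real driver fills the put side from the 8250's receive
interrupt and exposes the get side through its INT 14h interface, but the
buffering itself is essentially this.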
I was running MSD and noticed it could detect whether a mouse is present, and
I wondered how that works on a generic serial port in pre-PnP days. A comment
here made me realize it probably "plays games" with one of those handshake
pins like RI or DTR. I imagine the Microsoft Mouse started some convention
along those lines that others followed? I could see it at least identifying
that the attached device is not a modem.
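For what it's worth, the trick I've since seen described is that a
Microsoft-protocol serial mouse is powered from the handshake lines, and when
RTS/DTR are toggled it identifies itself by sending an 'M' (0x4D) at 1200
baud, 7N1. A rough sketch of that probe in Turbo C style port I/O follows;
the 8250 register offsets are the standard ones, but the COM1 base address
and the crude busy-wait delays are just placeholders.

    /* Sketch of the serial-mouse probe as I understand it: set the port to
     * 1200 baud 7N1, drop and re-raise DTR/RTS, and see whether the device
     * answers with an 'M'.  Register offsets are the standard 8250 ones;
     * the delays and the COM1 base address are assumptions. */
    #include <dos.h>                       /* Turbo C inportb()/outportb() */

    #define COM1_BASE 0x3F8

    static int probe_ms_mouse(unsigned base)
    {
        volatile long t;

        /* 1200 baud, 7 data bits, no parity, 1 stop bit. */
        outportb(base + 3, 0x80);          /* LCR: open divisor latch    */
        outportb(base + 0, 96);            /* divisor low (115200/1200)  */
        outportb(base + 1, 0);             /* divisor high               */
        outportb(base + 3, 0x02);          /* LCR: 7N1, latch closed     */

        outportb(base + 4, 0x00);          /* MCR: drop DTR and RTS      */
        for (t = 0; t < 30000L; t++) ;     /* crude settle delay         */

        (void)inportb(base + 0);           /* flush any stale byte       */
        outportb(base + 4, 0x03);          /* MCR: raise DTR and RTS     */

        /* Poll for a reply byte for a little while. */
        for (t = 0; t < 200000L; t++) {
            if (inportb(base + 5) & 0x01) {             /* LSR: data ready */
                unsigned char c = inportb(base + 0) & 0x7F;
                return c == 'M';           /* MS-protocol mouse says 'M' */
            }
        }
        return 0;                          /* no answer: not an MS mouse */
    }

    int main(void)
    {
        return probe_ms_mouse(COM1_BASE) ? 0 : 1;
    }

So if I understand it right, it isn't so much an unused pin as the mouse
living entirely off the handshake lines: dropping RTS resets it, and raising
it again makes it announce itself.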
Thanks for the collective clarification that, from its beginning, RS232 was
focused on supporting modem interaction, more so than raw data exchange.
On Fri, Jan 31, 2025 at 5:26 AM Frank Leonhardt via cctalk <
cctalk(a)classiccmp.org> wrote:
On 31/01/2025 08:20, Steve Lewis via cctalk wrote:
Hey all! So, I've found myself studying up on RS-232 this year for a few
reasons.
I'm mulling over doing an RS232-themed talk at June VCF. Not a super exciting
topic, but I do think that RS232 has an interesting history: in its
relationship to SAGE, and as a follow-up to (essentially) prior telegraph
communication.
From what I've read, "50 baud" was a kind of initial goal to beat, since
that's what the top telegraph operators could achieve (in small bursts,
probably not all day). And those operators did also have to deal with things
like start/stop "bits". Maybe it wasn't an intentional goal, but it does
establish why "50 baud" is generally the lowest we ever see mentioned (if you
go slower than that, you might as well use the older tech).
Then 75/110/130 baud to have digital systems interoperate with classic
mechanical teletypes. Go any faster and those systems jam up or overheat?
These weren't yet called "serial ports", so I'm not sure what a late-'50s
system would even call the equipment that facilitated this data exchange
(since I'm not sure what kind of crystal clock they even had yet).
Then, was it the SAGE program that demonstrated the idea of doing this kind of
data exchange across copper phone lines? That is, the idea of computers
collaborating not just in a room, but across long distances (miles)? And doing
so by using an audio tone presentation? (they settled on around 3100 Hz, which
ended up translating to 300 baud? hence, that's basically why the first
system-to-system digital data exchange settled on that baud rate, which was
reliable on both 50 and 60 Hz power systems, and meaningfully faster than the
prior 110 baud - so a good milestone to turn it into a product, which was the
Bell Model 103?).
I couldn't find many details (like a manual) on the Bell 101 equipment (anyone
seen one, or have a manual?). But I did find the Bell 103 manual - the photo
of its innards is grainy, so I don't understand how the Bell 103 did 300 baud
without a UART (and one of the pinout lines I see did run power, so I'm not
sure if that's RS232 yet or not; I know RS232 was evolving right at that same
time, circa 1962). I've read about the 1970-ish TR1402 being the initial DIP
UART, with anything prior being an experiment (like a full-board concept by
DEC).
I know that from 1962, both the RS232 and ASCII standards still took maybe
another decade to really gain traction as standards (at least, from what I've
read). Getting the world to comply with any standard always takes a lot of
effort (for the practical reason that everyone had already invested in older
tooling that still worked). But it's interesting how those two standards are
still in use (not in their original form, but at least the 1967 revisions) -
extending from Baudot and late-1800s telegraph tech.
Does anyone know of any grocery stores using RS232 in the 1960s? I think
barcode scanning was just introduced in that era. I can just imagine a smart
grocery store owner in the backroom, programming their minicomputer for
payroll and inventory management. In FORTRAN and without a CRT? Actually, in
the '60s, I think the included software would be negotiated with the provider
of the computer (well, I'm not sure how that differed between minis and
mainframes).
I know early microcomputers used RS232 for keyboards (1974-1976 era). The IBM
PC keyboard is essentially another form of serial.
Well, sorry for the rambling - have other RS232-related questions, but first
wanted to focus on the historical aspects (and see if I'm somewhat on the
right track at least).
-Steve
Hi Steve,
I don't find RS232 boring - it's what started my career :-)
A couple of points you might like to consider, which you may already know but
which the stuff you've said above doesn't spell out:
RS232 is not synonymous with serial - make that clear. Before RS232 the same
data format was used over current loop (often 20mA or 60mA).
RS232 (AKA V.24) is only understandable when you realise it was
connecting a terminal (or later computer) to a modem. It's very
specific, yet like most technology has been subverted for other
purposes. I've kept at least one full RS232 modem in my loft (it was
government surplus, and I used it to run a BBS in 1980). Things got
weird later, particularly with the Hayes Smartmodem, but modems were
dumb devices. The lines went straight through. There were two
oscillators (for FM) and the appropriate one was switched in by the TX
line being high or low. Likewise the data separator looked for a high or
low tone and flipped RX between -12V and +12V. These were all individual
boards!
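To make the "two oscillators switched by the TX line" picture concrete, here's
a toy software rendering of it: for each bit, emit a burst of sine wave at the
mark or space frequency. The 1270/1070 Hz pair is the Bell 103 originate side;
the real modem did this with analogue oscillator boards rather than samples,
so treat this purely as an illustration.

    /* Toy illustration of FSK: the TX bit "switches in" one of two tone
     * frequencies.  1270/1070 Hz is the Bell 103 originate pair; sample
     * rate, bit pattern and text output are arbitrary. */
    #include <stdio.h>
    #include <math.h>

    #define PI          3.14159265358979
    #define SAMPLE_RATE 8000.0
    #define BAUD        300.0
    #define F_MARK      1270.0             /* TX line at logic 1 */
    #define F_SPACE     1070.0             /* TX line at logic 0 */

    int main(void)
    {
        const int bits[] = { 1, 0, 1, 1, 0, 0, 1, 0 };   /* arbitrary data */
        const int samples_per_bit = (int)(SAMPLE_RATE / BAUD);
        double phase = 0.0;
        int i, s;

        for (i = 0; i < (int)(sizeof bits / sizeof bits[0]); i++) {
            double f = bits[i] ? F_MARK : F_SPACE;   /* pick the "oscillator" */
            for (s = 0; s < samples_per_bit; s++) {
                phase += 2.0 * PI * f / SAMPLE_RATE; /* keep phase continuous */
                printf("%f\n", sin(phase));
            }
        }
        return 0;
    }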
Then there's the line control board, which operated RI when ringing (high)
voltage appeared on the line, and looped the phone line through when DTR was
asserted.
Most of this makes no sense in other applications!
RS232, being voltage based, was only suitable for short distances
compared to current loop. Basic electronics - voltages drop over
distance due to resistance so after a while you don't know what you'll
get at the other end. A current flowing through a loop being turned off
and on has to be the same when measured at any point on the loop. The
"break key", of course, simply broke the current loop while you held it
down signalling whatever was required.
The speed of transmission is interesting. A purely analogue modem (or
current loop) can operate at any speed as long as both ends agree. The
original limit was the mechanical terminal. 110 baud makes sense as a
target as it's a nice round 10cps for ASCII - one start bit, seven data
bits, one parity bit and two stop bits, so 11 bits per character. Two stop
bits were necessary for mechanical timing.
If you do five-bit Baudot with no parity you've got about 75 baud for
10cps. When the V.23 standard was being developed (asymmetric 1200/75)
it was considered that 75bps was the fastest typing speed required, which
is another reason why 50 was the bare minimum and 75 a better target. By
the time I was involved, 75 was the "standard" for teleprinters, so most
people couldn't out-type them.
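The character-rate arithmetic in the two paragraphs above works out exactly;
here's a quick check, assuming the usual 1 start + 5 data + 1.5 stop bits for
the Baudot frame:

    /* Characters per second is just baud divided by bits per character frame. */
    #include <stdio.h>

    int main(void)
    {
        /* ASCII on a Teletype: 1 start + 7 data + 1 parity + 2 stop = 11 bits */
        double ascii_frame  = 1 + 7 + 1 + 2;
        /* 5-bit Baudot: 1 start + 5 data + 1.5 stop = 7.5 bits */
        double baudot_frame = 1 + 5 + 1.5;

        printf("110 baud ASCII : %.1f cps\n", 110.0 / ascii_frame);   /* 10.0 */
        printf(" 75 baud Baudot: %.1f cps\n",  75.0 / baudot_frame);  /* 10.0 */
        return 0;
    }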
UART? Crystal? Much later. Originally the timing came from a fixed speed
motor in the terminal. Have a look at how a teleprinter works. It wasn't
uncommon to adjust the speed of the motor at both ends to go as fast as
it could without errors.
And for amusement, someone wrote in to PCW saying they'd heard a
salesperson at Radio Shack trying to convince a punter that the RS in
RS232 stood for Radio Shack.
Hope the talk goes well. I assume you're a long way off - otherwise you
could borrow some of the kit hereabouts (but it's Very Heavy).
Regards, Frank.