On 31/01/2025 18:55, Steve Lewis via cctalk wrote:
re: on UARTS..
Didn't it basically standardize the task of converting a byte to bits and
vice versa, in the fashion specified by RS-232?
And do so at above-300-baud rates, since that task was too stressful for a
1 MHz processor to pull off on its own (in addition to whatever else it was
doing, like flashing a cursor on a CRT)?
And the buffer just gave some grace time in case one of the systems got
overly busy (like when scrolling said CRT)?
Might a UART be an early example of an ASIC?
As I've mentioned elsewhere, "UART" is just one name for the function;
Motorola called its part an ACIA, Intel a USART, and other names exist from
other manufacturers. Just a buffered shift register and other logic in the
same chip.
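If it helps to picture that, here is a rough sketch in C of the transmit
half of such a chip - purely illustrative, not any real part's register
model, and all the names are made up: a one-byte holding buffer feeding a
shift register that clocks out a start bit, eight data bits LSB first, and
a stop bit, one bit per tick of the baud-rate clock.

    /* Illustrative sketch only -- not any real part's register model. */
    #include <stdint.h>
    #include <stdbool.h>

    typedef struct {
        uint8_t  holding;      /* transmit holding buffer, written by CPU */
        bool     holding_full;
        uint16_t shift;        /* start bit + 8 data bits + stop bit      */
        int      bits_left;
    } tx_side;

    /* CPU side: drop a byte into the holding buffer if it's free. */
    bool tx_write(tx_side *t, uint8_t byte)
    {
        if (t->holding_full)
            return false;      /* caller must poll (or take an interrupt) */
        t->holding = byte;
        t->holding_full = true;
        return true;
    }

    /* Baud-clock side: called once per bit time, returns the line level. */
    int tx_tick(tx_side *t)
    {
        if (t->bits_left == 0) {
            if (!t->holding_full)
                return 1;                        /* idle line sits at mark */
            /* Frame the byte: start bit (0), data LSB first, stop bit (1). */
            t->shift = (uint16_t)((1u << 9) | ((uint16_t)t->holding << 1));
            t->bits_left = 10;
            t->holding_full = false;             /* buffer is free again   */
        }
        int level = t->shift & 1u;
        t->shift >>= 1;
        t->bits_left--;
        return level;
    }

The double buffering (holding register plus shift register) is where the
"grace time" comes from: the CPU has a whole character time to supply the
next byte.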
I wouldn't call it an ASIC myself, but other definitions are available.
It isn't specific to a user application. I built a disk controller using
a 6550 to serialise the data, so you can't even say it's specific to RS-232.
And before MSI you could (and did) serialise data in a standard form using
a shift register. Before that, discrete logic; before that, transistors;
and before that, mechanically. So I don't think it's fair to say that
single-chip UARTs/ACIAs/SCIs/SCCs standardised anything - they just made
things easier to implement, using a standard that had existed for twenty
years or more.
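To make that concrete, here's the per-bit work the CPU has to do itself if
there's no shift register or UART at all - again just a sketch in C, with
set_tx_line() and wait_one_bit_time() as imaginary platform hooks:

    #include <stdint.h>

    extern void set_tx_line(int level);   /* drive the serial output pin  */
    extern void wait_one_bit_time(void);  /* ~3.3 ms at 300 baud, ~104 us
                                             at 9600 baud                  */

    void send_byte_bitbanged(uint8_t byte)
    {
        set_tx_line(0);                    /* start bit                    */
        wait_one_bit_time();
        for (int i = 0; i < 8; i++) {      /* data bits, LSB first         */
            set_tx_line((byte >> i) & 1);
            wait_one_bit_time();
        }
        set_tx_line(1);                    /* stop bit; line idles at mark */
        wait_one_bit_time();
    }

Rough arithmetic answers Steve's speed question: at 300 baud a bit time is
about 3,300 cycles on a 1 MHz CPU, plenty of slack for other work; at 9600
baud it's only about 104 cycles, which is why you wanted the hardware to do
the shifting for you.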