Subject: Re: these RTL or what?
From: woodelf <bfranchuk at jetnet.ab.ca>
Date: Sun, 07 Oct 2007 08:49:56 -0600
To: General Discussion: On-Topic and Off-Topic Posts <cctalk at classiccmp.org>
Allison wrote:
Does it really make that much difference, the
number of bits for a char?
Really, six bits was kinda tight for work where upper and lower case
were used, but it didn't affect calculating Pi to 100 places.
Wasn't the basic chunk 9 bits on the PDP-10, and didn't it happen that
DEC software used 6-bit character notation as a carryover from earlier
life with Friden Flexowriters and TTYs on earlier machines?
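As an aside, the six-bit convention mentioned above can be illustrated in a few lines. This is a modern sketch, not DEC code: DEC's SIXBIT encoding mapped ASCII 0x20-0x5F down to 0-0x3F by subtracting 0x20 (so only upper case, digits, and punctuation fit), and six such characters packed into one 36-bit PDP-10 word.

```python
# Sketch of DEC SIXBIT packing: six 6-bit characters per 36-bit word.
# SIXBIT code = ASCII code - 0x20, valid only for ASCII 0x20..0x5F.

def pack_sixbit(text):
    """Pack up to six characters into a 36-bit integer, left-justified."""
    word = 0
    for ch in text[:6].upper().ljust(6):       # pad short names with spaces
        code = ord(ch) - 0x20                  # SIXBIT = ASCII - 32
        if not 0 <= code <= 0x3F:
            raise ValueError(f"{ch!r} has no SIXBIT encoding")
        word = (word << 6) | code              # shift in 6 bits at a time
    return word

def unpack_sixbit(word):
    """Recover the six characters from a 36-bit word."""
    chars = []
    for shift in range(30, -6, -6):            # 30, 24, 18, 12, 6, 0
        chars.append(chr(((word >> shift) & 0x3F) + 0x20))
    return "".join(chars)
```

Note that a word of all six-bit zeros decodes as six spaces, which is why SIXBIT filenames could be blank-padded for free.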
Floppy disks are 8-bit I/O. That made all the difference when standard
floppy disk controllers came out. Ben.
The RX02 works with the PDP-8, the WD1793 works with the CMOS 6120
(PDP-8), and VAXen and other long-word machines use floppies and other
8-bit interfaces. The VAX-11/780's microcode was loaded from floppy.
Most of those systems had already dealt with the 8-bit/n-bit issue,
and as devices got larger and space became less of a concern, it
mattered even less. If it did matter, PCs would have 32-bit-wide hard
disk controllers rather than 16-bit ones.
The point is that character representation and word size are at best
loosely associated, or simply an OS convention. If anything, ASCII was
a standard, as were a few others like IBM's scheme (EBCDIC). Converting
from one coding to another was one of the first applications
(code breaking).
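To illustrate how small a job such a conversion is: Python ships a codec for IBM's US EBCDIC code page (cp037), so the ASCII-to-EBCDIC mapping the post alludes to is just a table lookup. A minimal sketch:

```python
# Sketch: character-set conversion is a table lookup, not translation.
# cp037 is Python's built-in codec for IBM's US EBCDIC code page.

ascii_text = "HELLO, WORLD"
ebcdic_bytes = ascii_text.encode("cp037")    # ASCII -> EBCDIC bytes
round_trip = ebcdic_bytes.decode("cp037")    # EBCDIC -> ASCII again

# The codes really differ: 'A' is 0x41 in ASCII but 0xC1 in EBCDIC.
assert "A".encode("cp037") == b"\xc1"
assert round_trip == ascii_text
```

On the early machines the same job was done with a 256-entry translate table and one indexed load per character.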
Going from one character representation to another is generally not a
big task, so long as it's not language translation. We as early users
did that often for devices like Selectric printers.
Did the character convention used affect system choice or OS choice?
Possibly. It was only one piece of a larger picture of how systems
evolved.
Allison