woodelf <bfranchuk at jetnet.ab.ca> wrote:
Allison wrote:
Does it really make that much difference, the
number of bits for a char?
Really, six bits was kinda tight for work where upper or lower case
was used, but it didn't affect calculating Pi to 100 places.
Wasn't the basic chunk 9 bits for the PDP-10, and DEC software
happened to use 6-bit char notation as a carry-over from its earlier
life with Friden Flexowriters and TTYs on earlier machines?
Floppy disks are 8-bit I/O. That made all the difference when standard
floppy disk controllers came out. Ben.
There's nothing intrinsically 8-bit about a floppy disk. It's a serial
bitstream to be sliced and diced however you want.
Several word processors, using 8" hard-sectored floppies, had 7-bit
and 9-bit formats.
Now, the IBM 3740 floppy format (from which almost all modern floppy
formats have descended) has a lot of 8-bitness in it. But it
was still used by 12-bit machines, with various packing and repacking.
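One common way to get 12-bit words into a byte-oriented sector is a "three bytes for two words" split. The sketch below is a generic illustration of that kind of repacking, not a claim about any particular DEC handler's exact byte layout.

    /* Sketch: two 12-bit words packed into three 8-bit bytes and back. */
    #include <stdint.h>
    #include <stdio.h>

    static void pack_two_words(uint16_t w0, uint16_t w1, uint8_t out[3])
    {
        out[0] = w0 & 0xFF;                       /* low 8 bits of word 0 */
        out[1] = w1 & 0xFF;                       /* low 8 bits of word 1 */
        out[2] = (uint8_t)(((w0 >> 8) & 0x0F) << 4   /* high 4 bits of w0 */
               | ((w1 >> 8) & 0x0F));                /* high 4 bits of w1 */
    }

    static void unpack_two_words(const uint8_t in[3], uint16_t *w0, uint16_t *w1)
    {
        *w0 = (uint16_t)((in[2] >> 4) << 8) | in[0];
        *w1 = (uint16_t)((in[2] & 0x0F) << 8) | in[1];
    }

    int main(void)
    {
        uint8_t bytes[3];
        uint16_t a, b;
        pack_two_words(01234, 05670, bytes);      /* two arbitrary 12-bit words */
        unpack_two_words(bytes, &a, &b);
        printf("%04o %04o\n", a, b);              /* prints 1234 5670 */
        return 0;
    }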
There's also a whole lot of crap in the IBM 3740 floppy format that
nobody has used for decades but is still carried around (e.g. the
"deleted data" marks).
Tim.