Text encoding Babel. Was Re: George Keremedjiev
ben
bfranchuk at jetnet.ab.ca
Mon Nov 26 16:13:43 CST 2018
On 11/26/2018 9:26 AM, Charles Anthony via cctalk wrote:
> On Mon, Nov 26, 2018 at 4:28 AM Peter Corlett via cctalk <cctalk at classiccmp.org> wrote:
>
>> On Sun, Nov 25, 2018 at 07:59:13PM -0800, Fred Cisin via cctalk wrote:
>> [...]
>>> Alas, "current" computers use 8, 16, 32. They totally fail to
>>> understand the intrinsic benefits of 9, 12, 18, 24, and 36 bits.
>>
>> Oh go on then, I'm curious. What are the benefits? Is it just that
>> there are useful prime factors for bit-packing hacks? And if so,
>> why not 30?
>>
>>
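One concrete benefit: 36 factors as 6 x 6, so a 36-bit word holds exactly six 6-bit characters with no bits wasted (a 30-bit word would hold five). A minimal Python sketch of the packing, assuming DEC's SIXBIT convention (code = ASCII - 32); the helper names are illustrative:

    def pack_sixbit(codes):
        # Pack up to six 6-bit character codes into one 36-bit word,
        # left-justified, low bits zero-padded.
        assert len(codes) <= 6
        word = 0
        for c in codes:
            assert 0 <= c < 64          # each code must fit in 6 bits
            word = (word << 6) | c
        return word << (6 * (6 - len(codes)))

    def unpack_sixbit(word):
        # Recover the six 6-bit codes, highest character first.
        return [(word >> shift) & 0o77 for shift in range(30, -1, -6)]

    codes = [ord(ch) - 32 for ch in "HELLO "]   # DEC SIXBIT mapping
    w = pack_sixbit(codes)
    assert w < 2**36 and unpack_sixbit(w) == codes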
> As I understand it, 36 bits was used because it could represent a signed
> 10-digit decimal number in binary; the Friden 10-digit calculator was the
> "gold standard" of banking and financial institutions, so to compete in
> that market, your computer had to match its arithmetic precision.
>
> -- Charles
>
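The arithmetic behind that claim checks out: the largest 10-digit number, 9,999,999,999, needs 34 bits of magnitude, so a sign bit plus 35 value bits fits comfortably in a 36-bit word, while a 32-bit word (sign plus 31 bits, maximum 2,147,483,647) cannot hold it. A quick Python check:

    max10 = 9_999_999_999
    print(max10.bit_length())   # -> 34 bits of magnitude
    assert max10 < 2**35        # sign + 35 bits: a 36-bit word is ample
    assert max10 >= 2**31       # sign + 31 bits: a 32-bit word falls short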
I say 20 bits needs to be used more often.
Did anything really use all the control codes in ASCII?
Back then, you got what the TTY printed.
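For what it's worth, only a handful of the 33 ASCII control codes did anything on a printing terminal. A rough sketch below; the selection is approximate and varied by model (the ASR-33, for instance, used DC1/DC3 to start and stop its paper-tape reader):

    # Control codes a Teletype-era printer commonly acted on;
    # everything else was usually ignored by the hardware.
    commonly_used = {
        0x07: "BEL  ring the bell",
        0x0A: "LF   line feed",
        0x0D: "CR   carriage return",
        0x11: "DC1  X-ON  (start the paper-tape reader)",
        0x13: "DC3  X-OFF (stop the paper-tape reader)",
        0x7F: "DEL  rubout (punch over a tape error)",
    }
    for code in list(range(0x20)) + [0x7F]:
        print(f"0x{code:02X}  {commonly_used.get(code, '(rarely honored)')}")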
Did anyone ever come up with a character set for ALGOL?
Ben.