On Fri, 15 Mar 2002 philip@awale.qc.ca wrote:
On 15-Mar-2002 John Lawson wrote:
The code was the first 8-bit standard code that allowed characters, such as
those found on a keyboard, to be represented by the same codes on many
different kinds of computers.
Wasn't ASCII originally 7-bit?
Isn't it that it IS 7-bit?
IBM's "extensions" of another 128 characters are NOT ASCII.
Other than "8-bit", which it isn't, which of THOSE characteristics were
NOT present in EBCDIC, or Hollerith?
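For what it's worth, a quick Python sketch (purely illustrative, not from the
article or anyone in this thread) makes the 7-bit point concrete: strict ASCII
stops at code point 127, and anything above that is somebody's extension, not
ASCII.

    # ASCII proper is 7 bits: code points 0..127 only.
    for ch in ("A", "~", "\x7f"):
        print(ch, ch.encode("ascii"))      # all of these fit in 7 bits

    try:
        "é".encode("ascii")                # e-acute lives above 127
    except UnicodeEncodeError as err:
        print("not ASCII:", err)           # the "extended" 128 characters are not ASCII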
"He
traveled all over the world defining what this code would represent.
This is the code that is still used in PCs today," Silberg said.
Huh?! All over the world, but he didn't seem to stray to a
non-English-speaking country. ASCII serves very poorly for those of us
who need accents.
Who PAID for those junkets?
It sounds like our college administrators who went to China "to recruit
students" FOR A COMMUNITY COLLEGE?!?!
Why "travel all over the world" at all when creating "AMERICAN Standard
Code for Information Interchange"? If they were creating "ENGLISH
LANGUAGE Standard Code for Information Interchange", then THAT would call
for travelling all over certain appropriate countries. (Does the sun set on
the British Empire now?)
If they were creating "WORLD Standard Code for Information Interchange",
or just simply "Standard Code for Information Interchange", THEN it would
call for some serious travelling.
But ASCII does NOT deal with the needs of any other alphabet.
Does Unicode work?
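For whatever it's worth, it does at least cover the accents; a minimal Python
sketch (illustrative only, and assuming a UTF-8-capable terminal for the
printed output):

    # Unicode gives every character its own code point, regardless of language;
    # UTF-8 keeps ASCII as single bytes and uses multi-byte sequences for the rest.
    for ch in ("e", "é", "ç", "Ω"):
        print(ch, hex(ord(ch)), ch.encode("utf-8"))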
--
Grumpy Ol' Fred cisin@xenosoft.com