Some computing economics history:
I'm an engineer and scientist by both education and experience, and one
major difference between the disciplines is that engineers are required to
pass coursework and demonstrate proficiency in economics. That's because
we need to deliver things that actually do what customers think they paid
for within strict budgets and schedules, or we go hungry. Scientists who
can accurately predict what it will cost to prove a theory, on the other
hand, aren't practicing science, because that would mean they already know
the outcome and are taking no risk. A theoretically "superior" encoding may
not see practical use by a significant number of people because of legacy
inertia that often makes no sense, but is rooted in cultural, sociological,
emotional, and other factors, including economics.
Dvorak computer keyboards are allegedly far more efficient
speed/accuracy-wise than QWERTY computer keyboards, so they should rule the
computing world, but they don't. Keyboards that reduce the risk of
repetitive stress injuries (e.g., carpal tunnel syndrome) should dominate
the market for very sensible health reasons, but they don't, either.
Legacy inertia is a beyotch to overcome, especially when
international-level manufacturers and investors have a strong
interest in making lots of money from the status quo. Logic and reasoning are
simply nowhere near enough to create the conditions necessary for
widespread adoption - sometimes it's just good luck in timing (or, bad
luck, as the case may be).
ASCII was developed in an age when Teletypes and similar devices were the
only textual I/O options, with fixed-width/size/style typefaces (font
family is an attribute of a typeface - there's no such thing as a "font").
By the late 1950s, there were around 250 computer manufacturers, and none
of their products were interoperable in any form. Until the IBM 360 was
released in 1965, IBM had 14 product _lines_ that were incompatible with
each other, despite having 20,000+ very capable scientists and engineers on
their payroll.
You can't blame the ASCII developers for lack of foresight when no one in
their right mind back then would have ever predicted we could have upwards
of a trillion bytes of memory in our pockets (e.g., the Samsung Note 9),
much less multi-megapixel touch displays with millions of colors, with
worldwide-reaching cellular/Internet access with milliseconds of round-trip
response, etc.
Someone thinking that they're going to make oodles of money from some
supposedly new-and-improved proprietary encoding "standard" that discards
five-plus decades of legacy intellectual and economic investment is
pursuing a fool's errand. Even companies with resources at the level of
Apple, Google, Microsoft, etc., aren't that arrogant, and they've
demonstrated some pretty heavy-duty chutzpah over time. BTW, you won't be
able to patent what apparently amounts to a lookup table, and even if you
copyright it, it will be a simple matter of developing
functionally-equivalent code that performs a translation on-the-fly. See
also the clever schemes where DVD encryption keys, which had been left on
an unprotected server accessible via the Internet, were transformed into
prime numbers that didn't infringe on the copyrights associated with the keys.
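To illustrate the point about lookup tables: a minimal sketch (all names, byte values, and sample pairs here are invented for illustration, not taken from any real encoding) of how functionally equivalent translation code can be derived on the fly from observed byte/character pairs, reproducing an encoding's behavior without copying anyone's published table.

```python
# Hypothetical sketch: a "proprietary" single-byte encoding is, functionally,
# just a lookup table from byte values to characters. Rebuilding an
# equivalent mapping from independently observed pairs reproduces the
# behavior on the fly.

# Invented sample entries of a made-up vendor encoding (assumption).
OBSERVED_PAIRS = [(0x41, "A"), (0x42, "B"), (0x8E, "é"), (0x9A, "ü")]

def build_decoder(pairs):
    """Build a decoding function from observed byte/character pairs."""
    table = {byte: char for byte, char in pairs}
    def decode(data: bytes) -> str:
        # Unmapped bytes fall back to U+FFFD, the Unicode replacement char.
        return "".join(table.get(b, "\ufffd") for b in data)
    return decode

decode = build_decoder(OBSERVED_PAIRS)
print(decode(bytes([0x41, 0x8E, 0x42])))  # AéB
```

The same trick works in reverse for encoding, which is why a mapping alone is such a thin thing to try to monetize.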
True standards are open nowadays - the days of proprietary "standards" are
a couple of decades behind us - even Microsoft has been publishing the
binary structure of their Office document file formats. The specification
for Word, which includes everything going back to v1.0, is humongous, and
even they were having fits trying to maintain the total spec, which is
reportedly why they went with XML to create the .docx, .xlsx, .pptx, etc.,
formats. That also happened to make it possible to placate governments
(not to mention customers) that are looking for any hint of
anti-competitive behavior, and thus also made it easier for projects such
as OpenOffice and LibreOffice to flourish.
Typographical bigots, who are more interested in style than content, were
safely fenced off in the back rooms of publishing houses and printing
plants until Apple released the hounds on an unsuspecting public. I'm
actually surprised that the style purists haven't forced Smell-o-Vision
technology on The Rest of Us to ensure that the musty smell of old books is
part of every reading "experience" (I can't stand the current common use of
that word). At least I have the software chops to transform the visual
trash that passes for "style" these days into something pleasing to _my_
eyes (see what I did there with "severely-flawed" ASCII? Here's how you
can do /italics/ and !bold! BTW.).
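Those plain-text emphasis markers can be rendered mechanically. Here's a minimal sketch (the /italics/, !bold!, and _underline_ conventions are the ones used above, not any standard; the regexes are my own assumption about their boundaries) that maps them to HTML:

```python
import re

# Map the ad-hoc plain-text markers used above to HTML tags.
# These conventions are informal, not a published markup standard.
RULES = [
    (re.compile(r"/(\w[^/]*)/"), r"<i>\1</i>"),   # /italics/
    (re.compile(r"!(\w[^!]*)!"), r"<b>\1</b>"),   # !bold!
    (re.compile(r"_(\w[^_]*)_"), r"<u>\1</u>"),   # _underline_
]

def render(text: str) -> str:
    for pattern, repl in RULES:
        text = pattern.sub(repl, text)
    return text

print(render("You can do /italics/ and !bold! in plain ASCII."))
# You can do <i>italics</i> and <b>bold</b> in plain ASCII.
```

A real implementation would need escaping rules for literal slashes and bangs, which is exactly the kind of edge case that keeps markup languages from staying simple.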
Nothing frosts me more than reading text that can't be resized and
auto-reflowed, especially on mobile devices with extremely limited display
real estate. I'm fully able-bodied and I'm perturbed by such bad design,
so I'm pretty sure that pages that prevent pinch-zooming, and that don't
allow for direct on-display text resizing/auto-reflow, completely violate
the spirit, if not virtually all of the letters, of the Americans with
Disabilities Act (and similar legislation outside the U.S., I imagine).
Your mileage may vary, objects in mirror are closer than they appear, do
not spindle, fold, or mutilate, do not expose to fire or flame, batteries
not included, and all of the other legalese disclaimer garbage may apply,
courtesy of lawyers who aren't being kept busy enough looking out for
widows and orphans, which should be their full-time vocation.
Soapbox hereby happily relinquished to the next blowhard ...