Microsoft-Paul Allen
Jim Manley
jim.manley at gmail.com
Mon Oct 22 16:40:47 CDT 2018
Hi Liam,
On Mon, Oct 22, 2018 at 8:15 AM Liam Proven via cctalk <
cctalk at classiccmp.org> wrote:
>
> Cairo was intended to be semi "object oriented" ...
>
This reference to "object-oriented" is way off, conflating GUI "objects"
and true object-oriented software. OO in code has nothing to do with
manipulating virtual objects (desktops, icons for folders, documents by
type, trash cans, etc.). It's a combination of attributes supported by
programming languages, such as classes, methods, encapsulation,
abstraction, inheritance, polymorphism, composition, delegation, etc. Even
Ivan Sutherland's 1961 - 1963 Sketchpad truly implemented object-oriented
design at both the GUI and code levels, despite being developed for the
discrete-transistor MIT Lincoln Lab TX-2, with all of 64 K-words of 36-bit
magnetic-core memory.
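To make the distinction concrete, here is a tiny, purely illustrative
Python sketch (the class names are made up, not taken from any system
mentioned here) of the language-level attributes in question - classes,
encapsulation, inheritance, and polymorphism - none of which require
anything to be drawn on a screen:

    # A minimal illustration of language-level OO: classes, encapsulation,
    # inheritance, and polymorphism.  Nothing here is a GUI "object".

    class Shape:                        # abstraction: a general concept
        def __init__(self, name):
            self._name = name           # encapsulated state (by convention)

        def area(self):                 # method each subclass specializes
            raise NotImplementedError

        def describe(self):
            return f"{self._name}: area = {self.area():.2f}"

    class Circle(Shape):                # inheritance
        def __init__(self, radius):
            super().__init__("circle")
            self._radius = radius

        def area(self):                 # polymorphism: same call, new behavior
            return 3.141592653589793 * self._radius ** 2

    class Square(Shape):
        def __init__(self, side):
            super().__init__("square")
            self._side = side

        def area(self):
            return self._side ** 2

    # One interface, many implementations - no desktop, icon, or trash can
    # involved.
    for shape in (Circle(1.0), Square(2.0)):
        print(shape.describe())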
> Win95 was astonishingly compatible, both with DOS drivers and apps,
> and with Windows 3 drivers, and yet it was 32-bit in the important
> places, delivered true preemptive multitasking, reasonably fast
> virtual memory, integrated networking, integrated Internet access, and
> a radical, clean, elegant UI which *every single OS after it* has
> copied.
>
Ummmm ... no. You're apparently uninformed about the X Window System
(X11, or just X for short), which grew out of MIT's Project Athena, and
no, it's not plural. The latter point is ironic, because MS Windows only
recently gained multiple virtual desktops per user, while X has long
supported multiple desktops per user (each with its own context that can
be swapped in to occupy the display area) and natively supports remote
clients from any number of users over the network (stock desktop editions
of MS Windows still allow only one interactive session at a time;
multi-user remote sessions require the server editions' Remote Desktop
Services). While the early aesthetics of X's icons, windows, widgets,
etc., are just what you'd expect some harried engineer to cobble together
well after midnight on the date of a major early release, the underlying
technology was light years ahead of what MS spent decades screwing around
with (per your description of the dead ends). Unfortunately, X, like
other early GUI systems, was bitmap-based, and still is, so its aesthetics
haven't improved much over the three-plus decades it's been around,
despite incredible advances in graphics hardware, which has been designed
from the ground up largely around the floating-point computations
necessary for 3-D and advanced 2-D graphics.
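As a rough sketch of what X's network transparency means in practice (the
hostname and display number below are placeholders, and the remote server
would have to authorize the connection, e.g. via ssh -X or xhost), an
ordinary X client such as xterm simply renders wherever the DISPLAY
environment variable points:

    # Rough sketch of X network transparency: an X client draws on
    # whichever server DISPLAY names, local or remote.
    # "workstation.example.com:0" is a placeholder display address.

    import os
    import subprocess

    env = dict(os.environ)
    env["DISPLAY"] = "workstation.example.com:0"   # remote X server, display 0

    # xterm is a stock X client; it neither knows nor cares that its
    # window will appear on another machine's screen.  In practice the
    # connection must be authorized (ssh -X tunneling, or xhost).
    subprocess.Popen(["xterm", "-title", "remote client demo"], env=env)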
Interestingly, the Raspberry Pi Foundation has found it necessary to spend
a considerable amount of its meager resources (compared with those in
commercial developers' piggy banks, emphasis on the "piggy") to
hardware-accelerate X on the GPU, since X is what its Debian-based
Raspbian OS uses for its GUI (the changes to open-source code are released
upstream to benefit the entire Debian community). The bulk of the die area
on a Pi's system-on-a-chip (SoC) is the GPU, which is what boots first on
power-up. The ARM CPU in the SoC was originally included as little more
than a traffic cop, shoveling incoming video data to the GPU for decoding
and generation of HD video signals in Roku 2 streaming media boxes. The
acceleration includes conversion from the integer bitmapped
representations used in X to the floating-point data structures the GPU is
primarily designed to operate on. When you're limited to one GB of RAM,
your CPUs are RISC-based, and their clock speed tops out at 1.4 GHz, you
need all the help you can get.
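To give a feel for the kind of conversion involved (a simplified
illustration only, not the actual VideoCore driver code), turning 8-bit
integer color channels from an X pixmap into the normalized floating-point
values a GPU pipeline prefers is essentially a per-channel scale:

    # Simplified sketch: 8-bit integer channels (0..255) from a bitmapped
    # representation become normalized floats (0.0..1.0), the form GPU
    # shader hardware operates on natively.  Illustrative only.

    def pixel_to_float(r, g, b, a=255):
        """Convert one 8-bit RGBA pixel to normalized floating point."""
        scale = 1.0 / 255.0
        return (r * scale, g * scale, b * scale, a * scale)

    def bitmap_to_float(rows):
        """Convert a bitmap given as rows of (r, g, b) tuples."""
        return [[pixel_to_float(*px) for px in row] for row in rows]

    if __name__ == "__main__":
        bitmap = [[(255, 0, 0), (0, 128, 255)],
                  [(16, 16, 16), (240, 240, 240)]]
        for row in bitmap_to_float(bitmap):
            print(row)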
BTW, macOS is based on Mach, the microkernel designed with multiple,
closely-coupled processors in mind; its native GUI, though, is built on
Apple's own Quartz rather than X, with X11 long available as an optional
layer on top. Even in its early days, the Mac graphics system had a lot to
admire. When the Mac II brought color video and full 32-bit processing to
the product line, Apple very cleverly provided a single 32-Bit QuickDraw
extension file that only had to be dropped into the System Folder to make
older black-and-white-only Macs, with 16-bit buses external to the
microprocessor (even the 68000 is 32-bit internally), compatible with the
new 32-bit color graphics architecture. No changes were necessary to
applications; on the older hardware, colors were merely mapped to dithered
patterns of black-and-white pixels of equivalent luminance.
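The color-to-monochrome mapping described above can be sketched roughly
like this (an illustrative approximation, not Apple's actual QuickDraw
code): compute each color's luminance, then compare it against an
ordered-dither threshold matrix so that brighter colors yield denser
patterns of white pixels:

    # Rough sketch of mapping a color to a dithered black-and-white patch
    # of roughly equivalent luminance.  Illustrative only, not QuickDraw.

    # 4x4 Bayer ordered-dither matrix; entries are threshold ranks 0..15.
    BAYER_4X4 = [[ 0,  8,  2, 10],
                 [12,  4, 14,  6],
                 [ 3, 11,  1,  9],
                 [15,  7, 13,  5]]

    def luminance(r, g, b):
        """Perceptual luminance of an 8-bit RGB color, normalized to 0..1."""
        return (0.299 * r + 0.587 * g + 0.114 * b) / 255.0

    def dither_patch(r, g, b):
        """Return a 4x4 patch of '#' (black) and '.' (white) whose average
        brightness approximates the color's luminance."""
        lum = luminance(r, g, b)
        return ["".join("." if lum > (t + 0.5) / 16.0 else "#" for t in row)
                for row in BAYER_4X4]

    if __name__ == "__main__":
        for color in [(255, 0, 0), (0, 255, 0), (32, 32, 32)]:
            print(color)
            for line in dither_patch(*color):
                print("   ", line)

The Bayer matrix is just one classic choice; error-diffusion methods such
as Floyd-Steinberg give smoother results at higher cost.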
As for multitasking, even Windows 10 can easily get bogged down to the
point where the GUI becomes essentially unresponsive to user actions. MS
has never grasped that it should never be possible to wind up in a
situation where the user is stuck watching a busy cursor spin while some
set of tasks consumes pretty much every clock cycle on every core, and the
user can't even shift context away from whatever is hogging the system.
Other than when a critical low-level task must run to completion, such as
flushing write queues to large-capacity storage, the user should always be
in control of which foreground process has the highest precedence.
Loading ads over an incoming network connection, for products and services
the user has absolutely no interest in, should never rank higher than the
lowest-precedence task. It's pretty obvious that advertising business
deals abrogate that user-centric edict, where making yet more bucks on the
user's back is considered the highest priority.
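For concreteness, the precedence policy being argued for can be sketched
as a simple priority queue (the task names, including the ad fetch, are
made up for illustration; no real OS scheduler is this simple):
user-facing work always drains before background chores:

    # Toy sketch of the precedence policy argued for above: user-facing
    # work always runs before background chores such as fetching ads.
    # Task names are invented; this is not any particular OS scheduler.

    import heapq
    import itertools

    # Lower number = higher precedence.
    PRIORITY_UI         = 0   # respond to clicks, keystrokes, window moves
    PRIORITY_IO_FLUSH   = 1   # e.g. flushing write queues to storage
    PRIORITY_BACKGROUND = 9   # e.g. loading advertisements over the network

    class Scheduler:
        def __init__(self):
            self._queue = []
            self._counter = itertools.count()   # tie-breaker keeps FIFO order

        def submit(self, priority, name):
            heapq.heappush(self._queue, (priority, next(self._counter), name))

        def run(self):
            while self._queue:
                priority, _, name = heapq.heappop(self._queue)
                print(f"running (priority {priority}): {name}")

    sched = Scheduler()
    sched.submit(PRIORITY_BACKGROUND, "fetch banner ads")
    sched.submit(PRIORITY_UI, "repaint window after user drag")
    sched.submit(PRIORITY_IO_FLUSH, "flush disk write queue")
    sched.submit(PRIORITY_UI, "handle keystroke")
    sched.run()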
One reason that Apple has never had a problem with users being
indifferent about its products and services is that, when a new employee
starts at the company at any level, they spend roughly their first week in
a user experience (UX, aka usability) lab, as soon as the HR paperwork is
done. They're given a set of goals to accomplish on a new product/service
or revision, and video is taken of the display and of them. They're then
interviewed by UX folks to find out why they did certain things to
accomplish a goal, even where goals were inserted to provide enjoyable
breaks from more serious work. By observing and interacting with people
from all walks of life and levels of experience, they've been able to build
a huge treasure trove of insight into why people do things, which
sometimes comes down to just "It seemed like a good idea at the time."
A year or so before Steve Jobs passed away, Bill Gates was asked why MS
bailed out Apple in 1997, when Steve returned and Apple was within a few
months of not being able to make payroll. He flatly stated that Apple's
products and services were great inspirations to MS and they couldn't let
such a valuable source of R&D disappear (and that Apple customers had paid
for ... a lot).
He also knew that the key people at Apple would never work for MS, and
would rather take their chances with a startup than wind up in an org chart
that involved being anywhere below Steve Ballmer. They didn't really need
any more money that badly, anyway - Apple made a lot of very talented
people very rich. It also didn't hurt that MS was going to get access to
Apple's patent portfolio for some time, and would benefit significantly,
financially, when, not if, Apple rebounded. Steve and
Bill were actually pretty good friends when out of sight of everyone else,
while also being extremely fierce competitors in public.
None of this is to say that Apple has always been right and MS has always
been wrong - far from it. Steve was infamous for saying that Apple would
never do certain things, often for years ... until they did. He even went
as far as to construct major logical arguments against them in keynotes ...
and then blew up those arguments in the keynote address the following year,
albeit usually by defeating a Kobayashi Maru scenario by changing the rules.
The iPhone was the best example of this - after years of swearing there
would never be an iPhone, they shipped the original version not only
without an elegant copy/paste mechanism, but with no means of performing
copy/paste at all for the first couple of years, let alone a means for
anyone outside Apple and its partners to create native apps. They hadn't
gotten copy/paste right, and weren't going to deliver something they
weren't as proud of as their first-born children.
This would have been suicide for any other company, but Steve held the helm
steady, with customers believing that Apple would make things right, and
they did, eventually, in spades. It was recently revealed that Steve was
the most surprised person on the planet when the App Store not only
surpassed their wildest dreams, but just kept growing, still with no end in
sight today. I'm guessing that there was a knock-down, drag-out fight,
where he wound up placing a sword of Damocles over the head of whoever
disagreed with him about it, to be dropped if it didn't succeed.
When Steve Wozniak appeared on stage with Jack Tramiel to celebrate the
30th anniversary of the launch of the Commodore 64, it was noted that
Commodore had acquired MOS Technology, manufacturer of the 6502
microprocessor used in Apple's products. When asked why Commodore didn't
constrain the supply of 6502s to Apple, its arch-rival at the time, Jack
replied, "The middle name of Commodore Business Machines is Business, and a
good businessman doesn't screw his best customer. We actually made more
profits from the 6502 than we did from the C-64, despite the much larger
revenue associated with the latter."
The Woz was then challenged about Commodore 64 sales far exceeding those of
Apple ][ and //e models, and he replied, "At Apple, we were always in it
for the long haul. What has Commodore sold lately?" Commodore, of course,
had long since gone bankrupt. It's hard to believe that 6502-based Apple
models sold for an incredible 16 years - imagine that happening today. Of
course, we can't - the world is unrecognizable today, in terms of
6502-based computing. Now, the House that iPod/iTunes/iPhone/iPad Built
has become the first to achieve a sustained 13-digit market
capitalization. All that, without the ugliness of having to sign a
consent decree for abusing market power through illegal shenanigans, on
top of claiming that a browser couldn't be divorced from an OS ... only to
have exactly that demonstrated in front of the presiding judge and the
world ... oops.
All the Best,
Jim