On Fri, Nov 30, 2018 at 3:28 PM Grant Taylor via cctalk <cctalk at classiccmp.org> wrote:
On 11/30/2018 02:33 PM, Jim Manley via cctalk wrote:
There's enough slack in the approved offerings that electives can be
weighted more toward the technical direction (e.g., user interface and
experience) or the arts direction (e.g., psychology and history). The
idea was to close the rapidly widening gap between those who know
everything about computing and those who need to know enough, but not
everything, to be truly effective in the information-dominant world
we've been careening toward without nearly enough preparation of
future generations.
I kept thinking to myself that many of the people who are considered
pioneers in computing were actually something else by trade and
learned how to use computers and/or created what they needed for the
computer to be able to do their primary job.
--
Grant. . . .
unix || die
Most people know that Newton's motivation for developing calculus was
explaining the motions of the planets, but not many know that he served
as the Warden, and then Master, of the Royal Mint, was fascinated with
optics and vision (to the point of inserting a needle into his own eye
socket!), and was a closet alchemist. His competitor, Leibniz, was
motivated to develop calculus by a strong desire to win more billiards
bets from his wealthy buddies in Hanover while working out the
mathematics of the physics governing the collisions of billiard balls.
Babbage was motivated to develop calculating and computing machines to
eliminate the worldwide average of seven errors per page in the
astronomical, navigational, and mathematical tables of the 1820s.
Shannon and Hamming (with whom I worked - the latter, not the former!)
were motivated, respectively, to represent Boolean logic in digital
circuits and to improve long-distance communications by formalizing
how to predictably ferret signal out of noise. Turing was motivated to
test his computing theories by breaking the Nazi Enigma ciphers
(character-oriented, as opposed to word-oriented codes), and he moved
far beyond the mathematical underpinnings of his theories into the
engineering of the Bletchley Park bombes (Colossus, which attacked the
Lorenz teleprinter cipher rather than Enigma, was Tommy Flowers'
engineering). Hollerith was motivated by the requirement to complete
the decennial census tabulations within 10 years (the 1890 census was
going to take an estimated 13 years to tabulate using traditional
manual methods within the available budget). Mauchly and Eckert were
motivated to automate the calculation of ballistics tables for WW-II
weapons systems that were being fielded faster than the tables could
be produced manually.
Hopper developed the first compiler and the first programming language
to use English words, Flow-Matic, which led, in turn, to COBOL being
created to meet financial software needs. John Backus and the other
developers of FORTRAN were likewise motivated by scientific and
engineering calculation requirements. Kernighan, Ritchie, and Thompson
were motivated by a desire to perform an immense prank, in the form of
Unix and BCPL/B/C, on an unsuspecting and all-too-serious professional
computing world (http://www.stokely.com/lighter.side/unix.prank.html).
Gates and Allen were motivated to foist PC/MS-DOS and Windows on the
less serious computing public by all of the money lying around on
desks, in their drawers, and in the drawers worn by the people sitting
at said desks. Kildall was motivated by the challenges of implementing
multi-pass compilation on microcomputers with minimal hardware
resources.
Meanwhile, the rest of the computing field was motivated to pursue
ever-shinier pieces of higher-performance hardware, developing
ever-more-bloated programming languages, OSes, services, and
applications that continue to slow down even the latest-and-greatest
systems. Berners-Lee was motivated to help scientists and engineers at
the European Organization for Nuclear Research (CERN - the Conseil
Européen pour la Recherche Nucléaire) organize and share their work
without having to become expert software developers in their own
right. Yang, Filo, Brin, Page, Zuckerberg, et al., were motivated by
whatever money could be scrounged from sofas used by couch-surfing,
homeless Millennials (redundant syntax fully intended) and from local
news outlets' advertising accounts. Selling everyone's personally
identifiable information but their own - probably including that of
their own mothers - has been a welcome additional cornucopia of
revenue for them.
Computer science and engineering degrees weren't even offered yet when
I attended the heavily science- and engineering-oriented naval
institution where I earned my BS in engineering (70% of degrees
awarded were in STEM fields). The closest you could get were math and
electrical engineering degrees, taking the very few electives offered
in CS and CE disciplines. Granted, the computer I primarily had access
to was a secondhand GE-265 with drum storage (we each got a whopping
32 KB for all of our software development ... yeah, that's with a K).
There was also a PDP-8 in a rack on wheels that could be moved between
the various engineering labs - we had to plug it into a wall outlet
for power, and into a phone line for its modem to connect to our
accounts on the GE-265. My senior year, we received an Evans and
Sutherland Picture System 1 hardware-accelerated, 3-D vector wireframe
workstation, mind-melded to a dedicated PDP-11/70 via a three-foot
cube on its Massbus containing 1 MB of dual-ported static RAM ... that
alone cost a million smackers (a buck a byte!).
I had to wait until I earned my MSCS 10 years later (while taking care
of my Dad - a WW-II 101st Airborne Division paratrooper, POW, and vet
of D-Day in Normandy, Bastogne, the Bulge, etc. - and my
cancer-stricken Mom during that entire period) to "officially" be
considered a computer scientist and software engineer (by HR weenies).
That was despite being a full-time practitioner of both for that
entire decade, developing systems in which the majority of components
involved data, information, and knowledge processing. That turned out
to be a blessing in disguise, as I've routinely had to show even some
younger CS PhDs what discrete transistors look like (going all the way
back to Bell Labs' 1947 dual gold point-contact germanium prototype)
and how they work when presenting artifacts during tours at the
Computer History Museum in Silicon Valley. It's also been handy on the
job to be able to accurately describe the real-world effects of
bone-headed decisions based purely on academic theoretical beliefs.
In short, Necessity and Invention are Mothers ...