> I had words with Clancy and Harvey. While need may be diminished,
> there is never a complete elimination of the need to pay attention to,
> and optimize near, the level of hardware.
[top posted, with Swift's remarks below]
The Clancy and Harvey topic is about curriculum and the teaching of "computer
science". Clancy and Harvey were/are two lecturers at UC Berkeley who
ended up in charge of lower-division undergraduate CS.
First, I'll explain OUR curriculum at the community college, since that is
what I know. We served a varied mix of needs: re-entry for dropouts, job
training, AA degrees, preparation for transfer to four-year colleges, skills
enhancement for people employed in related jobs ("I want to take a course
in C"), and adult enrichment.
We started off with an extremely simple introductory course that did not
assume that the student had ever seen a computer. (In 1980 or so, that was
a necessary assumption.) It was offered as a six-week drive-by, but generally
taken as a full semester by those who intended to go further. It had just
enough programming in it that a student would successfully create a program
(often done in BASIC).
Then an Introduction To Programming, with basic principles and concepts,
illustrated with multiple languages but concentrating on a language of
the instructor's choice for some minor projects.
Then choices of courses in multiple languages: COBOL (two semester levels),
FORTRAN, BASIC, Pascal, RPG, C, Mainframe Assembly (360/370),
Microcomputer Assembly (X86), and occasional ones offered for only a few
years each, including Ada, etc. Not all classes were offered every semester.
Also work skills classes in using Microsoft Office, etc.
Data Structures And Algorithms, with a prerequisite of at least one
programming language, and taught in the language of the instructor's choice
(I used C, but I let students substitute other languages). When I taught it,
I included iterative vs. recursive tree algorithms (a quick sketch of the
contrast is below).
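For anybody who hasn't seen the two side by side, here is roughly the kind
of contrast I mean -- a minimal sketch of my own (the names and the
fixed-size stack are illustrative, not from any actual course handout):

/* A minimal sketch of iterative vs. recursive tree traversal.
   Naming and the fixed depth limit are mine, for illustration only. */
#include <stdio.h>

struct node { int key; struct node *left, *right; };

/* Recursive in-order traversal: the call stack keeps track of where we are. */
void inorder_recursive(struct node *t)
{
    if (t == NULL)
        return;
    inorder_recursive(t->left);
    printf("%d\n", t->key);
    inorder_recursive(t->right);
}

/* Iterative version: same visiting order, but with an explicit stack. */
void inorder_iterative(struct node *t)
{
    struct node *stack[64];          /* assumed depth limit, for the sketch */
    int top = 0;

    while (t != NULL || top > 0) {
        while (t != NULL) {          /* push the whole left spine */
            stack[top++] = t;
            t = t->left;
        }
        t = stack[--top];            /* visit the node, then go right */
        printf("%d\n", t->key);
        t = t->right;
    }
}

int main(void)
{
    struct node a = { 1, NULL, NULL }, c = { 3, NULL, NULL };
    struct node b = { 2, &a, &c };
    inorder_recursive(&b);           /* prints 1 2 3 */
    inorder_iterative(&b);           /* prints 1 2 3 again */
    return 0;
}

Both print the keys in the same order; the only difference is who keeps
the stack.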
I also taught "Microcomputer Disk Operating Systems" (heavy on MS-DOS, but
trying to actually teach the principles applicable to ANY), and "Advanced
Microcomputer Programming" (prerequisite of C and Assembly), which included
TSRs, device drivers, walking directory trees, mixed-language programming
and stack-frame structure, etc. One of the assignments was to write a
program that would display a complete directory of the disk (a rough sketch
of the idea is below). I taught with C, X86 ASM, and MS-DOS, but students
were free to substitute other languages and unix, Mac, or whatever, for
their assignments.
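For the students who substituted unix, the core of that assignment boils
down to a recursive descent with opendir()/readdir(). Something like this
minimal sketch (my own naming and simplifications, not the graded
assignment, and with essentially no error handling):

/* Sketch only: print every entry under a starting directory.
   Names, the fixed path buffer, and the lack of error reporting are
   simplifications for illustration. */
#include <stdio.h>
#include <string.h>
#include <dirent.h>
#include <sys/stat.h>

static void walk(const char *path)
{
    DIR *d = opendir(path);
    struct dirent *e;
    char sub[4096];
    struct stat st;

    if (d == NULL)
        return;
    while ((e = readdir(d)) != NULL) {
        if (strcmp(e->d_name, ".") == 0 || strcmp(e->d_name, "..") == 0)
            continue;                        /* skip self and parent */
        snprintf(sub, sizeof sub, "%s/%s", path, e->d_name);
        printf("%s\n", sub);                 /* "display" each entry */
        if (lstat(sub, &st) == 0 && S_ISDIR(st.st_mode))
            walk(sub);                       /* recurse into subdirectories */
    }
    closedir(d);
}

int main(int argc, char **argv)
{
    walk(argc > 1 ? argv[1] : ".");
    return 0;
}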
I tried to implement a basic information science course on access to
online information resources, but our curriculum committee vetoed it and
rewrote it into how to surf the web.
We had constant struggles with the administration, who wanted to remove,
and eventually succeeded in removing, all advanced courses and anything with
a prerequisite. They REALLY wanted our department to be nothing but
remedial job training for the digital sweatshop. They killed EVERYTHING
that had been worthwhile!
By 2013, when I finally told them to take my 33-year job and shovel it, I
had still been unable to get them to let me try to do a basic beginning
Information Science course (DIK/W/E, relevance ranking, economics and
legalities of IP, recall/precision interaction, social impact of access,
search engine algorithms, SEO, etc.).
Anyway, back to . . .
Clancy and Harvey reworked the UC undergraduate lower-division (first two
years) curriculum. They set up a three-course sequence at the core,
consisting of "Abstraction", "Data Structures", and "Demystification".
They called a meeting of local CS departments to tell us what we should
switch over to teaching.
They chose Scheme (a LISP derivative) for the first course. They thought
that recursion should be the fundamental basis of computer programming,
even down to COUNTING from 1 to 10. I usually gave an assignment of
Fibonacci numbers as an exercise in loop controls, etc. (sketched below) -
I was a little taken aback when somebody from UC assumed that meant I was
starting my students off with recursion! They gave a small example, which
they declared CAN NOT (not "must not") be done with anything except
recursion. While they were putting it on the board, I coded it on my
notepad as a two-dimensional array with a nested loop in C, BASIC, and
Fortran, and got partway through COBOL. (I take offense at being told
things are "impossible" to do in other languages - "difficult" or
"inappropriate" are acceptable.)
Their "Data Structures" class was to be taught using C. I asked about
their C class. They didn't have one, and declared that all students are
assumed to already know C before they arrived there!
Their "Demystification" would be the first time that the students would be
made aware of existence of anything under the hood.
They declared, "Nobody programs in Assembly language any more, nor ever
will again."
I asked about the timeline for implementation of the new curriculum. They
stated that it had been in place for a couple of years. I pointed out
that it was not reflected in the current catalog, and asked when they
expected the catalog to catch up. They insisted that that must just be a
glitch in the printing of that year's catalog, and that it had been updated
years before. (THAT was a lie. I went over to Doe Library and xeroxed the
pages from the last five years of catalogs.)
They had a brilliant visionary concept of CS education.
Which I don't agree with.
--
Grumpy Ol' Fred     cisin at xenosoft.com
On Fri, 27 May 2016, Swift Griggs wrote:
I'm going to loudly agree here. While I find assembly coding somewhat
tedious, I wonder what kind of Jedi mind trick "Clancy and Harvey" used to
make themselves believe that asm was not only dead but also no longer
useful. *eye roll* Whatever, geniuses. Maybe I'm misunderstanding the
quote? Are they serious? Is this out of context and I'm just not "getting"
what they really meant?
There are all kinds of seemingly instinctual reactions that some folks
have to questions of programming style and efficiency. My *least*
favorites are:
1. The "GUI programming" or "natural language" folks who think that
programming really isn't that hard; the problem is that we haven't given
folks Fisher-Price icons for control structures, or allowed people to
"simply tell the computer what to do." I simply call BS.
2. Languages that are supposed to "enlighten" students to some incredible
new programming paradigm, or bolt-ons to older languages with the same
claims. They almost always start their pitch by telling you how some
irritating or tedious aspect of coding in other languages can be
eliminated or minimized. I'm more and more skeptical of this claim all the
time. It rarely works out, and generally making things "safer" or "easier"
runs a big risk of neutering their usefulness, too.
Those languages that successfully walk the line between power and ease of
use are the ones that survive and thrive (and sometimes it's just
chance/luck, as Dennis Ritchie said about C).
The bottom line is that coding is work. It takes creativity, analytical
and critical thinking ability, and probably most important of all:
practice. IMHO, there aren't any shortcuts. You work and you get results.
As tedious as it is, I can think of several contexts where ASM is downright
required. Folks who think there is a magic bullet or shortcut seem to fall
into the same mistakes while calling them something new. Folks who resent
the work & sweat that others do to get those skills are generally the ones
who are screaming the loudest about how programming is really easy, but
it's the geeks who are just overcomplicating things and are making it "so
hard".
Then they or folks with the same mindset generally start talking about
Agile, XP, or some "methodology" that's going to somehow free them from
the basic fact that good experienced coders write the best code and
deliver. You can't simply iron on a methodology and turn a team full of
lazy or careless coders into something else. At best, you can catch more
of their errors and report on the ones that aren't being productive and
hope your management pays attention. I've worked under Agile and XP
regimes and I hate both with a passion. They were both a *huge*
productivity drag (ever actually tried "pair programming"?) and seemed to
me to be an effort to make business weasels feel more comfortable that
their coders were "communicating" and other social crapola they think is
important since most of what they do is sit around and run their mouths in
meetings all day. I'm sure some folks will disagree, but I've *actually
worked* under these schemes. In my experience (and that of the vast majority
of my co-workers), they were awful. It also seems to me that all the "greats"
(incredible coders) and software projects or companies I loved or
respected weren't "Agile". They simply hired the right people and got out
of the way. Give me the "wizard in a cave" methodology any day over "1000
H1Bs writing Shakespeare using Agile". Results matter more than
mollycoddled business majors' and project managers' feelings... uhh, IMHO.
Who would you want helping you finish your project, Dr. Jeff Sutherland or
John Carmack? Which do you think is going to get you there sooner and with
better results? I know how I'd answer...
This mentality I dislike is a bit like saying the reason you can't play the
violin is not that you don't practice and are too lazy to work at it....
noooo, it's that the violin is poorly designed, the wrong brand, and that
you aren't practicing in the right order with your head turned in the
proper direction. Yeah. Right.
I'm not saying the state of the art can't be improved. I only assert that
there are some strategies for doing so that seem flawed from the start
because they start with unrealistic (or downright silly) founding
principles.
-Swift