First let me say "Wow, what a thoughtful and detailed response." We might
not agree, but at least you are civil about it and I appreciate that.
On Mon, 25 Apr 2016, Brian L. Stuart wrote:
> Certainly a lot of people do view it that way, but it's not what I was
> getting at or how I see it. Based on my experience, the virtues of any
> single language are pretty much irrelevant.
I couldn't agree more. Learning patterns and algorithms is much more
helpful since they can be broadly applied. For example, I learned
Quicksort in Pascal, but I've implemented it in C many times, without
having to refer back to any source.
> To tell you the truth, I'm not very likely to hire anyone who isn't
> conversant with at least half a dozen different languages.
When it comes to hiring experienced coders, I couldn't agree more. I get
resumes sometimes where the person focuses only on one language. Those
generally migrate to the bottom of the stack. I code in eight languages
myself. Three of those I'm pretty strong in; the rest I either get by in or
used to be good at but have since faded. Still, there is certainly a process
of discovery every time I learn a new one, and I make sure to learn new
languages on a regular basis. Sometimes I don't use them and forget them;
other times I throw them away with burning hatred. Either way, I almost
always learn something new.
> I did understand the point in your first message to be anti-"Ivory Tower
> Academics." However, my point is that viewing the people you have
> identified as such and dismissing their experience and expertise is a
> narrow-minded and short-sighted perspective.
It's probably a bad idea to dismiss anyone's experience when you haven't
"walked a mile in his moccasins," including mine. Though my attempt may
have been inarticulate, I was talking about my own experience in academia
and not trying to pick a fight with every LISP coder on the planet. If I
were more clever, I'd have had the foresight to simply say
$academic_only_language instead of using the pit-bull attack trigger word:
LISP.
> It's interesting that you pick that age as the example. My daughter is
> 23. For her, the undergraduate experience wasn't about a job at all. It
> was about exploring the intellectual world
Well, I wish I had such a view of the world. However, the urgency of making
a living without parental support (and there was zero for me) was paramount.
I didn't want to graduate with many thousands of dollars of student debt so
I could explore the intellectual world (no sarcasm intended). Personally, I
find it much easier to do that on my own because without the weight of the
grade or test, I can just relax and learn what I want, not what someone else
thinks is important for me. Your daughter will likely have the benefit of
profs who are delighted to teach her since she sounds delighted to learn.
> and (to borrow from Thoreau) sucking the marrow out of that life.
I've read Walden, and I have to say, Thoreau isn't a big fan of academia,
either. He rails against the universities and teaching methods of his day.
> In the interest of full disclosure, however, I should point out that she's
> not typical of most college students (although I wish more were like her).
As cool as her attitude is, no, she's not at all typical in my experience.
> She did grow up in a household that averages more than two degrees per
> person and she did triple major in her four undergrad years. (A proud
> daddy can't help but brag a little. :) )
Her experience growing up with you as a dad probably greatly influenced what
she thought she *should* get from a university. I didn't have that kind of
upbringing.
> I apologize if I misinterpret, but I also detect the suggestion that they
> are supposed to. I don't disagree that they don't train, but I do
> disagree that's what their purpose is.
I don't disagree that they are not here to train, but I sure as heck think
they ought to consider the value of doing both simultaneously, or focusing
*some* on training. At least in the USA, the middle class is shrinking.
That factors into the "urgency" you mention; there will only be more of it.
The skyrocketing costs also jaundice the eye of folks who listen to this
kind of debate. That is, if you have to pay a fortune, *some* ROI becomes
a concern.
> I'm not suggesting that some degree of training coming along with the
> education is a bad thing. However, I'm saying that's not the primary
> purpose of the university.
I would totally agree. If I had my way, universities would be very different
places. However, it's not my way, and I don't have to like it.
> In particular, if I have a candidate sitting across the interview desk
> from me, I'm not interested unless they have both education and training.
You'd miss a lot of stellar folks, then. Only about half of the "genius"
programmers I've known had a degree in anything related to CS (or a degree
at all). Not to mention that by this standard, many famously intelligent or
talented folks who had training but little or no university education
wouldn't get the nod.
> I expect the education to come from a formal environment where people of
> long experience can help the student understand many perspectives.
Sounds like the old "well rounded" argument for college. However, what I
think you aren't seeing is the intersection of nasty economics with this
purified idea of what a university should be about. They are on a collision
course, and I'd point to the bifurcation of universities already underway:
you have your office-building university vs. your Greek-frat campus
university. You seem to be advocating for the latter.
> I expect the training to come from self-directed experience. Unless a
> candidate shows both the ability to work in a rigorous intellectual manner
> and the self motivation to go beyond what they've been given, I'm not
> interested.
I've been personally involved in hiring around 200 people during my career.
I got stuck doing many interviews when I worked at IBM and Oracle (about 10
years between the two). Personally, I couldn't have given a rat's behind
about their education. I only cared whether they could pass my tech screen
and get through the panel-based technical interview. Nobody wants to coddle
a sub-par IT guy while everyone else struggles to shoulder their share, no
matter how well rounded he is.
> However, more often than not, the ideas that were seen as "new" in the
> '80s, '90s, '00s, and '10s, are really ideas that the Computer Science
> community saw, studied, understood, etc. in the '50s, '60s, and '70s.
Agreed, but taken to an extreme, it's still not helpful. Using material
that's experientially relevant to the student should still matter.
> As it turns out, I am currently involved with a restructuring of the
> introductory programming sequence at one university.
Then, may you choose well.
> No one or two languages will give the breadth and depth needed
> pedagogically.
Perhaps, but your students might get ahead of someone else as they exit the
university because you taught them something more commercially relevant,
even though that's not what you see as your job.
> Neither will any one or two languages suffice for building a career as a
> computer scientist.
Most don't want to be computer scientists; they want to be IT managers,
coders, DBAs, and sysadmins, because those are the bulk of IT jobs. Just
keep in mind that few will ever get to become a "Computer Scientist,"
unless you mean that in a very general sense.
> There are expected minimums, certainly. Based on my experiences in both
> academics and industry, I would have my doubts about whether someone is
> really cut out for a CS major if they can't average Bs in their major
> classes.
That seems reasonable, as long as the program was fair and the students were
engaged. When those things go off the rails, as you hear about with some
for-profit universities, it doesn't seem as cut and dried.
> seriously, I need to see some evidence that learning new things is a
> higher priority to them than the grade.
I look for signs that they have a true love for tech or tinkering. If they
couldn't be kept from taking things apart as kids, that's a good sign. So
is having read technical things as kids and lusted after or near-worshiped
some type of machine, code, etc. A love of learning combined with an
"applied" character is perfect.
> I teach is to help each student move as far along that continuum
> as they are capable.
Too bad I didn't have more professors with that attitude.
> Many would classify that perspective as "Ivory Tower." They might say, I
> can't put food on the table with that attitude. I would differ with that.
That's not something I would say. My definition of Ivory Tower is academics
refusing to recognize that reality has superseded their expectation that
the world conform to the breadth of their ability to teach it, or willfully
refusing to teach anything practical simply to stay "meta." You can do
both, I think, and that's not an unreasonable expectation.
> Throughout this, it has not been my intention to in any way dismiss your
> perspective or to suggest that it is a "wrong" perspective.
Thanks.
> Indeed, it's a perspective I'm quite familiar with. Instead, my objective
> has been to suggest there's another perspective whose consideration might
> lead to deeper understanding.
Duly noted.
> It is a perspective which attempts to temper the immediacy of the question
> of tomorrow's employment with the longer-term view of how that employment
> fits into the thousands of years of human civilization.
You bring the torch for civilization. I'll hunt us up some grub. Maybe we
both can live in the world, eh?
-Swift