I agree that whether a student learns has much more to do with the student
than with what in particular they're studying.
I quit my undergraduate physics degree when I had a moment of clarity that
even if I managed to squeak through my Partial Differential Equations class
with a C (I did), I'd still be on a trajectory where _solving PDEs was what
I would be doing with my life_.
My undergrad degree is in Ancient Mediterranean Civilizations. My MA is in
History (but, hey, the History of Computing). I dropped out of the Ph.D.
program I was in for a variety of reasons (crushing depression probably
chief among them, to be honest), but also because I was a fairly decent
practitioner, and I had more fun playing with computers than I did writing
stories about people who'd played with computers a couple generations
before I had. It being the late 90s, my job prospects were decidedly
better on the Not A Professional Historian side of the fence.
That was 22-ish years ago. I'd been making beer money all through college
and grad school with IT jobs, and I've stayed in IT-related fields ever
since. Consulting, systems administration and engineering, these days
software development-but-also-devops. My lack of appropriate degrees
probably didn't hurt only because I started a not-unsuccessful consulting
business with my mentor after I quit grad school, and by the time I was
ready to move on from that, I had enough years of broadly varied experience
under my belt that it didn't really matter.
But that's tangential. The actual point was: the fuzzy stuff is only
contemptible if you've got Physicists' Disease. History is hard, and it
has a lot more in common with debugging than is obvious at first glance.
In both cases you are presented with "Here's what happened," and it's your
job to figure out "why?" In both cases, the ability to break the end-state
down into a set of much smaller components which contribute to it is
key--and although that ability probably _does_ have a lot to do with innate
personality or preference, it's also very, very much a learned (and
trained!) skill. The thing with debugging is that you usually are afforded
the opportunity to repeat the experiment while changing parameters. With
history, you're not so lucky, and thus you never _really_ know the root
causes (you also usually don't know "what really happened", but by
cross-referencing your sources you may be able to emerge with some sort of
decent-consensus guess), but you can make more or less plausible and
persuasive hypotheses about them. The idea that you are interrogating a
recalcitrant witness is common to both history (and I would guess many of
the humanities and social sciences) and creating and maintaining working
software.
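
To make the "repeat the experiment while changing parameters" point
concrete, here's a toy sketch (all names hypothetical, not drawn from any
real system): a debugger can re-run the same failure while varying one
input at a time, which is exactly the loop a historian never gets.

```python
# Toy illustration (hypothetical names throughout): debugging lets you
# re-run the "same history" while changing one variable at a time.

def system_under_test(parser: str, payload: str) -> str:
    """Stand-in for the misbehaving program we're investigating."""
    # Hypothetical bug: the v2 parser chokes on trailing whitespace.
    if parser == "v2" and payload != payload.rstrip():
        return "crash"
    return "ok"

def experiment() -> None:
    """Vary one parameter per run and record each outcome."""
    for parser in ("v1", "v2"):
        for payload in ("hello", "hello "):
            outcome = system_under_test(parser, payload)
            print(f"parser={parser} payload={payload!r} -> {outcome}")

experiment()
# The grid of outcomes isolates the failing combination (v2 plus a
# trailing space): the kind of controlled re-run history never offers.
```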
I'll grant that it's _harder_ to surf blithely through an engineering
degree than it is through a liberal arts degree, mostly because there are
less-subjective metrics (particularly in lab courses) as to whether you
actually learned something about what you're supposed to be doing. If I
were being snarky, perish the thought, I'd say that the engineers who
didn't really want to understand what they were doing were pre-meds.
Nevertheless, a degree--and particularly an advanced one--is indeed much
more about the discipline to put your head down and swallow what's put in
front of you than about smarts.
I was told a couple of decades ago I'd regret not having stuck it out for
my Ph.D. I'm still waiting for that regret to kick in; in the meantime,
many others have come and gone.
Adam