I'll Try One More Time About Biomed Training: It's About Job Skills, Not 'Critical Thinking'

Clearly, I'm suffering from instructor error here, but I'll try it one more time. Back in my parents' day, mothers told daughters, "Learn how to type," because you never knew if you might have to go it alone, and accurate, rapid typing was, at the time, a genuinely employable skill in demand. The thing to remember is that most biomed Ph.D.s won't wind up in academic tenure-track jobs, so, like it or not, employment prospects out in the real world matter. We're failing our Ph.D.s and post-docs if they wind up with the equivalent of English Ph.D.s, especially in light of the hundreds of thousands of federal dollars spent training each Ph.D. It's not that DrugMonkey is wrong to call for Ph.D.s to develop "an understanding of a literature in your own mind," or that Chad is wrong to argue for the development of "critical thinking skills."

And every job description does tell candidates they should have critical thinking skills--along with all the other hooey: being 'proactive', being able to work in teams, and so on (actually, I'm convinced many bosses and managers don't want too much critical thinking--it might endanger them). This includes job descriptions I've written.

And it's all horseshit.

Those job ads also call for specific experience and skills--and these skills are often technical and not learned on the job. Critical thinking doesn't really enter into it. It doesn't hurt, and when employers have to choose among several people, those nebulous critical thinking skills might even tip the balance. But that's about all they do.

This is where biomed Ph.D.s get hurt 'out there.' They are very well trained and, in terms of accreditation, overqualified, but they often lack the general technical skills (e.g., programming) that a variety of employers require. And employers will usually hire the person with those skills over the generally smart person.

This doesn't mean we shouldn't be concerned about the intellectual development of biomed Ph.D.s: I take very seriously the idea that Ph.D.s aren't highly specialized lab techs, but doctors of philosophy.

But, as they used to say, learn how to type.

An aside: I think the prospects for 'terminal' undergraduates and master's recipients are pretty good. Good biomedical workers are valuable. But you don't need a Ph.D. for those jobs (and typically, if you have one, you won't get hired).


So what are the skills that are going to be most useful? What is this generation's 'learn to type'? Is it programming? Is it still computer-related? What makes up 'general technical skills'?

I'm genuinely curious, and I'm hoping you can offer some more specifics beyond what you've already stated. Thanks!

You hit on one of my (many) pet peeves: the lack of distinction between "education" and "training." Lately "training" seems to be used to cover both, and we're losing out as a result.

The product of "education" is understanding. The product of "training" is skills. A doctoral program will, necessarily, include both -- a deep understanding of the field that allows the individual to see relationships and propose hypotheses someone less learned would miss, and a set of skills (if nothing else, how to research the state of the art and how to teach oneself).

Both are essential, in academia and in industry alike. Currently, we talk about the products of a Ph.D. program the same way that industry runs job searches: we focus on "training" even when we're looking for someone with understanding. I wonder to what extent the academic emphasis on "training" is the result of the pressure to support the institution through degree production. Dunno -- not my field.

I do know that in my industry, at least, as an interviewer it's as simple as this: even when we're looking for understanding, we assess skills -- because they're easier to assess. I can evaluate someone's skills pretty easily by looking at the work they've done: a lack of the necessary skills will simply kill the project. A lack of deep understanding, on the other hand, would require me to dig deeply into the actual mental processes that were involved, and quite possibly into the subtleties of the analysis they applied to the results -- and if nothing else, that's a honking sink of time I am (like all of us) far too short on already.

By D. C. Sessions on 04 Apr 2011

Just got back from my new job orientation. Here are the job skills that got me hired and ranked well above the folks fresh out of academia:

1. Solved several problems of direct relevance to the industry in question. These were all applied science problems requiring a generalist viewpoint, not basic science. When I was in grad school, this viewpoint was considered overly broad and lacking in depth. I was also considered Not A Real Scientist in academia because I was interested in applied science, and, by some academics, Not A Real Engineer because I worked largely from massive empirical data sets rather than calculating from first principles. In industry, this was a plus because it showed I was results-driven.

2. Previous work in industrial tech transfer through the various stages of IP development, all the way to product commercialization. In academia, tech transfer was something a secretary did--she transferred your paperwork to an office somewhere across campus, and then you were supposed to get on with life.

3. Field work. I did lots of field work, then analyzed my own samples, because we were perpetually short-handed in my last job. It was a big deal that I was willing to get my hands dirty and deal with real world problems in real time, rather than lab samples in a freezer.

I've seen other people get their first gig out of school consulting for BCG or some such. Other than that, yeah, you're sorta stuck looking for a lab that does a lot of industry-oriented projects and applied stuff. Those are thin on the ground; at least they were 5-10 years ago when I was in school.

I'll admit it, Mike, sometimes the instructor error does kick in--and the reader error too--because I wholeheartedly agree with you that employable skills are important, but I can't quite tell from your link to me whether you're reading my post as *advocating* that science PhDs be "the equivalent of English PhDs." Hopefully not; I specified that, in addition to writing, teaching, etc., programming and stats expertise are broadly transferable skills that grad students should try to obtain. So I think we agree. Right? :)

This is a bit more of a general point, but it has been bugging me for a while...

When the hell did it become normal/acceptable for an employer to assume the educational system is supposed to provide a pool of potential employees *trained* for their *specific* technical needs? And by "trained" and "specific," I'm talking about significant experience and/or certification in particular techniques, equipment, and/or software.

I remember a collection of computer programming job 'requirements' someone had put together in which the years of experience being asked for exceeded the number of years the systems/languages had existed. Yeah, a funny example of clueless HR/management types, but also a symptom of a particularly pernicious disease, IMO.

With respect to many (probably most) employers, there is no equivalent to learning to type... They want 160 WPM with 2 years on an IBM Selectric III, with additional qualifications in using small-font typeballs.