Buy-in and finger-wagging: another reason scientists may be tuning out ethics.

I was thinking some more about the Paul Root Wolpe commentary on how scientists avoid thinking about ethics, partly because Benjamin Cohen at The World's Fair wonders why ethics makes scientists more protective of their individuality than, say, the peer-review system or other bits of institutional scientific furniture do.

My sense is that at least part of what's going on here is that scientists feel like ethics are being imposed on them from without. Worse, the people exhorting scientists to take ethics seriously often seem to take a finger-wagging approach. And this, I suspect, makes it harder to get what those business types call "buy-in" from the scientists.

The typical story I've heard about ethics sessions in industry (and some university settings) goes something like this:

You get a big packet with the regulations you have to follow -- to get your protocols approved by the IRB and/or the IACUC, to disclose potential conflicts of interest, to protect the company's or university's patent rights, to fill out the appropriate paperwork for hazardous waste disposal, etc., etc. You are admonished against committing the "big three" of falsification, fabrication, and plagiarism. Sometimes, you are also admonished against sexually harassing those with whom you are working. The whole thing has the feel of being driven by the legal department's concerns: for goodness sake, don't do anything that will embarrass the organization or get us into hot water with regulators or funders!

Listening to the litany of things you ought not to do, it's really easy to think: Very bad people do things like this. But I'm not a very bad person. So I can tune this out, and I can kind of ignore ethics.

The decision to tune out ethics is enabled by the fact that the people wagging the fingers at the scientists are generally outsiders (from the legal department, or the philosophy department, or wherever). These outsiders are coming in telling us how to do our jobs! And, the upshot of what they're telling us seems to be "Don't be evil," and we're not evil! Besides, these outsiders clearly don't care about (let alone understand) the science so much as avoiding scandals or legal problems. And they don't really trust us not to be evil.

So just nod earnestly and let's get this over with.

If ethics is seen as something imposed upon scientists by a group from the outside -- one that neither understands science, nor values it, nor trusts that scientists are generally not evil -- then scientists will resist ethics. To get "buy-in" from the scientists, they need to see how ethics are intimately connected to the job they're trying to get done. In other words, scientists need to understand how ethical conduct is essential to the project of doing science. Once scientists make that connection, they will be ethical -- not because someone else is telling them to be ethical, but because being ethical is required to make progress on the job of building scientific knowledge.

Aren't ethics always imposed externally? I mean, isn't what scientists see as ethical in large part dictated not only by universal morals/ethics (which may be internalized), but also by society's current views on what is good and ethical?

I remember the ethics class ("class" is a bit generous, it was about 2 hours a day for 3 days) I took as a requirement in grad school, and I've always thought that it was something of a wasted opportunity. After going through the packet of regulations, much of the class involved descriptions of really outrageous, over the top examples of fraud.

My favorite was a story of a scientist who immigrated to the US from Romania (if I'm remembering correctly) and who for years was translating papers from a Romanian journal into English and publishing them in western journals under his own name. He was finally caught when another Romanian scientist reading those same journals recognized his own paper with someone else's name on it.

Stories like that are fun to listen to, but if you're already crooked enough to commit blatant fraud, hearing someone say "...by the way, don't fabricate all your data" probably won't make any difference.

The time would have been better spent on examples of "slippery slope" situations in which a basically well-meaning person might nonetheless drift over the line before realizing they've done so.

I think you really nailed it this time. That is exactly the sentiment voiced by scientists who are forced to attend these seminars -- including myself.

"being ethical is required to make progress on the job of building scientific knowledge"

I would argue that this is only true because, if you want to build scientific knowledge, you must rely on the support of the community (society) that decides what constitutes "being ethical", not because there is any deeper connection between "ethical" and "scientific".

Surely honesty (which I take it falls under "being ethical") is necessarily connected with building an accurate and reliable picture of the world (which is what science is trying to do).

My experience is that most people don't know how to approach questions of ethics in the first place. They compare good outcomes of one choice with bad outcomes of the same choice, or good outcomes of one choice with bad outcomes of the other. Ethical questions can't be quantified, but qualitative decisions can too easily be justified away when you compare apples with oranges this way. I tell my students, or others whom I've advised on ethics, to make a list of good outcomes and bad outcomes for each of the two aspects of the choice they have to make, and then compare good with good and bad with bad in a reflective manner. A decision that actually seems right to them tends to arise naturally, and this usually also reflects the societal norms of right and wrong.

Ralf

I reckon one of the problems with scientific ethics is that many of the guidelines are not falsifiable, and do not appear to have been derived using a scientific method. As professional scientists, it is our job to distrust anything that has not been subjected to experimental verification.

There's also the question of "agenda", as per Bruce Schneier. When ethics are handled by groups from within science, their agenda is as you (Dr. FR) say. When it's coming from outside groups, the agenda all too often is to "get those scientists under control". Especially under the current administration, where any enforcement from "outside" is more than likely to be politically biased.

By David Harmon on 26 Jun 2006

Philosophers have to accept our share of the blame here. We are presented in today's popular discourse with two models of ethical argumentation: (1) the absolutist, moralizing, I'm-right/you're-evil jerk, and (2) the shoulder-shrugging, who's-to-say, everyone-is-entitled-to-his-own-opinion, relativist buffoon. Most people look at moral discourse and figure they either have to commit to being a jerk or a buffoon, or just take a pass on the whole game. In the division of intellectual labor, part of our job is stepping up and providing another model of discourse: an intelligent, thoughtful approach that can come down hard on fallacy-ridden nonsense, but still allow for the existence of moral doubt and provide clarity in difficult cases. I think that our lab-coated colleagues are responding in part to our failure.

Steve G, I should have added you to my blogroll earlier, but that comment clinches it. There's a clear parallel here, to my mind, with the way that scientists complain about a scientifically illiterate public, and yet do little or no outreach, or even "public intellectual" style communication. I wonder whether part of the answer, in both cases, might be blogs?