Getting ethics to catch on with scientists.

I've been flailing lately (most recently in this post) with the question of how to reconcile how science ought to be done with what actually happens. Amidst my flailing, regular commenter DrugMonkey has been doing what some might characterize as getting up in my grill. I'm inclined to view DrugMonkey's comments as pushing me to be clearer and more focused in setting out and attacking the problem.

For instance, in this post on the pros and cons of an ethics class aimed at science majors but taught by a philosopher (me), DrugMonkey comments:

The messenger is irrelevant. This is not the problem. The problem is the message of the "scientific ethics course". Nobody (or at least, very few people) starts off in science because it is a great place to cheat, fake data, sit on papers to rush one's own work out, etc. So most people know, at some level, what the ethical conduct is supposed to be. Therefore the "ethics" class which repeats "don't cheat" ad nauseam loses the audience.

The real question is why do otherwise well-meaning scientists start to slip down the slope that ends up with outright data faking and other bad behavior? And then continue to self-justify with all the usual garbage?

It is quite simple: cheating pays off in this biz, and one's chances of getting caught are minimal. Notice how the cheaters who actually get driven out of science seem to be the ones with a huge track record of multiple fakes? Notice how, when a lab is caught pretty much dead to rights with fakery, they just get away with saying it was a "mistake" or blame it on some postdoc who cannot (conveniently) be located or who vigorously protests the retraction?

Is this cynical? No, this is realistic. Does it mean that everyone cheats? No, probably it is still just a minority, but really, who knows? Much of modern bioscience is essentially unreplicable, in part because novelty is so revered. Until we get to the point where rigorous, meticulous, internally consistent, replicable, and incrementally advancing science is respected more than the current Science/Nature type of paper, all contingencies drive toward bad behavior rather than good behavior.

When ethics classes start to deal with the realities of life and career and the motivations and contingencies at work, well, then they will be relevant. It won't matter who teaches them...

My first reaction to this comment was, "DrugMonkey's preferred approach is how I actually teach my ethics course!" My considered reaction was, "It's time to go right to the heart of the problem and lay it out so clearly that people can't fool themselves about what's at stake."

Which brings us to something that will read a bit like a manifesto.

ASSUME:

  1. All scientists appreciate the need for honesty in reporting scientific findings and the wrongness of fabrication, falsification, and plagiarism.
  2. Despite (1), a certain (alarming?) number of scientists nevertheless engage in fabrication, falsification, and plagiarism with some regularity.
  3. A certain (even larger?) number of scientists are aware of the scientists who engage in fabrication, falsification, and plagiarism.
  4. The known bad actors seem to get rewarded, rather than smacked down, for committing fabrication, falsification, and plagiarism.*

What is to be done?

Here are the responses that probably don't do enough to change things:

  • Reiterate to scientists (in some mandatory ethics training) that fabrication, falsification, and plagiarism are wrong. See assumption (1): scientists already know these practices are wrong. Any scientists who don't understand the centrality of honesty to the project they're trying to accomplish as scientists -- building an accurate and reliable body of knowledge -- are unlikely to be persuaded by a further round of finger-wagging.
  • Cut off funding to scientists. Scientists can run whatever kind of community of bad actors they want, but they get to do it on their own dime rather than the public's. Surely this is an appealing response to those who value cutting government waste, but even the public might have need for some scientific knowledge. (It's worth noting, though, that the use of public dollars to support scientists who commit fabrication, falsification, and plagiarism is the justification for funding agencies requiring something like accountability to these ethical standards.)
  • Fed up with the bad actors and the lack of real consequences for their fabrication, falsification, and plagiarism, honest scientists quit in disgust. Please, not this one! This amounts to leaving the discipline in the hands of the bad actors. Given that the honest scientists were most likely drawn to science by their love of a certain kind of knowledge-building activity, they'd be letting go of that love because of the people who didn't love that knowledge-building activity enough not to cheat on it. That would be tragic -- plus, it would leave the body of scientific knowledge much worse off.
  • Fed up with the bad actors and the lack of real consequences for their fabrication, falsification, and plagiarism, honest scientists stay in science but distrust everyone. This is only a little bit better than the last option. Building an accurate and reliable picture of the phenomena in our world requires teamwork, collaboration, and cooperation. You can't have those without some level of trust.

And here are the two responses I think could actually make a difference:

  • Change the reward structure within the institutions where science is practiced to undercut the ability of bad actors to profit from their fabrication, falsification, and plagiarism. Of course, this option assumes you are in a place within those institutions where you have a say in what the reward structures are like. Such people exist.
  • Stop freakin' tolerating other scientists engaged in fabrication, falsification, and plagiarism, and other practices that undermine the honesty needed to do good science. Start at home, with your friends, colleagues, students, collaborators. Don't be an enabler or a bystander. This is your discipline the bad actors are hurting!

Some of the change, I hope, will come "from above" -- from deans and administrators and CEOs who understand what is at stake in setting up conditions where there's no good incentive for cheating. At least some of these people will remember what it was like to be a practicing scientist, and will still cling to their love of the kind of knowledge-building activity good science really is.

But a lot of the serious work of changing the culture of science must be done in the trenches. It must be done by scientists who give a damn, and who won't stand for someone scooping the heart out of an activity that matters.

It's time to choose a side.


I choose Drugmonkey's.

By Ian Findlay on 01 Jun 2007

In the professional world we have a different approach to your problem: it is called a "revocable professional designation", which is required in order to practice at a high level in your field. The problem with academia is that once a Ph.D. is awarded it cannot be taken away. So you need an Association, with appropriate structure and disciplinary processes, to govern the practice of the discipline.

As an example, virtually every senior engineer out there HAS to be a member of a regional Association of Professional Engineers. When they do work upon which others are expected to depend, they must mark that work with a stamp, issued by the Association, indicating that they are members in good standing. Should they misbehave, the ethics panel of their Association meets and determines whether to revoke membership. If this happens, the stamp is revoked and the engineer can no longer stamp reports and thus cannot carry out his/her duties as a professional. Similarly, as a Professional Biologist, should I misbehave I will lose my stamp and then I am kaput.

Change the reward structure within the institutions where science is practiced to undercut the ability of bad actors to profit from their fabrication, falsification, and plagiarism. Of course, this option assumes you are in a place within those institutions where you have a say in what the reward structures are like. Such people exist.

Part of the problem is that the most extreme of reward systems reward sexy results rather than good practice.

A solid scientist doing good work, working just as hard as anybody else, but who has the bad luck of working on two or three projects that come up with nothing other than upper limits and negative results during her pre-tenure years will get lukewarm (at best) reviews regarding her research "impact." Meanwhile, somebody who was lucky enough to choose the projects that come up with the paradigm-changing results will be heralded, given additional funding, given various society awards, and so forth.

If we could find a way to reward scientists for doing their science well rather than for being the ones who produced the sexy positive results, that would go a long way towards removing the incentive to fabricate data.

-Rob

So my reaction to the quote from your comments was something like, "Yeah, that's why my mandatory NIH-sponsored ethics class was a waste." I didn't really need a lot of awareness training -- which is what I got. I really needed a lot of tools. If I know that something is funky with this authorship case, to whom (and how) do I appeal without tanking my career? When authorship isn't clear, let's role-play negotiating it to everyone's satisfaction. What should you do when you can't replicate an experiment and you think the original results are a fluke (but maybe not)? What logic should you step through to determine reasonable compensation for participation in research? What if the compensation is non-monetary but highly desirable? When is privacy for subjects not the optimum thing (and how do you convince a review board of that)? How do you handle a cheating case that you can't prove?

Knowing that authorship, privacy, replicability, etc. are important is different from being able to handle them well in the moment.

By hypatia cade on 01 Jun 2007

Regarding Blair's comment above about the permanence of the PhD degree: Actually, that's not true. It probably doesn't happen often, but I did hear of a German institution stripping one of its graduates of his degree after he committed scientific misconduct while conducting research in the US. So yes, it can happen and has happened.

I see the same sort of responses in my introduction to philosophy and introduction to ethics students as DrugMonkey describes. Making students aware that doing X is wrong is often a waste of time since they already have those moral sentiments. The problem with their moral sentiments is that students do not value them. My students often deny my claims that they have certain ethical obligations; instead, they see ALL of ethics as being supererogatory, much like the position that Glaucon, in the second book of the Republic, says most people hold.

I thus see my job in teaching ethics to be getting students to value ethical behavior--to see it as more important than winning, getting stuff, making money, becoming famous, etc. In other words, I think my duty is to try to get students to think differently about ethics itself, before there is any point in discussing what is right or how to figure out what the right thing to do is.

If being ethical is unimportant, if ethical considerations are not more important than other considerations, ethics will almost always be overridden by those other considerations, thus making ethical knowledge irrelevant.

By Leslie C. Miller on 01 Jun 2007

bdf,

Here is the story you were thinking of (quoted from Chemical & Engineering News, June 17, 2004):

"Jan Hendrik Schöön, the former rising-star physicist who was fired from Lucent Technology''s Bell Laboratories in 2002 for falsifying research data, has been stripped of his doctorate. The University of Konstanz, in Germany, announced that although there is no indication of research fraud in connection with Schöön''s graduate-school work, it has revoked the Ph.D. degree that the university awarded Schöön in 1988.

In a statement posted on the university's website, physics department Chairman Wolfgang Dieterich notes that this type of punitive action is in accord with local law in Baden-Württemberg, the southwestern German state in which the university is located. The law provides for the possibility of withdrawing a university degree for improper behavior carried out after the degree has been awarded, he explained."

Please note that this was only allowed because that one jurisdiction had a law allowing it to happen. In most jurisdictions (including my own of BC) the applicable act does not include text allowing for a degree once granted to be revoked.

It is my understanding that a degree can be revoked if misconduct is discovered in the work that earned it, i.e., if you carried out misconduct in the research used to get your degree, they can revoke the degree. If you carry out misconduct after you graduate, they can't take away your academic credentials in most jurisdictions.

Okay, I'll bite.

With respect to your "grill", well, I can see where by "the messenger doesn't matter" I may have come across as a tad dismissive of your role. This was not my point and actually I'd be delighted to see a little more in the way of philosophers-of-science interacting with "us" across universities.

My perspective on science ethics training is shaped entirely by the process as we do it for ourselves. I've not had the pleasure of seeing the pros get engaged in the process. I imagine this is the experience of the bulk of scientists who have undergone "ethics training" experiences.

DrugMonkey, to the extent that you might have been perceived (by me) as getting in my grill, chalk it up to a reflexive identification I make with the "ethics people" -- but I agree with you, there are a lot of people who have a regulatory/legal/finger-wagging approach that turns people off.

By my reckoning, you and I are on the same side here. Your comments push me to be clearer about stuff that matters on the ground.

When you teach ethics to scientists in training, how much do you talk about the history of science?

I think most people enter science for altruistic motives, each of which could probably be categorized as 'improving society' or 'seeking the truth.' It's only along the way that we are introduced to financial gain, advancement requirements, prestige, funding pressures and the other incentives to lie, cheat and steal.

If we think about the development of empiricism and the scientific method, we can identify that the small number of people who were natural philosophers (there weren't even 'scientists' then) were in it for the altruistic reasons. Am I sugar-coating this to say that there was no motivation for scientific misconduct?

What I am getting at is that it might be useful to contrast the historical conditions surrounding the rise of the scientific tradition in explaining why integrity is such an issue today. We can all identify the pressures that lead to lapses of integrity, but fewer of us admit to temptations, and I think fewer still will (even anonymously) report acts of falsification or plagiarism.

Maybe it's altruistic to appeal to Tradition and Foundations. I think if more people held honesty as the scientist's prime directive, we would not see as many problems with scientific integrity.

I believe that appealing to the ethical principles most future scientists (grad students) already have will not solve the growing problem of scientific misconduct. The percentage of misbehaving people among scientists is not different from that of misbehaving people in the general population. Those in the first group may be more sophisticated than those in the second group; however, the reasons for misbehavior in both groups are probably similar. The main difference between misbehaving scientists and misbehaving people in the general population is the system and the rules for handling them. It is clearly easier to misbehave in science without being caught or without being punished in proportion to the gravity of the misconduct. Misbehaving scientists do not have to worry about ethics police; they do not have to worry about heavy-handed judges in a strict court system; they do not have to worry about fellow scientists boycotting them when they are caught misbehaving.

Only when unethical behavior in science is punished harshly, both by well-behaved scientists and by society in general, can we begin to change the tide of misconduct that afflicts our vocation today. As long as scientific misconduct is looked upon as a traffic violation rather than a rape, the tide of scientific misconduct will not abate.

As a member of the Society for Neuroscience, I filed a complaint with the society's officials (all very prominent scientists) 9 years ago about scientific misconduct and violation of the rules of the society by one of its members. No action whatsoever was taken against this member. When I pressed on, I received a letter from the society's secretary on an official letterhead explaining to me that despite the society's rules, getting involved in legal haggling over this matter was too expensive and beyond the society's financial capabilities (this society is one of the richest in the country; three years ago it invested millions of dollars building its own five-story headquarters in Washington, DC). You can find the details of this case here:
http://www.brownwalker.com/book.php?method=ISBN&book=1581124228

By S. Rivlin on 02 Jun 2007

Academic research is plagued by a lack of consequences for those who are in charge. This lack of consequences has led to many important areas of the research process being neglected; ethics is one, mentoring is another. Because of the hierarchical structure of the academic system, the one who exposes a problem faces more severe consequences than the one who is doing something wrong: students and post-docs can only be whistle-blowers if they are willing to sink their own careers, and those of everyone in their lab, for the chance that their advisor might get in trouble. I believe this is the root of the problem.

First, I would echo Blair's comments about self-regulation and sanctions. Pretty much every profession has some sort of written code of ethics, and a mechanism by which an individual can be judged against it, independently of the employing organisation, and sanctions imposed. There seems to me to be no prima facie reason why scientists couldn't, or shouldn't, or don't, have the same.

Second, I would like to know, out of the population of cheating scientists hypothesised by DrugMonkey, what proportion straightforwardly and deliberately set out to cheat (because of the perceived rewards) and what proportion gradually got sucked into it, probably because of a passionate belief in the correctness of their own hypotheses and an end-justifying-the-means frame of mind. And/or because they did a tiny little cheat once and no-one noticed, so neatening up data in a 'harmless' way gradually became a habit, etc. My bet is the second category is a lot larger; this would suggest that ethics classes might include some of the considerable psychological literature about people's ability to fool themselves, and some practical suggestions about how to double-check that you aren't.

By potentilla on 03 Jun 2007

Potentilla,

Are all criminals born criminals? I believe that criminals are sucked into crime for many different reasons, and the majority of these reasons, at least to begin with, are minor or insignificant. For years scientists have foolishly believed that they have a code of honor without which science cannot work. We also continue to believe that somehow the scientific method will root out the cheaters. This is of course not true. The only things that will be rooted out, sometimes many years after the cheater is gone, are his/her fake results. Stolen projects, plagiarism, and small nuances of fabrication and falsification are never discovered. Misbehaving scientists, just like regular criminals, become more daring as their misdeeds remain hidden and unknown. As the (cheating) scientist's name gets bigger and money and power accumulate, he/she is better equipped to continue to cheat and better hide that cheating. The word of a big-name scientist is more trusted and believed than the word of a disgruntled graduate student or postdoc. Many of the rules and tactics of the mafia are also those of the scientist godfather.

I know all this sounds somewhat farfetched, but seeing it and fighting it continuously for the past 10 years, I can attest to and prove all of it. Unfortunately, indicting cheating scientists for tax evasion will not be their downfall.

By S. Rivlin on 03 Jun 2007

The difference between engineering and some of the examples from science (senior person blames it on grad student or post doc) is that engineering punishes the person who signs the plans, regardless of who made the mistake. Thus, in the famous case of the collapse of the walkway in the KC Hyatt, an error made by a junior field engineer resulted in the most senior engineer in the company losing his license. And with cause: the work was accepted based on his reputation and he failed to check the work and detect the error.

The attitude of senior scientists about what goes on in their lab would change if that rule applied in science.

By CCPhysicist on 04 Jun 2007

S. Rivlin - sorry, I don't really grasp what point you're making in your reference to tax evasion.

Many professions are self-regulatory; that is, to be allowed to call yourself an X-accountant, you have to jump through whatever educational hoops the X-accountancy institute specifies, and also adhere to their code of ethics and professional behaviour. Anyone can complain to the X-institute if they think you have breached the code (including not supervising your juniors, as CCPhysicist mentions), and the X-institute will investigate, and may take away your right to call yourself an X-accountant. This may be a caveat emptor issue (i.e. you can employ a not-X to do your books if you really want to) but, particularly in respect of public organisations, there is often a law saying that you can ONLY employ Xes in certain roles.

A variation on the above is where the rules of competence and ethics are set by a statutory body of some sort, which can legally prevent you from doing certain kinds of work at all if it wants to (subject to some sort of appeal process). This is common in financial services. For instance, in the UK, you can only work as a fund manager if you have approval from the FSA; if the FSA withdraws its approval and your tribunal appeal fails, then if you do any of the things falling under the definition of working as a fund manager, you are breaking the law (and you probably won't even get that far, because no-one will employ you, because if they did THEY would be breaking the law).

Your employer can't in any way stop these processes happening. It may also discipline you internally. It may itself also be disciplined by the external body (statutory or self-regulatory), for instance for not supervising you properly.

This is all serious stuff for professionals. People regularly get their livelihood taken away, just for being careless let alone deliberately fraudulent.

AFAIK, scientists on the other hand are only subject to internal disciplinary processes, by their employers, who are conflicted from the word go because their institutional reputation is at risk.

By potentilla on 04 Jun 2007

CCPhysicist: This would radically reshape modern bioscience. There are two areas in which the PI (principal (or principle as some have it) investigator; lab head) is potentially unable to oversee cheating.
First is the professional independence extended to the scientific staff, wherein it is insulting in many cases for the PI to micromanage every little experimental result. Of course micromanagement would permit the additional oversight... but at what cost in time wasted? In reshaping the way postdocs and grad students are culturally assumed to be participating scientists, not just pairs of hands? The point is that there are costs, and reshaping the whole scientific enterprise to prevent the minority(?) of cheats may not be a good idea. Examples of what can be accomplished when vibrant, really independent (and good) postdocs are around, versus what is(n't) accomplished with dull tell-me-what-to-do and/or phone-it-in postdocs, show me that micromanagement is not the way to go.

Second, with the current trend for big labs bringing many models and approaches to bear on one topic, well, PIs are increasingly incompetent to micromanage the research. They don't really understand the day-to-day benchwork of each and every assay that ends up in the high-impact paper. There are many implications here. This gap in understanding could facilitate turning a blind eye to fakery. It also suppresses the potential development of the smaller, more honest PI's lab because of fear of bringing in a postdoc to add a new component to the research group (how do I know they are any good? how will I know they aren't faking data? ah, screw it, not worth the risk). Maybe BigScience isn't the way to go and we should be retreating to a smaller lab-group model? There are arguments on both sides of this one.
However, with respect to your most essential point, yes, I'd like to see a little less facile blaming of the peons. In cases of formal investigation it should be asked routinely whether a given PI fosters a climate in which fakery is encouraged. In paper corrections/retractions/errata, I think the corrections should be re-reviewed, preferably by the same reviewers. This latter point is important because we could probably use a system by which papers are retracted not for outright cheating but because a paper only rose to the level of being accepted in a high-falutin' journal because of the original package, and the "erratum" or "correction" means that it would never have been accepted in the first place. This would start decreasing at least one motivation for the PI to turn a blind eye.

potentilla,

I apologize for my weird sense of humor. The tax evasion remark was in reference to Al Capone. We are in agreement on most of the issues. However, most scientists will fight tooth and nail against installing an ethics police force, since they believe that ethical behavior is intrinsic to being a scientist. Of course, we are all searched before boarding a flight at the airport, although the great majority of us are not terrorists. Eventually, scientists, too, will have to face the grim reality that their vocation is contaminated with bad apples and the only way to get rid of them is to check the whole basket.

By S. Rivlin on 04 Jun 2007

Ethics police force sounds a lot like moral majority, and I sure don't like the sound of that. I do hope we scientists manage to get our skit together well enough to avoid ever having to resort to that kind of enforcement. Grim reality is not inevitable. Just as empowered citizens can stop their government from curtailing their civil rights on the pretext of protecting them, so we can do something about the situation if we start giving a damn.

When the rules of a fair game are clear and known, policing the players is no problem. When the rules of the game are not clear, or when they change continuously, rule breakers get away with winning the game unfairly. In most academic institutions today, the referees of the game are administrators who do not give a damn about science itself. For them, the bottom line and their own positions are of greater importance; for them, protecting the reputation of the university they are in charge of is the number one task, even if this means covering up scientific misconduct. Read this book and you'll see:
http://www.brownwalker.com/book.php?method=ISBN&book=1581124228

By S. Rivlin on 04 Jun 2007

Getting ethics to "catch on" with scientists? I think you're presuming just a bit, here.

By Caledonian on 04 Jun 2007

iGollum,

You shouldn't think of ethics boards for professional organizations as "ethics police"; to keep with the constabulary terms, consider them more like a coroner's inquest. The point isn't to go out and look for crimes (like a police force) but rather to evaluate failures (or complaints) to determine if negligence or unethical behaviours were contributing factors.

In an academic sense you could think of it as another form of peer review. Scientists are used to submitting to peer review for their papers, and all your profession's ethics committee does is serve as a peer review of your professional conduct in cases where problems have been brought to its attention. Only in the case of ethics reviews, you (as the subject of the review) get input throughout the process and a chance to defend yourself before the review is complete.

As a registered professional I am comfortable in the knowledge that any substandard work with my name on it will be reviewed. Because of this I WILL NOT sign my name to work unless I know that the work was done correctly and meets the standards of my association. As for Drugmonkey's comment, any PI who knowingly adds her/his name to a paper without being able to demonstrate the competence to understand the underlying work should find alternative employment.

Regards,

Blair, you conveniently illustrate the point. It is all very well and good to take the high-horse approach to what "should" be the case. If such an approach does not match reality, it is going to be dismissed by the target audience. My overall point is that to effect change we need to get down and grapple with the realities of what IS, not what one thinks should be the case. Modern bioscience is a collaborative effort. I can make numerous cases for papers in which the science is great but could never have been conducted under strict rules of "every contributor must be expert in every other assay/data/model reported in the paper".

As long as the final decision of whether or not to pursue an investigation of scientific misconduct is in the hands of administrators rather than in the hands of scientists, uprooting scientific misconduct will fail.

Thanks, Blair, for the clarification. Didn't realize there were external regulations that factored into it.

An additional ethics question that irks me is the principal investigator who is having a relationship with his postdoc or student. This was the topic I found most useful in my ethics course. One example, not in the US, has resulted in a woman moving ahead in science at a very unusual rate by any standard. She quickly got a job at a nearby institute and very quickly became head of a cancer group at the university. But she continues to publish with her mentor, and he reviews all her ongoing work, showing a lack of independence. I guess everyone wishes they had such support. Just another example of a field in which bad behavior does get rewarded.

By Claudia Karen on 28 Jul 2007

Congrats on getting selected for OpenLab2007, Janet!

(And dang, I'd better watch the comments; I didn't know it would be, er... elevated to such heights.)