What is 'Investigative Science Journalism'?

Background

When Futurity.org, a new science news service, was launched last week, there was quite a lot of reaction online.

Some greeted it with approval, others with a "wait and see" attitude.

Some disliked the elitism, as the site is limited only to the self-proclaimed "top" universities (although it is possible that research in such places, where people are likely to be well funded, may be the least creative).

But one person - notably, a journalist - exclaimed on Twitter: "propaganda!", which led to a discussion that revealed the journalist's notion that press releases are automatically suspect and scientists are never to be trusted and their institutions even less. That was a very anti-science sentiment from a professional science journalist, some of us thought.

This exchange reminded me of a number of prior debates between the traditional Old Media journalists and the modern New Media journalists about the very definition of 'journalism'. The traditional journalists are fighting to redefine it in the narrowest possible way that keeps them in the position of gatekeepers (like the newly proposed shield law that defines a journalist as someone who gets paid by an Old Media organization, thus NOT protecting citizen journalists, accidental journalists, bloggers, etc.), while the new ones are observing the way the world is changing and trying to come up with new definitions that better reflect that world (and often go too far in the other direction - defining everything broadcast by anyone via any medium to an audience of more than one person as journalism, including the crossword puzzle in a newspaper and the silliest YouTube video).

One of the frequently heard retorts in the "you'll miss us when we're gone" genre of defensiveness by the old guard is the sleight-of-hand in which they suddenly, in mid-stream of the discussion, redefine journalism to mean only investigative journalism. This usually comes up in the form of the "who will report from the school board meetings?" question (to which the obvious answer is: "actually, the bloggers are already doing it a lot, as the old media quit decades ago").

Of course, investigative journalism is just one of many forms under the rubric of 'journalism'. And, if you actually go and buy a copy of your local newspaper today (it still exists in some places, on tree-derived paper, believe me), you are likely to find exactly zero examples of investigative journalism in it. Tomorrow - the same. Every now and then one appears in the paper, and it is often well done, but the occasions are rare and getting rarer, as investigative reporters have been cut from many a newsroom over the past few decades, and even more rapidly over the last several months.

So, what is 'Investigative Science Journalism'?

So, this train of thought brought me, again, to the question of what 'investigative journalism' in science is. And I was not perfectly happy with what I wrote about this question before. I had to think some more. But before doing all the thinking myself, I thought I'd try to see what others think. So I tweeted the question in several different ways and got a lot of interesting responses:

Me: What is, exactly, 'investigative science reporting'?

@davemunger: @BoraZ To me, it means going beyond looking at a single study to really understand a scientific concept. Diff from traditional "inv. journo"

@szvan: @davemunger @BoraZ And looking at methodology, statistical analysis, etc. to determine whether claims made match what was studied.

@LeeBillings: @BoraZ Re: "investigative science reporting," isn't it like all other investigative reporting where you dig deep and challenge your sources?

@Melhi: @BoraZ I thnk it means, "we cut/pasted from Wiki, all by ourselves." Seems to be what it means when "scientific" is removed from the term.

Me: @LeeBillings clarify: What's the story about? dig deep into what? who are the sources? why are you assuming they need to be challenged?

@soychemist: @BoraZ Any instance in which a reporter tries to uncover scientific information that has been concealed or distorted, using rigorous methods

@john_s_wilkins: @BoraZ Reporting on investigative science, no doubt.

@LeeBillings: @BoraZ ?s you're asking only make sense in context of a specific story, not in context of defining "sci investigative journalism" as a whole

@LeeBillings: @BoraZ 1/2 but typically, the goal is to find out what's true, and communicate it. you dig into primary literature & interview tons of ppl

@LeeBillings: @BoraZ 2/2 you don't assume they need to challenged. you *know* they need to be challenged based on your in-depth research into primary lit

Me: When futurity.org was released, a journo yelled "propaganda"! Does every press release need to be investigated? Challenged?

Me: Are scientists presumed to be liars unless proven otherwise? All of them?

@NerdyChristie: Usually. Unless you're studying how herbal tea makes you a supergod. RT @BoraZ: Are scientists presumed to be liars unless proven otherwise?

@szvan: @BoraZ Not liars but not inherently less open to bias than anyone else. Some wrongs are lies. Some are errors.

Me: Are journalists capable of uncovering scientific misconduct at all? All of those were uncovered by other scientists, I recall...

@lippard: @BoraZ Didn't journalist Brian Deer do the investigative work to expose Andrew Wakefield's MMR-autism data manipulation?

@JATetro: @BoraZ To be honest, there are some very good journalists out there who can spot misconduct but without backing from a source, it's liable.

Me: @BoraZ: @JATetro yes, they need scientists to do the actual investigating, then report on what scientists discovered - fraud, plagiarism etc.

@JATetro: @BoraZ So it's not the journalists fault, really. They do their job as well as possible but without our help, there's little they can do.

@LabSpaces: @JATetro @BoraZ Actual scientists cost too much.They're a luxury, and especially in these times, it's hard for pubs. to justify having 1

@JATetro: @LabSpaces @BoraZ Apparently it's hard for universities to have them as well...not a prof or anything but damn it's ridiculous.

@LabSpaces: @JATetro @BoraZ I dunno, our PR dept. does a great job interacting with scientists and getting the right info out, but I guess that's diff.

@JATetro: @LabSpaces @BoraZ Oh, the media people at the U are great. It's the administrators that seem to forget who keep the students comin'.

Me: Isn't investigating nature, via experimentation, and publishing the findings in a journal = scientific investigative reporting?

@LeeBillings: @BoraZ 1/2 I'd say that's performing peer-reviewed scientific research, not doing investigative science journalism.

@LeeBillings: @BoraZ 2/2 No room to address your ?-torrent. What are you driving at, anyway? You think sci journos can't/don't do investigative stuff?

@LouiseJJohnson: RT @BoraZ Isn't investigating nature, via experimentation, & publishing findings in a journal, scientific investigative reporting?

@mcmoots: @BoraZ "Journalism" usually means you report the results of your investigations to the public; scientists report to a technical community.

Me: @BoraZ: @mcmoots does the size and expertise of audience determine what is journalism, what is not? Is it changing these days?

Me: @BoraZ: Why is investigating words called 'investigative journalism', but investigating reality, with much more rigorous methods, is not?

@LeeBillings: @BoraZ 1 more thing: A press release isn't a story--it should inspire journos to look deeper. Sometimes that deeper look reveals PR to be BS

Me: @BoraZ: @LeeBillings Journal article is reporting findings of investigation. Press release is 2ndary. Journo article is 3tiary. Each diff audience.

@LeeBillings: @BoraZ Glad you raised ? of audience, since relevant to yr ? of "words" & "reality." Words make reality for audiences, some more than others

Me: @BoraZ: Journos investigate people, parse words. Scientists investigate nature. What is more worthy?

@lippard: @BoraZ I would say that there are instances of investigative journalism that have had more value than some instances of scientific research.

Me: @BoraZ: @lippard possible, but that is investigating the rare instances of misconduct by people, not investigating the natural reality. Science?

@john_s_wilkins: @BoraZ You're asking this of a profession that thinks it needs to "give the other side" when reporting on science, i.e., quacks

@LeeBillings: @BoraZ Twitter is useful tool, but probably not best way to interview for the story you seem to be after, as responses lack depth and nuance

@LeeBillings: @BoraZ Still looking forward to reading your resulting story, of course

Me: @BoraZ: @LeeBillings you can add longer responses on FriendFeed: http://friendfeed.com/coturnix that's what it's for

@1seahorse1: @BoraZ Do you mean that I have to be nostalgic about my ape tribe and life in caves ? :-)

@TyeArnett: @BoraZ parsing data can be as dangerous as parsing words sometimes

@ccziv: @BoraZ Do not underestimate or devalue the importance of words, ever.

This shows that different people have very different ideas about what 'investigative reporting' is, and have even more difficulty figuring out how that applies to science! Let's go nice and slow now, and explore this a little bit more.

First, I think that what Dave meant in his first tweet -

@davemunger: @BoraZ To me, it means going beyond looking at a single study to really understand a scientific concept. Diff from traditional "inv. journo"

- is not 'investigative reporting' but 'news analysis' (again, see my attempt at classification), something akin to the 'explainers' occasionally done by the mainstream media (think of This American Life on NPR and their 'Giant Pool of Money' explainer for a great recent example). It is the equivalent of a Review Article in a scientific journal, but aimed at a broader audience and not assuming existing background knowledge and expertise.

The different worlds of journalists and scientists

This discussion, as well as many similar discussions we had in the past, uncovers some interesting differences between the way journalists and scientists think about 'investigative' in the context of reporting.

Journalists, when investigating, investigate people, almost exclusively. Scientists are much more open to including other things under this rubric, as they are interested in investigating the world.

Journalists focus almost entirely on words, i.e., what people say. In other words, they are interested mainly in the process and in what the words reveal as to who is winning and who is losing in some imaginary (or sometimes real) game. Scientists are interested in the results of the process, obtained by any means, only one of which is through people's utterances - they are interested in investigating and uncovering the facts.

Journalists display an inordinate amount of skepticism - even deep cynicism - about anyone's honesty. Everyone's a liar unless proven not to be. Scientists, knowing themselves, knowing their colleagues, knowing the culture of science where 100% honesty and trust are the key, knowing that exposure of even the tiniest dishonesty is likely The End of a scientific career, tend to trust scientists a great deal more. On the other hand, scientists are deeply suspicious of people who do not abide by high standards of the scientific community, and The List of those who, due to track record, should be mistrusted the most is topped by - journalists.

This explains why scientists generally see Futurity.org as an interesting method of providing scientific information to the public, assuming a priori - knowing the track record of these institutions and what kind of reputation is at stake - that most or all of it will be reliable, while a journalist exclaims "propaganda".

The Question of Trust

In this light, it is very instructive to read this post by a young science journalist, and the subsequent FriendFeed discussion of it. It is difficult for people outside of science to understand who is "inside" and thus to be trusted and who is not.

Those on the "inside", the scientists, are already swimming in these waters and know instantly who is to be trusted and who not. Scientists know that Lynn Margolis was outside (untrusted) at first, inside (trusted) later and outside (untrusted) today again. Scientists know that James Lovelock or Deepak Chopra or Rupert Shaldrake are outside, always were and always will be, and are not to be trusted. Journalists can figure this out by asking, but then they need to figure out whose answer to trust! Who is inside and trusted to say who else is inside and trusted? If your first point of entry is the wrong person, all the "sources" you interview will be wackos.

Unfortunately the mistrust by journalists is often 'schematic' - not based on experience or on investigating the actual facts. They have a schema in their minds as to who is likely to lie, who is likely to use weaselly language, who can generally be trusted, etc. They use this rule-of-thumb when interviewing criminals, corrupt cops ("liars"), politicians, lawyers, CEOs ("weaselly words"), other journalists ("trustworthy") and yes, scientists ("suspicious pointy-heads with hard-to-uncover financial motives").

The automatic use of such "rule" is why so many D.C. reporters (so-called Village) did not understand (and some still do not understand) that someone who is supposed to be in the "use weaselly language" column - the politicians - should actually have been in the "lying whenever they open their mouths" column for eight years of the Bush rule (or, to be fair, the last 30 years). It did not occur to them to fact-check what Republicans said and hastily move them to the appropriate "chronic liars" category and report appropriately. They could not fathom that someone like The President would actually straight-out lie. Every sentence. Every day. Nobody likes being shown to be naive, but nobody likes being lied to either. Their need for appearance of savviness (the opposite of naive), for many of them, over-rode the need to reveal they've been lied to and fell for it ("What are you saying? Can't be possible. They are such nice guys when they pat my back at a cocktail party over in The Old Boys Club Cafe - they wouldn't lie to me!"). And many in their audience are in the same mindset - finding it impossible (as that takes courage and humility) to admit to themselves that they were so naive they fell for such lies from such high places (both the ruling party and their loyal stenographers). And we all suffered because of it.

The heavy reliance on such rules or mental schemas by journalists is often due to their self-awareness about the lack of knowledge and expertise on the topic they are covering. They just don't know who to trust, because they are not capable of uncovering the underlying facts and thus figuring out for themselves who is telling the truth and who is lying (not to mention that this would require, gasp, work instead of hanging out at cocktail parties). To cover up this ignorance and make it difficult for the audience to reveal it, they strongly resist the calls to provide links to more information, and especially to their source documents.

Thus He Said She Said journalism is a great way for them to a) focus on words, people, process and 'horse-race' instead of facts, b) hide their ignorance of the underlying facts, and c) show their savvy by "making both sides angry" which, in some sick twist, they think means they are doing a good job (no, it means all readers saw through you and are disgusted by your unprofessionalism). Nowhere does that show as clearly as when they cover science.

A more systematic investigation into 'investigation'

Now that I have raised everyone's ire, let me calm down again and try to use this blog post the way bloggers often do - as a way to clarify thoughts through writing. I am no expert on this topic, but I am interested, I read a lot about it, I blog about it a lot, and I want to hear the responses in the comments. Let me try to systematize what I think 'investigative reporting' is in general and then apply that to three specific cases: 1) a scientist investigating nature and reporting about it in a journal, 2) a journalist investigating scientists and their work and reporting about it in a media outlet, and 3) a science blogger investigating the first two and reporting on how good or bad a job each of them did.

A few months ago, I defined 'investigative journalism' like this:

Investigative reporting is uncovering data and information that does not want to be uncovered.

Let's see how that works in practice.

Steps in Investigative Reporting:

1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.

2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.

3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.

4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it - the entire raw data sets or documents or transcripts) and explaining what it means.

5) That someone then sends the article to the proper venue where it undergoes an editorial process.

6) If accepted for publication, the article gets published.

7) The article gets a life of its own - people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).

Case I: Scientist

1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.

The keeper of the secret information is Nature herself. The researcher can get a hunch about the existence of hidden information in several different ways:

- delving deep into the literature, it becomes apparent that there are holes - missing information that nobody reported on yet, suggesting that nobody uncovered it yet.

- doing research and getting unexpected results points one to the fact that there is missing information needed to explain those funky results.

- going out into nature and observing something that, upon digging through the literature, one finds has not been explained yet.

- getting a photocopy of descriptions of three experiments from the last grant proposal from your PI with the message "Do this". This is a great method for introducing high school and undergraduate students to research, and perhaps for getting a brand new Masters student started (of course, regular discussions of the progress are needed). Unfortunately, some PIs continue doing this to their PhD students and even postdocs, instead of giving them the freedom to be creative.

2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.

The scientific method includes a variety of methods for wresting secret information out of Nature: observations, experiments, brute-force Big Science, natural experiments, statistics, mathematical modeling, etc. It is not easy to get this information from Nature as she resists. One has to be creative and innovative in designing tricks to get reliable data from her.

3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.

All the collected data from a series of observations/experiments are put together, statistically analyzed, visualized (which sometimes leads to additional statistical analyses as visualization may point out phenomena not readily gleaned from raw numbers) and a common theme emerges (if it doesn't - more work needs to be done).

4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it - the entire raw data sets or documents or transcripts) and explaining what it means.

There are three potential audiences for the findings of the research: experts in one's own field, other scientists, and the lay audience (which may include policy-makers, political-action organizations, journalists, teachers, physicians, etc.).

The experts in one's field are the most important audience for most of research. The proper venue to publish for this audience is a scientific journal of a narrow scope (usually a society journal) that is read by all the experts in the same field. The article can be dense, using the technical lingo, containing all the information needed for replication and further testing of the information and should, in principle, contain all the raw data.

The scientific community as a whole is a somewhat baffling target audience - on one hand, some of its members are also experts in the field; on the other hand, all the rest are essentially a lay audience. It is neither-nor. Why target the scientific community as an audience, then? Because the venues for this are GlamourMagz, and publishing in these is good for one's career and fame. The format in which such papers are written is great for scientists in non-related disciplines - it tells a story - but it is extremely frustrating for same-field researchers, as there is not sufficient detail (or data) to replicate, re-test or follow up on the described research. Publishing this way makes you known to a lot more scientists, but tends to alienate your closest colleagues, who are frustrated by the lack of information in your report.

The lay audience is an important audience for some types of research - ones that impact people's personal decisions about their health or about taking care of the environment, ones that can have impact on policy, ones that are useful to know by health care providers or science educators, or ones that are so cool (e.g., new fossils of dinosaurs or, erm...Ida) that they will excite the public about science.

Many scientists are excellent and exciting communicators and can speak directly to the audience (online on blogs/podcasts/videos or offline in public lectures or science cafes), or will gladly agree to do interviews (TV, radio, newspapers, magazines) about their findings. Those researchers who know they are not exciting communicators, or do not like to be in public, or are too busy, or have been burned by previous interactions with the media, tend to leave communication with the lay audience to professionals - the press officers at their institutions.

While we have all screamed every now and then at some blatantly bad press releases (especially the titles imposed by the editors), there has generally been a steady, gradual improvement in their quality over the years. One possible explanation is that scientists who fall out of the pipeline - there are now so many PhDs and so few academic jobs - have started replacing English majors and j-school majors in these positions. More and more institutions now have science-trained press officers who actually understand what they are writing about. Thus, there is less hype yet more and better explanation of the results of scientific investigation. Of course, they tend to be excellent writers as well, a talent that comes with love and practice and does not necessitate a degree in English or Communications.

5) That someone then sends the article to the proper venue where it undergoes an editorial process.

The first draft of the article is usually co-written and co-edited by a number of co-authors who "peer-review" each other during the process. That draft is then (2nd peer-review) usually given to other lab-members, collaborators, friends and colleagues to elicit their opinion. Their feedback is incorporated into the improved draft which is then sent to the appropriate scientific journal where the editor sends the manuscript to anywhere between one and several experts in the field, usually kept anonymous, for the 3rd (and "official") peer-review. This may then go through two or three cycles before the reviewers are satisfied with the edits and changes and recommend to the editor that the paper be published (or not, in which case the whole process gets repeated at lesser and lesser and lesser journals...until the paper is either finally published or abandoned or self-published on a website).

6) If accepted for publication, the article gets published.

Champagne time!

Then, next morning, back to the lab - trying to uncover more information.

7) The article gets a life of its own - people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).

After Nature closely guarded her secrets for billions of years, and after intrepid investigators snatched the secret information from her over weeks, months, years or decades of hard and creative work, the information is finally made public. The publication date is the date of birth for that information, the moment when its life begins. Nobody can predict what kind of life it will have at that point. It takes years to watch it grow and develop and mature and spawn.

People download it and read it, think about it, talk about it, interact with it, blog about it and, most importantly, try to replicate, re-test and follow up on the information in order to uncover even more information.

If that is not 'investigative reporting' at its best, I don't know what is.

Case II: Science Journalist

1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.

The hidden information, in this case, is most likely to be man-made information - documents, human actions, human words. It is especially deemed worthy of investigation if some wrong-doing is suspected.

2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.

As the journalist cannot "go direct" and investigate nature directly (not having the relevant training, expertise, infrastructure, funding, manpower, equipment, etc.), the only remaining method is to investigate indirectly. The usual indirect method for journalists is to ask people - a very, very, very unreliable way of getting information.

Since investigating the facts about nature is outside the scope of expertise of journalists, they usually investigate the behavior and conduct of scientists. This is "investigative meta-science reporting". In a sense, there is not much difference between investigating the potential misconduct of scientists and the misconduct of any other group of people. The main difference is that the business of science is facts about the way the world works; thus, knowing who got the facts right and who got them wrong is important, and knowing who misrepresents lies as facts is even more important.

Unfortunately, due to their lack of scientific expertise, journalists find this kind of investigation very difficult - they have to rely on the statements of scientists as to the veracity of other scientists' facts or claims, something they are not in a position to verify directly. If they ask the wrong person - a quack, for example - they will follow all the wrong leads.

Thus, the usual fall-back is the HeSaidSheSaid model of journalism: reporting who said what, not committing to any side, not evaluating the truth-claims of any side, and hoping that the (also science-uneducated) audience will be able to figure it out for itself.

Since they cannot evaluate the truth-claims about Nature that scientists make, journalists have to use proxy mechanisms to uncover misconduct, e.g., discovering other unseemly behaviors by the same actors, unrelated to the research itself. Thus, discovering instances of lying, or financial ties, is the only way a journalist can start guessing who can be trusted, and then hope that the person who lies about his/her finances is also lying about facts about Nature - a correlation that is hard to prove and is actually quite unlikely, except in rare instances of industry/lobby scientists-for-hire.

The actual research misconduct - fudging data, plagiarism, etc. - can be uncovered only by other scientists. And they do it whenever they suspect it, and they report the findings in various ways. The traditional method of sending a letter to the editor of the journal that published the suspect paper is so ridiculously difficult that many are now pursuing other venues, be it notifying a journalist, going direct on a blog, or, if the journal is enlightened (COI - see my Profile), posting comments on the paper itself.

3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.

Once all the information is gathered in one place, any intelligent person can find the patterns. Scientific expertise is not usually necessary for this step. Thus, once the journalist manages to gather all the information (the hard part), he/she is perfectly capable of figuring out the story (the easy part).

4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it - the entire raw data sets or documents or transcripts) and explaining what it means.

The journalist's advantage: they tend to be good with language and at writing a gripping story. If the underlying information is correct, the conclusions are clear, and the journalist is not afraid to state clearly who is telling the truth and who is lying, the article should be good.

5) That someone then sends the article to the proper venue where it undergoes an editorial process.

The editor who comes up with titles usually screws up this step. Otherwise, especially if nobody cuts out important parts due to length limits, the article should be fine. Hopefully, the venue targets the relevant audience - either experts (who can then police their own) or the general public (who can put pressure on the powers-that-be).

6) If accepted for publication, the article gets published.

Deadline for the next story looms. Back to the grind.

7) The article gets a life of its own - people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).

Now that the information is public, people can spread it around (e-mailing to each other, linking to it on their blogs, social networks, etc.). They bring in their own knowledge and expertise and provide feedback in various venues and some are motivated to follow up and dig deeper, perhaps uncovering more information (so the cycle repeats).

Most of science journalism is, thus, not investigative journalism. Most of it is simple reporting of the findings, i.e., second-hand reporting of the investigative reporting done by scientists (Case I). Or, as science reporters are made so busy by their editors, forced to write story after story in rapid succession, about many different areas of science, most science reporting in the media is actually third-hand reporting: first-hand was by the scientists in journals, second-hand was by the press officers of the institutions, and the journalist mainly regurgitates the press releases. As in every game of Broken Telephone/Chinese Whispers, the first reporter is more reliable than the second one in line, who is more dependable than the third one, and so on. Thus a scientist "going direct" is likely to give a much more reliable account of the findings than the journalist reporting on it.

There are exceptions, of course. Each discussion of science journalism always brings out commenters who shout the names of well-known and highly respected science journalists. The thing is, those people are not science reporters. They are science journalists only in the sense that 'Science Writers' is a subset of the set 'Science Journalists'. This is a subset that is in a very privileged position - they are given the freedom to write what, when, where and how they want. Thus, over many years, they develop their own expertise.

Carl Zimmer has, over the years, read so many papers, talked to so many experts, and written so many books, articles and blogposts, that he probably knows more about evolution, parasites and E.coli than biology PhDs whose focus is on other areas of biology. Eric Roston probably knows more about carbon than many chemistry PhDs. These guys are experts. And they are writers, not reporters. They do not get assignments to write many stories per week on different areas of science. They are not who I am talking about in this post at all.

Do they do investigative reporting? Sometimes they do, but they choose other venues for it. When George Will lied about climate change data in a couple of op-eds, Carl Zimmer used his blog, not the NYTimes Science section, to dig up and expose the facts about the industry and political influences, about George Will's history on the issue, about the cowardly response by the Washington Post to the uncovering of these unpleasant facts, etc.

Rebecca Skloot did investigative journalism as well, over many years, and decided to publish the findings in the form of a book, not in a newspaper or magazine. That is not the work of a beat reporter.

Case III: Science Blogger

1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.

Bloggers often look for blogging material in two distinctly different kinds of sources: the Tables of Contents of scientific journals in the fields they have expertise in, and services that distribute press releases (e.g., EurekAlert, ScienceDaily, etc.). They are also usually quite attuned to the mass media, i.e., they get their news online from many sources instead of reading just the local paper.

What many bloggers do, and are especially good at doing, is comparing the work of the Case I and Case II investigative reporters. They can access, read and understand the scientific paper and directly compare it to the press releases and the media coverage (including the writings of other bloggers). Having the needed scientific expertise, they can evaluate all the sources and make a judgment on their quality.

Sometimes the research in the paper is shoddy but the media does not realize it and presents it as trustworthy. Sometimes the paper is good, but the media gets it wrong (usually in a sensationalist kind of way). Sometimes both the paper and the media get it right (which is not very exciting to blog about).

2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.

Replicating experiments and putting that on the blog is rare (but it has been done). But digging through the published data and comparing it to the media reports is easy when one has the necessary expertise. Consulting with colleagues, on the rare occasions when that is needed, is usually done privately via e-mail or publicly in places like FriendFeed or Twitter, and there is no need to include quotes in the blog post itself.

Bloggers have done investigative digging in the journalistic sense as well - uncovering unseemly behavior by people. I have gathered a few examples of investigative reporting by science bloggers before:

Whose investigative reporting led to the resignation of Deutsch, Bush's NASA censor? Nick Anthis, a (then) small blogger (who also later reported in great detail on the Animal Rights demonstrations and counter-demonstrations in Oxford).

Who blew up the case of plagiarism in dinosaur palaeontology, the so-called Aetogate? A bunch of bloggers.

Who blew up, skewered and castrated the PRISM, the astroturf organization designed to lobby the Senate against the NIH Open Access bill? A bunch of bloggers. The bill passed.

Remember the Tripoli 6?

Who pounced on George Will and WaPo when he trotted out the long-debunked lie about global warming? And forced them to squirm, and respond, and publish two counter-editorials? A bunch of bloggers.

Who dug up all the information, including the most incriminating key evidence against Creationists that was used at the Dover trial? A bunch of bloggers.

And so on, and so on, this was just scratching the surface with the most famous stories.

3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.

This is often a collective effort of multiple bloggers.

4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it - the entire raw data sets or documents or transcripts) and explaining what it means.

The target audience of most science blogs is lay audience, but many of the readers are themselves scientists as well.

5) That someone then sends the article to the proper venue where it undergoes an editorial process.

Most blogs are self-edited. Sending a particularly 'hot' blog post to a couple of other bloggers asking their opinion before it is posted is something that a blogger may occasionally do.

6) If accepted for publication, the article gets published.

Click "Post". That easy.

7) The article gets a life of its own - people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).

Feedback in comments usually comes in really fast! It is direct, straightforward and does not follow the usual formal kabuki dance that ensures the control and hierarchy remains intact in more official venues.

Other bloggers may respond on their own blogs (especially if they disagree) or spread the link on social networks (especially if they agree).

If many bloggers raise hell about some misconduct and persist over a prolonged period of time, this sometimes forces the corporate media to pick up the hot-potato story despite their initial reluctance to do so. But this applies to all investigative reporting on blogs, not just science.

Also, bloggers are not bound by 20th-century journalistic rules - thus exposure by impersonation, like what the conservative activists did to ACORN, is a perfectly legitimate way of uncovering dirt in informal venues, but not legit in corporate venues.

One more point that needs to be made here. Different areas of science are different!

Biomedical science is a special case. It is huge. It has huge funding compared to other areas, yet not enough to feed the armies of researchers involved in it. It attracts the self-aggrandizing type disproportionately. Much is at stake: patents, contracts with the pharmaceutical industry, money, fame, Nobel prizes... Thus it is extremely competitive. It also uses laboratory techniques that are universal and fast, so it is easy to scoop and get scooped, which fosters a culture of secrecy. It suffers from CNS disease (the necessity to publish in GlamourMagz like Cell, Nature and Science). It gets an inordinate proportion of media (and blog) attention due to its relevance to human health. All those pressures make the motivation to fudge data too strong for some of the people involved - very few, for sure, out of the tens of thousands involved.

On the other end of the spectrum is, for example, palaeontology. Very few people can be palaeontologists - not enough positions and not enough money. There is near-zero risk of getting scooped as everyone knows who dug what out, where, during which digging season (Aetogate, linked above, was a special case of a person using a position of power to mainly scoop powerless students). Your fossil is yours. The resources are extremely limited and so much depends on luck. Discovering a cool fossil is not easy and if you get your hands on one, you have to milk it for all it's worth. You will publish not one but a series of papers. First paper is a brief announcement of the finding with a superficial description, the second is a detailed description, the third is the phylogenetic analysis, the fourth focuses on one part of the fossil that can say something new about evolution, etc. And you hope that all of this will become well-known to the general public. The palaeo community is so small, they all already know. They will quibble forever with you over the methodology and conclusions (so many assumptions have to go into methods that analyze old, broken bones). It is the lay audience that needs to be reached, by any means necessary. Many paleontologists don't even work as university professors but are associated with museums, nature magazines, or are freelancing. The pressure to publish in GlamourMagz is there only as a means to get the attention of the media, not to impress colleagues or rise in careers.

Most of science and most scientists, on the other hand, do not belong to one of these two fields and do not work at high-pressure universities. They do science out of their own curiosity, feel no pressure to publish a lot or in GlamourMagz, do not fear scooping, are open and relaxed and have no motivation to fudge data or plagiarize. They know that the reputation with their peers - the only reputation they can hope to get - is dependent entirely on immaculate work and behavior. Why keep them suspect because two media-prominent sub-sub-disciplines sometimes produce less-than-honest behavior? Why not trust that their papers are good, their press releases correct, their blogging honest, and their personal behavior impeccable? I'd say they are presumed innocent unless proven guilty, not the other way around.

I'd like to see an equivalent of Futurity.org for state universities and small colleges. What a delightful source of cool science that would be!

Update: blogging at its best. After a couple of hit-and-run curmudgeonly comments posted early on, this post started receiving some very thoughtful and useful comments (e.g., especially one by David Dobbs) that are edifying and are helping me learn - which is the point of blogging in the first place, isn't it?

Comments

Journalists display an inordinate amount of skepticism - even deep cynicism - about anyone's honesty.

Unless the person talking to the journalist is (1) a piece-of-shit right-wing scumbag shill or (2) an anonymous government official.

Comrade PhysioProf: correct.

Also interesting to think about what people are paid to do. Scientist is paid to do the investigation and not to write/publish (may even have to pay to publish). Journalist is paid to write/publish. Scientist is valued by the quality of investigation. Journalist is valued by the ability to turn in sufficient inches of text, decently readable, before the deadline. Scientist's career is dependent on the quality of investigation, as judged by the response to publication (citations, etc.). Journalist's career is dependent on ability to churn out text at superhuman speed.

The idea that scientists are somehow involved in something more lofty is bullshit. Faking in science is both prevalent and often either undetectable or unchallengeable.

Look at what it took to unmask Wakefield's claims: derived from anonymous data, hiding behind medical confidentiality and legal privilege.

At least if a journalist publishes a set of facts, others are almost always capable of checking them. For example, does the interviewee exist? Did they really say this?

By Just passing (not verified) on 27 Sep 2009 #permalink

Very thoughtful and thought-provoking, Bora! I agree with most of what you say, and would like to say a special amen to your closing thought: I too hope we see new portals like Futurity that provide outlets for all the other universities too.

What nonsense this all is.

I must start by stating the obvious, which you have for some reason overlooked: There are many, many science journalists out there with science degrees - including doctoral degrees. There are quite a few science journalists who understand science, the process of science, the history of science and yes, the politics of science, from the inside out. Presumably this is an inconvenient truth for you.

But the real trouble here is this. In this age of self-righteous blogging, fewer and fewer people see the significance of neutrality when discussing a story of any kind, and it's rather frightening; the effects of this are everywhere. The first thing *any* responsible reporter OR academic should do when investigating first-hand accounts of a story is consider the source. This is simply proper procedure. To denigrate this is to be anti-academic, and I find that terrifying, coming from a presumed academic. Yikes.

Investigative science journalists must navigate the often-confusing, perennially shifting waters where politics and science collide. Of course scientists can be corrupt and can falsify data. It's not just about funding; reputations are at stake. Does that mean *all* scientists are corrupt? Well, of course not. Yet any responsible person would seek to check out someone's story and verify the facts - of course they would! What is this world coming to when any intelligent person would even question that?

And finally, I will add this. It's the height of arrogance to assume that because you are educated in one field - chronobiology - you will automatically be competent in another, whether it's journalism, 19th-century French literature or astrophysics.

The idea behind actual journalism is that it's outward- rather than inward-focused. It's about the reader rather than the writer. It's intended to give people access to knowledge that they would otherwise feel closed-off from - most often, in the case of science, because they find it intimidating or confusing and a bit of an alien world. There's no doubt that having a scientific background is beneficial in many cases. But as I said at the beginning - many journalists do these days. What's equally helpful - the part that you don't seem to want to acknowledge - is knowing how to write about it, too. Frankly, most bloggers, including scientist bloggers, just don't know how to do it and are too arrogant to think they should bother to learn.

Both art and skill are required for truly good writing and good journalism. Your post, in fact, highlights this point and throws it into sharp relief. Truth be told, I couldn't finish the whole endless post, because it's in desperate need of an editor.

Journalism, done well, involves being aware not only of those terribly important thoughts rattling around inside your own head, but also of what will best serve your readers. If you were having a conversation with someone, would you really just talk and talk and talk at them without even considering whether they were following you, without considering their reactions, without thinking about whether you were making any sense? Very likely not; one-on-one conversation keeps you honest. So do editors. But thousands upon thousands of words just to try to say that blogging's better than mainstream media, because you think so? Eh. No. Not good enough.

By Anonymous (not verified) on 28 Sep 2009 #permalink

Anonymous curmudgeon cannot read. Or just decided not to read this post, or anything linked in it.

Everything in the comment is either a) included in the post already, or b) a big point of some of the linked articles, or c) truly bad idea.

Move on...

Anonymous curmudgeon cannot read? But he's right on one thing: your post is far too long for the slim demonstration you're doing (investigative journalism is disappearing, and scientists or bloggers can replace it). And if the only people who read it from beginning to end are the ones who agree with you, doesn't that tell us something about the future of science blogging? This, we should worry about.

I never said that "scientists or bloggers can replace it". I have never said it, because I don't believe it. And the point of this post has nothing to do with that topic at all.

It is the preconceived notions of curmudgeons who come and write hit-and-run comments on posts they have not read (but which, they assume, must mean someone is claiming they are worthless) that make that claim, not me.

I never said that "scientists or bloggers can replace it". I have never said it, because I don't believe it. And the point of this post has nothing to do with that topic at all.

Well, Bora, believe it or not, but your post looks that way. You seem to bring your reader in this direction all along. Good for you that you didn't think so.

It may look that way for people actively seeking gravedancers. This post is a response to: a) violent response by some journalists to the mere idea of Futurity.org (note I do not make any judgments of Futurity.org myself and have inserted a number of caveats), b) idea that publishing in peer-reviewed journals is not journalism, c) equating science beat reporters with science writers, d) notions that only investigative journalism is 'real' journalism, e) notions that MSM actually does investigative journalism, f) notions that bloggers do not and cannot do investigative journalism.

There is nothing in it about bloggers "replacing" journalists (gah, what a broad and unspecific notion that does not even define its terms, New way or Old way! Which I addressed many times before, e.g., this and this).

A lot of that knee-jerk reaction by traditional journalists comes from existential fears of losing jobs. Understandable, but does not in any way help the discussion. And it always, maddeningly, leads to discussions of "business models" which I absolutely do not care about.

It may look that way for people actively seeking gravedancers.

Not only them. It may look that way to people who simply have limited time to read this post, and hundreds of other posts, and who already have preconceived ideas about you or this debate before coming to this post.

I know, having preconceived ideas is a very bad thing. But this is also very human, and some types of communication are more efficient than others at fighting preconceived ideas. Case in point: this post. Maybe your goal with this very long post was not to fight preconceived ideas. Maybe you didn't think that those journalists who are fearful of Futurity would read this post from beginning to end and say "oh wow, I never thought of that". But if that was one of your hopes, well, some of the points made by "Anonymous" make some sense and you should analyse them with more attention.

As I stated, I was going to "...use this blog post the way bloggers often do - as a way to clarify thoughts through writing." Free-wheeling, out-loud thinking, asking for feedback. If it's too long to read, don't comment.

Hi Bora,

A few cents worth: Bora, I know you like to work fast, but in a few spots here you lose me. My larger take is that you have some good points about practices and perpsectives relevant to good and/or investigative journalism/writing â but they get lost in fuzzy distinctions between scientists who write, journalists, and science bloggers. There are distinctions that don't hold, generalizations that conflict with other generalizations, categories that don't hold up. You write, for instance, that journalists display an inordinate amount of skepticism about people's honesty; yet if memory serves me well, you also feel journalists are guilty of stenography (I'd agree). I think this gets at a deeper problem -- an understandable one, but one that seems to undermine much of this -- that your terms like "journalists" are so baggy that they can't serve well here to draw the distinctions and lines you're trying to. For example, Do investigative journalists focus mostly on people? I'd say they more often focus on institutions and actions (though they may use a person to reveal/tell the story). You say journalists (I take you to mean journalists in general here) focus on words and processes, not outcome; I'd argue that in fact too many science journalists (or journalists writing about science) obsess about outcomes and ultimate implications and the end results of lines of research, pushing studies constantly in that direction even if it means getting ahead of things. (And are pushed to do so by editors who want that for a story. Pitch a story about an advance in biomedical science, and usually the question will come explicitly or implicitly: Will this lead to a cure/paradigm shift/similar marked end result?) It's not that these things aren't true; it's that they are true, but so are the opposite, for journalists, like scientists, are a very mixed bunch. I appreciate you're trying to delineate major lines and main trends here. But the strokes are so broad they fail to accurately trace or outline the reality This affects your contrast between the 'investigative reporting at its best,' as done by the scientist-journalist-blogger, and the not-so-good work done by the science journalist. Again, I know you're trying to draw broad strokes; but they're too broad, there are too many exceptions. To follow your argument by the steps you lay out. First, let's look at the contrasts you draw between the scientist writing and the journalist working. Steps 1 and 2, id'ing and digging up the story: The scientist excels because of heightened instincts and knowledge; the journalists falls short because of an inability to 'go direct' and understand the science forces him or her to rely on documents, second-hand opinions, and the conduct of people; to indulge in HeSaidSheSaid; to fail to evaluate truth claims; and to not know fudge when they see it. I don't buy it. Scientists are as likely to be led by the nose as journalists and as hesitant -- perhaps more so -- to blow the whistle or pull the covers off something big. Step 3, finding patterns/significant in the information gathered: This you call even, as discerning the pattern is easy, anyone can do it. Yet we all know that many patterns and assumptions are ignored by inured insiders but perceived quickly by outsiders. Step 4, writing the thing: Advantage journo. Let's hope so! Half-kidding. Lots of scientists writing very well out there. But I won't bicker with you on this one. Step 5, sending to edit You make that part sound pretty easy. 
For really substantial investigative reporting, the perils at this stage for the journalist telling it straight are probably equal to that of the scientist going against the grain at established journals. Step 6, article gets published. You rightly give it a draw. Everybody gotta git back to work. Step 7, article gets life of its own (or not) I guess not a whole lot of difference there. I know what comes next: You point out that you're speaking of most journalists, not the exceptions ... Fair as far as it goes. But I still think you're painting too broadly. The exception is the journalist or the scientist that does any contrarian investigative reporting at all; forget good v bad. Investigative, truly critical inquiry is always the exception, almost by definition. At this points you draw a distinction between the science reporters and the science writers (so are both these subsets of sci journos?) -- the former being mainly stenographers and the latter being thoughtful, penetrating, informed people who can also write well. Again, good as far as it goes; but the distinction feels a little forced. And the main example you cite â Carl's delightful take-down of George Will (who should stick to baseball) â is not really investigative journalism even in the way you outlined yourself further up: Zimmer's not spying a hidden story here, but calling out a loudmouth and mounting a lovely and devastating counterattack. True, he's exercising an impulse similar to that driving investigative inquiry -- a strong bullshit detector and a desire to make sure the real story gets told. But it's not the peel-the-onion sort of thing that I think sits at the center of any solid definition of investigative journalism. And my guess is Carl would agree that you're ignoring many substantial and deeply useful examples of investigative sci journalism done by MSM people like Ben Carey and Gardiner Harris at the Times, or Shannon Brownlee and Jeanne Lenzer, who â like blogger Philip Dawdy â have done classic investigative work to pull the sheets of vast corporate, medical, and scientific chicanery. I feel the ground get a bit more shaky yet when you turn to the real heros of your piece, the science bloggeres, where it seems to me you use the overly broad strokes already applied (with the broadest being applied to "science journalists" who are then broken post-facto into reporters (bad) v writers (good), with the latter being set aside as exceptions to the rule) to falsely divide the methods used by even good investigative science journalists in their 7 steps (esp 1-3) from the methods you say are used by science bloggers. I think here too you're too heavily attributing particular methods to particular groups; when the difference is not so much between groups as between good and bad practices. Many and possibly most science bloggers are stenographers, too. Good ones are rare; really good ones even rarer (and thus a great treat). And really good ones who do real investigative work even more rare â just as in sci journos. Likewise, the methods you attribute to the Sci Blogger investigative journalists -- SBIJs, for short -- are pretty much the same as those used by the SWIJs -- the Science Writer Investigative Journalists. As so: 1. The hunch, the wiff, the tip, the tuition, the insight built from experience and knowledge. Evident in both good journos and good scientist-bloggers. 
2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public. Precisely what good SWIJs do.

3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means. This is often a collective effort of multiple bloggers. Journalists do this too, sometimes collaboratively.

4) That someone then writes an article, with a specific audience in mind, showing the public the previously secret information (often including all of it - the entire raw data sets or documents or transcripts) and explaining what it means. This is a growing trend in blogs, in the MSM, and among the journalists-formerly-known-as-MSMers.

The take-home? Only a minority of the people who write about science, whether scientists, journalists, or bloggers, exercise the combination of attitude, perspective, and skills that creates good investigative writing. I don't think any of these three classes enjoys a big edge here. I find myself circling back to a point you yourself have made quite a few times: the distinction between blogs and journalism is increasingly less illuminating and useful. I think that's the case here. That said, you do make useful distinctions here about practices. But I think you're too ready to attribute good practices too broadly to science bloggers and bad practices too broadly to science journalists (except for the good ones, who are the science writers, and who presumably don't count as journalists, or something ...).

As to Futurity.org: I have significant misgivings. Most university research PIOs write good, solid synopses and explanations of their institutions' research, and many have a good sense of story. But it's not their business -- in fact, it's usually quite explicitly not part of their job -- to question the assumptions behind the research, the broader contexts in which the research takes place or in which the results are used, the various and subtle types and effects of conflicts of interest, and so on. To me, those issues are vital, ignoring them gets us into all sorts of trouble, and examining them is precisely the job of the investo-geek. Anyone (rightly) leery of churnalism and press-release stenography should be leery of Futurity, which is essentially a highly organized PIO operation. The commenters at Metafilter get it right.

I think you draw some useful distinctions about practice. But the distinctions between practitioners seem to me a bit forced. With luck, we can discuss this more thoroughly -- and doubtless even more lucidly -- over beers at SO 2010.

I don't understand why journalists don't trust scientists. Most scientists are more than happy to talk (incomprehensibly and at great length) about their research.

I do understand why scientists can occasionally be distrustful of journalists, best explained by the following comic:
http://www.phdcomics.com/comics/archive.php?comicid=1174
or by any of numerous cases of science journalism gone horribly bad.

Robert, Cham is explicitly including institutional PR offices and press releases as part of the problem, the debate over which (in relation to futurity.org) is how this post got started.

He's also skeptical about how much these press releases actually hedge their headlines, Bora's insistence that they are improving notwithstanding. A quick look at Futurity (or EurekaAlert) will show a distinct dearth of "probably" and "under certain circumstances."

But this is only to say that PR and science don't have totally overlapping goals, a statement that I didn't think was even controversial, much less indicative of a "very anti-science sentiment." Is Cham guilty of that as well? Am I?

Bora, I also missed the part where Alexis' (actual quote) "university PR shouldn't get a free pass" became (your words) "scientists are never to be trusted." The first implies only neutrality, the second an explicitly adversarial stance that has never been my experience as a science journalist.

Indeed, I feel science journalists are more commonly criticized for being too trusting, re-writing press releases without actually speaking to the researchers involved. As you say, third-hand reporting.

Many a story is still killed after those phone calls are made, when the scientist (usually kindly) informs the journalist that the press release about their work has overhyped or misrepresented its import. Whether this was intentional or unintentional is beside the point.

In any case, I do appreciate these posts, Bora. Certainly provocative food for thought. But I do think you may be conflating mistrust of PR with mistrust of scientists themselves.

I think if science journalists want to come forward here as the defenders of the public from hype, PR, and spin, they've got a long, long way to go to convince scientists they're capable of this. We're all in agreement that university press releases need some careful scrutiny before being reported on, but expecting journalists to provide this feels to me like expecting hedge funds to self-regulate. What I mean, and perhaps what Bora was getting at, is that, in my experience, far more distortion happens after the scientists have done their investigation and reporting than at any other point in the dissemination of the information.

Alexis is exactly right that "university PR shouldn't get a free pass," and to his credit he's less guilty of this than many others in his profession. But the very idea that being reported on lends credibility to a finding is one of those things that is true in theory, and may have worked in the past, but presently fails, and fails badly, in practice.

Wow, I'm totally late to the game on this one, but glad to have read this post. I found Dobbs' comments the most useful here, and I found myself agreeing with him a lot. I don't buy the assertion that scientists investigating nature are doing investigative journalism. It's a completely different process and audience. But I also think that parsing this out depends in large part upon how you define investigative journalism, and even how "journalism" itself is defined. We live in an age of mushy interdisciplinary definitions.

It seems most science journalism today falls on the explanatory side: simply explaining what was done, why it matters, and what other qualified scientists think of it (putting it in context within the researcher's field). There is investigative science journalism going on, but less of it than we need, especially in biomedical and pharma coverage. And that is hardly the journalist's fault. It's the economics of the business driving publishers to lay off specialty reporters.

I know that you generally don't like the established media process, but when done right it does offer something that blogging does not -- an editorial process. Editors are often left out of conversations about journalism today, but a good editor will help shape a story and keep it focused and tight. (Then again, a bad editor can totally ruin it.) Time and again, I am thankful to have a good strong edit of my work because it forces me to think about the story/issue at hand well beyond my own perspective. Thanks for provoking discussion on this; I think both fields (science and journalism) need to keep talking about it.