I Unwittingly Manipulate A Citation Index

The list of misdemeanours that identifies an Open Access science journal as predatory and not bona fide is long. One of them is attempts on the part of the publisher and editors to manipulate the journal's citation index, for instance by demanding that authors cite earlier work published in the same journal. If many scholars cite papers in a given journal, then that journal's index improves -- even if the citing only goes on inside the covers of the journal itself.
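For readers who rarely deal with these numbers: the most widely quoted index, the two-year journal impact factor, is just citations divided by citable items, and a citation from inside the journal counts exactly like one from outside. A minimal sketch with invented numbers (not any real journal's figures):

```python
def impact_factor(citations, citable_items):
    """Two-year impact factor: citations received this year to papers from
    the two previous years, divided by the number of citable papers
    published in those two years."""
    return citations / citable_items

# Invented numbers for a small journal:
external = 40    # citations from other journals
internal = 25    # citations from the journal to its own earlier papers
papers = 50      # citable papers published in the two previous years

print(impact_factor(external, papers))             # 0.8
print(impact_factor(external + internal, papers))  # 1.3
```

The point of the sketch is only that pushing authors towards in-journal citations raises the ratio just as effectively as attracting genuine outside interest does.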

When I first read about this criterion I was a little embarrassed, because I do that all the time when editing Fornvännen. I don't demand that authors cite earlier papers in our journal, but I often suggest that they should, because it's part of my job as editor to make sure that authors acknowledge the latest relevant work in their fields. Still, ours is not a predatory operation.

To start with, few scholars in the Scandinavian humanities pay any attention to citation indices. Ours aren't global fields of inquiry such as those covered by Nature and Science. I have no idea what Fornvännen's citation index is and I don't know how to find out. Our authors wouldn't even notice if our citation index improved due to shenanigans.

Secondly, the number of journals in our fields is tiny. We're not one of a hundred journals competing for the same papers. Thirdly, we practice green Open Access, so we don't make any money off authors -- or at all, actually. And fourthly and most importantly, Fornvännen is in its 109th year of uninterrupted publication and has no need to reinforce its brand. Within a regionally delimited field in the humanities, for us to try to manipulate our citation index would be as pointless as Science or Nature doing it.


There is a fine distinction here. Insisting that authors cite relevant papers on the subject, whether those appeared in your journal or in some other publisher's, is reasonable. The problem is journals that push authors to cite papers specifically from that journal, whether or not they are relevant. The trick is to distinguish between the two cases.

By Eric Lund (not verified) on 06 Feb 2014 #permalink

I am bloody tired of the American glorification of the Citation Index. In some subfields there is little if any positive correlation between the number of journal papers that cite a paper and the number of people who may directly or indirectly make use of the paper's data or conclusions out in the real world. Valuing the former while ignoring the latter turns science into a form of mental, let's say, self-gratification. And let's not even talk about the ridiculousness of judging people's worth as scientists based upon how many citations are received by average (i.e., OTHER) authors in the journals in which they publish. That's like saying my intellectual quality should be rated higher because I once sat next to a [well-known field-specific] prize winner at dinner.

There's also the well-known sociological factor that a scholar with power over academic resources will get cited more than others.

I don't understand the point of journal ranks or citation indices either, nor of rewarding people for publishing in famous journals. These days it is easy enough to get a copy of most things, so I think anything that sounds interesting and has passed peer review is worth glancing at. But I know that university administrators want a way to measure research productivity, and they think that just counting peer-reviewed publications is too simple.

The trouble with citation indices is that they measure certain things, which are not necessarily what you want or need to know. A citation is a citation, whether the citing author is praising or criticizing the work in question (some years ago, a senior scientist in my field confessed that his then most-cited paper got most of its citations from a rival who claimed that his paper was incorrect). But because they actually measure something, bean counters tend to use them, even for comparison between fields -- a purpose for which they are worse than useless, because citation practices differ between fields, and even between subfields within a field.

By Eric Lund (not verified) on 09 Feb 2014 #permalink

Here is a link to a well-known fail, at least in the field of crystallography:

www.niscair.res.in/jinfo/ALIS/ALIS 58(1) (Correspondence).pdf

What I found especially embarrassing at the time was how primitive widely-used citation indices are, if a single paper is enough to make them fail.
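To illustrate the arithmetic: an impact factor is essentially a mean, so one paper cited thousands of times can swamp everything else a journal publishes in the same window. A sketch with invented numbers, not the real figures from the crystallography case:

```python
# Invented numbers only -- not the real figures from the case linked above.
ordinary_papers = [2, 0, 3, 1, 2] * 40   # 200 papers with a handful of citations each
outlier = [6000]                         # one methods paper cited by nearly everyone

def mean_citations(counts):
    """Average citations per paper -- the basic shape of an impact factor."""
    return sum(counts) / len(counts)

print(round(mean_citations(ordinary_papers), 2))            # 1.6
print(round(mean_citations(ordinary_papers + outlier), 2))  # 31.44
```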

By Ulf Lorenz (not verified) on 11 Feb 2014 #permalink

Damn it, the spaces should have been protected. In any case, they are part of the URL. Otherwise, a Google search for "impact factor acta cryst" will also turn up relevant entries.

By Ulf Lorenz (not verified) on 11 Feb 2014 #permalink