Science Intelligence and InfoPros

Little things about Scientific Watch and Information Professionals

Abuses of bibliometrics and the slavish adoption of Impact Factor


A nice piece by the editors of Reproductive BioMedicine Online:

“(…) with this easy access to databases and papers come problems: notably the increased risk of deliberate or accidental plagiarism (…) and the fact of information overload. This latter problem has resulted in what can be seen as an abreaction: the entrenchment of the ‘prestige journal’ into which a young scientist must get their paper come what may – much of the data incomprehensibly compacted, and figures often too small or cropped to be of evidential value. These ‘prestige’ journals build and thrive financially through the increased significance of a development complementary to bibliographic databases: the bibliometric analysis.

Bibliometric analyses attempt to measure the impact of a journal’s published material that can then reinforce its prestige – or its incentive to play the ‘bibliometric boosting game’. Several such metrics are around, each with their own characteristic strengths and weaknesses. And, like the journals that these metrics claim to rank, the various metrics have acquired their own ‘prestige’ value: although on what basis, other than historical longevity, is unclear. Thus, the most sought (and feared) metric is the Impact Factor or IF (…)

A fundamental issue is that citation indices assume that if a paper is cited it is because it is useful. In reality, papers are cited for many reasons. For example, negative citations dispute results or theories by citing a paper critically. Other papers appear in citation lists simply because they have been cited previously rather than actually read – a practice facilitated by the very electronic publication that boosts the rise of metrics. This practice will tend to be self-reinforcing – squeezing out more pertinent or ‘better’ papers, and even propagating ‘myths’. Self-citation can boost one’s own IF as well as one’s own ego. We suspect that it is a rare author who could honestly claim to have generated anew each reference list – and indeed have read every paper afresh each time it is cited. (…)

All of this might not matter were it not for the recent bureaucratic obsession of institutions, funding bodies and government bodies with ‘objective metrics’ for ranking ‘performance’. This obsession has led to the increasing abuse of metrics as a surrogate for the scholarly value of work. Individual students, researchers and journal editors then are pressured to collude with this value system to make metrics in general, and the IF in particular, tyrannical despots that do few of us much good and distort publishing and citation practices. (…)

The IF, despite its flaws, seems here to stay for the foreseeable future, but the range of alternative metrics described above is available to us as editors. For this reason, we have decided with our publisher that henceforth from the July issue the journal will publish our data for the following metrics: the Impact Factor, the Scimago Journal Rank, the Source Normalized Impact per Paper, the Eigenfactor and the H-index (…). We are implementing this policy to encourage critical thought and discussion about metrics and to discourage the slavish adoption of IF as the only valid way in which to assess ‘quality’.”

Martin H. Johnson, Jacques Cohen, Gedis Grudzinskas. The uses and abuses of bibliometrics. Reproductive BioMedicine Online, 2011, Vol. 24, pp. 485-486.
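For readers who have never seen these metrics spelled out, here is a minimal sketch of two of the measures the editorial names – the two-year Impact Factor and the h-index. The numbers below are hypothetical, chosen purely for illustration:

```python
def impact_factor(citations, citable_items):
    """Two-year Journal Impact Factor: citations received in year Y
    to material published in years Y-1 and Y-2, divided by the number
    of citable items the journal published in those two years."""
    return citations / citable_items


def h_index(citation_counts):
    """h-index: the largest h such that h papers each have
    at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h


# Hypothetical journal: 600 citations in 2011 to its 200 citable
# items from 2009-2010.
print(impact_factor(600, 200))        # 3.0

# Hypothetical author with five papers and these citation counts:
print(h_index([10, 8, 5, 4, 3]))      # 4
```

Note how crude both measures are: neither distinguishes a supportive citation from a critical one, nor a citation that was read from one that was merely copied from another reference list – exactly the weaknesses the editorial describes.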


Written by hbasset

May 11, 2012 at 4:42 pm

Posted in Journals

