Thursday, July 26, 2012
Record number of journals banned for boosting impact factor with self-citations
More research journals than ever are boosting their impact factors by self-citation.
Every year, Thomson Reuters, the firm that publishes the impact-factor rankings, takes action against the most extreme offenders by banning them from the latest lists. It lets them in again, suitably chastened, a couple of years later.
And this year, the apparent game playing has reached an all-time high. Thomson Reuters has excluded 51 journals from its 2011 list, published yesterday; 28 of the banned are new offenders, says Marie McVeigh, director of the firm’s annual Journal Citation Reports (JCR), and the others remain blacklisted from last year. The full list is available here for subscribers to JCR.
That’s a substantial increase on previous years: 34 journals were excluded from the 2010 list, compared with 26 in 2009, 20 in 2008 and just 9 in 2007.
Almost all of those banned are excluded because of excessive self-citation, although three journals — Cell Transplantation, Medical Science Monitor and The Scientific World Journal — apparently worked together to cite each other and thus raise impact factors. That “cartel” was originally reported by Phil Davis on The Scholarly Kitchen, and he has today posted a follow-up article on that ban. McVeigh says that this incident, which she calls “an anomaly in citation stacking”, is the only one of its kind that she has found.
To put all this in context, the 2011 ranking includes more than 10,500 journals, so the removed offenders make up less than 0.5% of the total. Still, Thomson Reuters could have kicked out more journals. Its own statistics indicate that 140 journals have had self-citations making up more than 70% of their total citations in the past two years. By comparison, four-fifths of journals keep this proportion below 30%.
But McVeigh explains that some cases of disproportionate self-citation go uncensured when a journal’s citation counts are too small to meaningfully affect the impact-factor rankings. “Before we suppress a journal we set thresholds extremely high, because we don’t want to get into the business of policing minor behaviour,” she says. “We only suppress a journal when we think that this numerically significantly alters the ranking category”. The firm does not ascribe motives to the anomalous citation patterns it sees, she adds.
One of the newly banned journals, the Journal of Biomolecular Structure and Dynamics (JBSD), boosted its impact factor from 1.1 in 2009 to 5.0 in 2010. Because of that sharp rise, last July Thomson Reuters invited it to explain its success in a ‘Rising Stars’ analysis.
In his Rising Stars response, editor-in-chief Ramaswamy Sarma told the firm that his journal’s high citation rate was due to two main factors. First, after a controversial paper on protein folding, 31 laboratories had written commentaries on the work that were published in the JBSD February 2011 issue, producing a large number of JBSD citations. Second, and more importantly, since 2009 JBSD had encouraged authors to show how their papers related to other recent JBSD publications, to enhance the education of doctoral students reading the journal.
The Rising Stars article was published in September but removed within a week, Sarma says, after Thomson Reuters contacted him about the problem of self-citations.
When Nature contacted Sarma last July to ask about his journal’s self-citation rate (after a tip-off from the blog BioChemical Matters), he added that he did not follow the Thomson Reuters impact-factor statistics, preferring instead to look at usage statistics from Google Analytics, which the journal displays on its old website. (According to a June 2012 announcement, the journal is no longer published by Adenine Press and has moved to Taylor & Francis.)
Today, Sarma says that the journal has discontinued its policy of encouraging self-citation for the sake of student education, and that its self-citation rate will return to an acceptable range. The journal wants to get back into the Thomson Reuters JCR lists, he says.
“In a time when the discipline is fragmented by so many different journals, if one wants to run a viable and useful academic journal, covering both controversy and doctoral training, then continuing self-citation is unavoidable. And you say some scientists frown on it. An easy solution is for Thomson Reuters to publish an impact factor with self-citation only, and another one without self-citation to satisfy the unhappy people,” he wrote last July — a sentiment that he still holds.
Three years ago, Thomson Reuters did start publishing impact factors with and without self-citations. But in February, Allen Wilhite, an economist at the University of Alabama in Huntsville, suggested that the firm remove journal self-citations from its impact-factor calculation altogether, to remove any incentive for editors to accrue them. His plea came after he published an article in Science reporting that one in five academics in a variety of social-science and business fields said they had been asked to pad their papers with superfluous references to get published. But, as McVeigh told me at the time, Thomson Reuters feels this would be a “fundamental change to the impact-factor calculation, to ‘respond to something that has not shown itself to be a huge problem’.”
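The with-and-without figures amount to a simple adjustment of the standard two-year ratio: citations received this year to items published in the previous two years, divided by the number of citable items in those years, optionally subtracting the journal's citations to itself. A minimal sketch of that arithmetic, with all numbers purely illustrative (loosely echoing JBSD's reported 1.1-to-5.0 jump) and the function name my own, not Thomson Reuters' actual method:

```python
def impact_factor(citations, citable_items, self_citations=0, exclude_self=False):
    """Two-year impact factor: citations in the JCR year to items published in
    the previous two years, divided by the citable items in those two years.
    Optionally subtracts the journal's citations to its own articles."""
    counted = citations - self_citations if exclude_self else citations
    return counted / citable_items

# Illustrative inputs only -- not real JBSD data.
total_cites = 500   # citations this year to the two prior years
self_cites = 390    # of which, citations from the journal to itself
items = 100         # citable items published in the two prior years

print(impact_factor(total_cites, items))                            # with self-citations
print(impact_factor(total_cites, items, self_cites, exclude_self=True))  # without
```

On these made-up numbers the two figures are 5.0 and 1.1, which shows why a heavily self-citing journal looks very different under the two calculations.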
Will the trend of ever-greater self-citation by journals continue? That perhaps depends on the importance placed on the impact factor itself — a number that many researchers feel scientists and research funders would do well to ignore.