Thursday, July 26, 2012
Job Squeeze Vexes Postdocs
Every six months, Cara Altimus, Ph.D., asks herself: Am I making
progress? Is what I’m doing relevant? Am I likely to get a job one day?
Eighteen months into a five-year postdoctoral position in the lab of David Foster, Ph.D., assistant professor of neuroscience at Johns Hopkins University School of Medicine, Dr. Altimus has reason for concern. Postdocs are finding it harder to land academic positions, forcing them to take multiple postdocs or to leave academia for just-as-hard-to-find industry jobs.
Dr. Altimus’ research examines reactivation of previous spatial experience, believed to be a memory trace, by recording large populations of neurons in the rodent brain. She isn’t actively seeking an industry job and isn’t open to doing a second postdoc.
“I try to keep myself open to other opportunities and continue networking, so that I at least know what jobs other Ph.D.s are getting, and keeping up with these people,” Dr. Altimus told GEN.
Dr. Altimus is among thousands of biopharma postdocs uneasy about their post-postdoc futures—35,000 as of 2009, according to NSF. Such data are often several years old, as NIH’s Biomedical Research Workforce Working Group found to its frustration.
In a report issued June 14, the panel called in part for NIH-funded institutions to collect data on the career outcomes of grad students and postdocs. The task force also recommended shifting NIH postdoc funding from research project grants to training grants and fellowships, and allowing institutions to provide additional training and career development while testing ways to shorten the Ph.D. training period. Funding for those and other recommendations will likely prove elusive. NIH is facing a flat budget for FY 2013. Whoever wins the presidential election will be scrambling to fulfill promises to contain spending.
“The system has strong incentives to train people, particularly when the amount of grant funding grows, because faculty have strong incentives to staff their labs with postdocs and graduate students,” Paula E. Stephan, Ph.D., professor of economics at Georgia State University and a research associate at the National Bureau of Economic Research, told GEN. “One, they’re young, and they’re full of good ideas, we think. Two, they’re very flexible, and they’re willing to work real long hours. And three, they’re cheap.”
Dr. Stephan said she’d like more research shifted to research institutes from universities: “You cannot train people without doing research, but you can do research without producing a lot more postdocs and graduate students.”
As the number of life sciences doctorates more than doubled between 1978 and 2008 (from 5,086 to 11,088), so too did the percentage of those doctorates awarded to non-U.S. citizens or permanent residents (from 15.6% to 34.4%).
There isn’t enough research to say how much that trend, along with the movement of R&D to China and India, may have squeezed the job supply for U.S.-born postdocs.
Cathee Johnson Phillips, executive director of the National Postdoctoral Association, said several factors explain the job squeeze: Universities are producing more Ph.D.s, more of whom are taking postdoc positions—often two or more. The number of Ph.D.s landing tenure-track jobs has held steady or dipped. And grad students traditionally shun nonacademic employment.
One nonacademic career option is Washington policy work, for agencies like NIH or FDA, or groups like AAAS. Another option is consulting. Firms like McKinsey, BCG, and Bain recruit at top-tier schools. The pay well exceeds the $41,000 median annual salary of academic life-science postdocs and the $47,000 median for nonacademic postdocs, according to 2008 NSF data.
“They pay you $120,000, $130,000 right off the bat,” Thihan R. Padukkavidana, Ph.D., co-founder and president of the Career Network for Science Ph.D.s at Yale, told GEN. “Why do they want these Ph.D.s? They can sell this. They say, ‘We’ve got 20 Ph.D.s working on your project.’”
Connecting postdocs with nonacademic careers is also a focus of the Johns Hopkins Postdoctoral Association’s “80/20” program, so named because while 80% of postdocs pursue academic careers, only 20% land them.
“What postdocs don’t realize is that by applying online, they will never get the job. It’s the networking that will get them the job. We hear about postdocs that have been applying for even three years and they haven’t heard anything back,” Maria Sevdali, Ph.D., who runs 80/20 and is a postdoctoral fellow in Hopkins’ Department of Molecular Biology and Genetics, told GEN.
Yet few postdocs, Dr. Sevdali said, attend association networking events: “Where are the postdocs? They’re not there networking. Then their contract is going to run out, and they’re going to freak out if they cannot find a job.”
Dr. Altimus, the association’s incoming co-president, said the job squeeze has lowered expectations among postdocs since she began Ph.D. studies in 2005.
“When you talk to postdocs, it’s always like, ‘If I get a job this year,’ ‘If this were published,’ and ‘I sure need to get an academic job,’” she said. “It’s not fair to say that’s just an academic problem. The whole country—the whole world—seems to be in a recession.”
By Alex Philippidis
Record number of journals banned for boosting impact factor with self-citations
29 Jun 2012
More research journals than ever are boosting their impact factors by self-citation.
Every year, Thomson Reuters, the firm that publishes the impact-factor rankings, takes action against the most extreme offenders by banning them from the latest lists. It lets them in again, suitably chastened, a couple of years later.
And this year, the apparent game playing has reached an all-time high. Thomson Reuters has excluded 51 journals from its 2011 list, published yesterday; 28 of the banned are new offenders, says Marie McVeigh, director of the firm’s annual Journal Citation Reports (JCR), and the others remain blacklisted from last year. The full list is available to JCR subscribers.
That’s a substantial increase on previous years: 34 journals were excluded from the 2010 lists, compared to only 26 in 2009, 20 in 2008 and just 9 in 2007.
Almost all of those banned are excluded because of excessive self-citation, although three journals — Cell Transplantation, Medical Science Monitor and The Scientific World Journal — apparently worked together to cite each other and thus raise impact factors. That “cartel” was originally reported by Phil Davis on The Scholarly Kitchen, and he has today posted a follow-up article on that ban. McVeigh says that this incident, which she calls “an anomaly in citation stacking”, is the only one of its kind that she has found.
To put all this in context, the 2011 ranking includes more than 10,500 journals, so the removed offenders make up fewer than 0.5% of the total. Still, Thomson Reuters could have kicked out more journals. Its own statistics indicate that 140 journals have had self-citations making up more than 70% of total citations in the past two years. By comparison, four-fifths of journals keep this proportion below 30%.
But McVeigh explains that some cases of disproportionate self-citation aren’t censured, if a journal’s citation counts are so small that they hardly affect the impact-factor rankings. “Before we suppress a journal we set thresholds extremely high, because we don’t want to get into the business of policing minor behaviour,” she says. “We only suppress a journal when we think that this numerically significantly alters the ranking category”. The firm does not ascribe motives to the anomalous citation patterns it sees, she adds.
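To make those thresholds concrete: a journal’s self-citation share is simply its citations to itself divided by its total incoming citations over the counting window. Below is a minimal Python sketch of how such a share might be computed and flagged. Only the 70% threshold and the “four-fifths below 30%” comparison come from the article; the journal names, counts, and flagging logic are invented for illustration and are not Thomson Reuters’ actual procedure.

```python
# A minimal sketch (not Thomson Reuters' actual method) of computing
# and flagging a journal's self-citation share. The 70% flagging
# threshold comes from the article; journal names and counts are
# hypothetical.

def self_citation_share(self_cites: int, total_cites: int) -> float:
    """Fraction of a journal's incoming citations that come from itself."""
    return self_cites / total_cites if total_cites else 0.0

# Hypothetical two-year tallies: (self-citations, total citations)
journals = {
    "Journal A": (820, 1050),  # ~78% self-citation: well over the line
    "Journal B": (90, 1200),   # ~8%: like the four-fifths of journals below 30%
    "Journal C": (40, 55),     # ~73%, but on tiny absolute counts
}

FLAG_THRESHOLD = 0.70  # the share exceeded by 140 journals, per the article

for name, (self_cites, total_cites) in journals.items():
    share = self_citation_share(self_cites, total_cites)
    print(f"{name}: {share:.0%} self-citation, over threshold: {share > FLAG_THRESHOLD}")
```

Note that the hypothetical Journal C clears the 70% line on share alone but on tiny absolute counts, which is exactly the kind of case McVeigh says the firm declines to police.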
One of the newly banned journals, the Journal of Biomolecular Structure and Dynamics (JBSD), boosted its impact factor from 1.1 in 2009 to 5.0 in 2010. Because of that sharp rise, last July it was invited by Thomson Reuters to explain its success in a ‘Rising Stars’ analysis.
In his Rising Stars response, editor-in-chief Ramaswamy Sarma told the firm that two factors explained his journal’s high citation rate. First, after a controversial paper on protein folding, 31 laboratories had written commentaries on the work, published in JBSD’s February 2011 issue, which generated a large number of JBSD citations. More importantly, since 2009 JBSD had encouraged authors to show how their papers related to other recent JBSD publications, to enhance the education of doctoral students reading the journal.
The Rising Stars article was published in September but removed within a week, Sarma says, after Thomson Reuters contacted him about the problem of self-citations.
When Nature, tipped off by the blog BioChemical Matters, contacted Sarma last July to inquire about his journal’s self-citation rate, he said that he did not follow the Thomson Reuters impact-factor statistics, preferring instead to look at usage statistics from Google Analytics, which the journal displays on its old website. (According to a June 2012 announcement, the journal is no longer published by Adenine Press and has moved to Taylor & Francis.)
Today, Sarma says that the journal has discontinued its policy of encouraging self-citations for the sake of student education, and that its self-citation rate will return to an acceptable range. The journal wants to get back onto the Thomson Reuters JCR lists, he says.
“In a time when the discipline is fragmented by so many different journals, if one wants to run a viable and useful academic journal, covering both controversy and doctoral training, then continuing self-citation is unavoidable. And you say some scientists frown on it. An easy solution is for Thomson Reuters to publish an impact factor with self-citation only, and another one without self-citation to satisfy the unhappy people,” he wrote last July — a sentiment that he still holds.
Three years ago, Thomson Reuters did start publishing impact factors with and without self-citations. But in February, Allen Wilhite, an economist at the University of Alabama in Huntsville, suggested that the firm remove journal self-citations from its impact-factor calculation altogether, to remove any incentive for editors to accrue them. His plea came after he published an article in Science reporting that one in five academics in a variety of social-science and business fields said they had been asked to pad their papers with superfluous references to get published. But, as McVeigh told me at the time, Thomson Reuters feels this would be a “fundamental change to the impact-factor calculation, to ‘respond to something that has not shown itself to be a huge problem’.”
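For readers unfamiliar with the mechanics: the standard two-year impact factor for year Y is the number of citations received in year Y by items the journal published in years Y-1 and Y-2, divided by the number of citable items it published in those two years. The Python sketch below, using invented counts, shows how the with- and without-self-citation variants diverge; it follows the textbook formula only, not the exact JCR procedure.

```python
# A minimal sketch of the standard two-year impact-factor formula,
# computed with and without self-citations, in the spirit of the two
# numbers Thomson Reuters publishes. All counts are invented; real JCR
# calculations involve details not modeled here.

def impact_factor(cites_to_recent_items: int, citable_items: int) -> float:
    """Citations received in year Y to items published in years Y-1 and
    Y-2, divided by citable items published in Y-1 and Y-2."""
    return cites_to_recent_items / citable_items if citable_items else 0.0

# Hypothetical journal, 2011 edition: citations to its 2009-2010 papers.
total_cites = 500    # all 2011 citations to 2009-2010 items
self_cites = 300     # the subset coming from the journal's own 2011 papers
citable_items = 100  # articles and reviews published in 2009-2010

print(f"With self-citations:    {impact_factor(total_cites, citable_items):.1f}")               # 5.0
print(f"Without self-citations: {impact_factor(total_cites - self_cites, citable_items):.1f}")  # 2.0
```

As the sketch shows, every self-citation adds directly to the numerator, which is precisely the incentive Wilhite wants removed.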
Will the trend of ever-greater self-citation by journals continue? That perhaps depends on the importance placed on the impact factor itself — a number that many researchers feel scientists and research funders would do well to ignore.
Wednesday, July 18, 2012
Impact Factors for journals published by Nature Publishing Group
2011 Impact Factors – released June 2012
At NPG we are committed to serving the needs of scientists and their science. We do this best by selecting and communicating the most important and valuable scientific information to the broadest possible audience. The 2011 Impact Factors reflect NPG’s success at doing this, and the exceptional authors and referees that we are privileged to work with. For a summary, please read our press release.
The table lists the 2011 Impact Factor and category ranks for journals published by NPG. Data are taken from the 2011 Journal Citation Reports, Science Edition (Thomson Reuters, 2012).
A number of journals are listed in more than one category in the Journal Citation Reports. In these cases, the category in which the journal has the highest rank is shown.