Do pressures to publish increase scientists' bias?

Apr 21, 2010

The quality of scientific research may be suffering because academics are being increasingly pressured to produce 'publishable' results, a new study suggests. A large analysis of papers in all disciplines shows that researchers report more "positive" results for their experiments in US states where academics publish more frequently. The results are reported in the online, open-access journal PLoS ONE on April 21st, by Daniele Fanelli, of the University of Edinburgh.

The condition of today's scientists is commonly described by the expression "publish or perish". Their careers are increasingly evaluated based on the sheer number of papers listed in their CVs, and by the number of citations received - a measure of scientific quality that is hotly debated. To secure jobs and funding, therefore, researchers must publish continuously. The problem is that papers are likely to be accepted by journals and to be cited depending on the results they report.

"Scientists face an increasing conflict of interest, torn between the need to be accurate and objective and the need to keep their careers alive," says Fanelli. "While many studies have shown the deleterious effects of financial conflicts of interest in biomedical research, no one has looked at this much broader conflict, which might affect all fields."

Dr Fanelli analysed over 1,300 papers, in all disciplines from physics to sociology, that declared having tested a hypothesis and whose principal author was based in a U.S. state. Using data from the National Science Foundation, he then tested whether the papers' conclusions were linked to the states' productivity, measured by the number of papers published on average by each academic.

Findings show that papers whose authors were based in more "productive" states were more likely to support the tested hypothesis, independent of discipline and funding availability. This suggests that scientists working in more competitive and productive environments are more likely to make their results look "positive". It remains to be established whether they do this by simply writing the papers differently or by tweaking and selecting their data.
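The analysis described above boils down to coding each paper as "positive" (it supported its tested hypothesis) or not, and relating that outcome to the per-capita publication output of the author's state. A minimal sketch of that idea, using entirely made-up data (the state labels, productivity figures, and outcomes below are illustrative, not from the study):

```python
# Hypothetical sketch of the kind of analysis described in the article.
# Each record is (state, papers_per_academic, supported_hypothesis),
# where supported_hypothesis is 1 if the paper reported a "positive"
# result and 0 otherwise. All values are invented for illustration.
papers = [
    ("A", 1.2, 1), ("A", 1.2, 1), ("A", 1.2, 0),   # more "productive" state
    ("B", 0.6, 1), ("B", 0.6, 0), ("B", 0.6, 0),   # less "productive" state
]

def positive_rate(records, state):
    """Share of papers from `state` that supported their hypothesis."""
    outcomes = [hit for s, _, hit in records if s == state]
    return sum(outcomes) / len(outcomes)

# Positive-result rate per state; the study's finding amounts to this
# rate rising with the state's papers-per-academic figure.
rates = {state: positive_rate(papers, state) for state in ("A", "B")}
print(rates)
```

The real study used a far larger sample and controlled for discipline and funding availability, but the core comparison has this shape: a per-state positive-result rate set against a per-state productivity measure.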

"The outcome of an experiment depends on many factors, but the productivity of the US state of the researcher should not, in theory, be one of them," explains Fanelli. "We cannot exclude that researchers in the more productive states are smarter and better equipped, and thus more successful, but this is unlikely to fully explain the marked trend observed in this study."

Positive results were less than half the total in Nevada, North Dakota and Mississippi. At the other extreme, states including Michigan, Ohio, District of Columbia and Nebraska had between 95% and 100% positive results, a rate that seems unrealistic even for the most outstanding institutions.

These conclusions could apply to all scientifically advanced countries. "Academic competition for funding and positions is increasing everywhere," says Fanelli. "Policies that rely too much on cold measures of productivity might be lowering the quality of science itself."

More information: Fanelli D (2010) Do Pressures to Publish Increase Scientists' Bias? An Empirical Support from US States Data. PLoS ONE 5(4): e10271. doi:10.1371/journal.pone.0010271

User comments: 2

JayK
1 / 5 (2) Apr 21, 2010
Their careers are increasingly evaluated based on the sheer number of papers listed in their CVs, and by the number of citations received - a measure of scientific quality that is hotly debated.

Citation for "hotly debated"? It looks, from the publication's citations, that it relies heavily on quote mining and insinuation.
denijane
not rated yet May 04, 2010
Finally someone to discuss the conflict of interest in all fields, rather than those connected with the industry.

It's so obvious for anyone who publishes papers - if you agree with the mafia (the group sharing a vision of certain theory), you'll more likely be approved to publish in high profile journal (and approved quickly). Fact.

Just open the "Phys. Rev"-s. So many articles without significant scientific value get published based on mere agreement with theories, while so many others never get that chance because they do not agree at all, or they do not agree enough.

I don't see how this is good for science. And it should be said openly, because sooner or later, it will go out and decrease even more the public confidence in science.
