Half of biomedical research studies don't stand up to scrutiny – and what we need to do about that

How much of the research in these journals could be reproduced? Credit: Tobias von der Haar, CC BY

What if I told you that half of the studies published in scientific journals today – the ones upon which news coverage of medical advances is often based – won't hold up under scrutiny? You might say I had gone mad. No one would ever tolerate that kind of waste in a field as important – and expensive, to the tune of roughly US$30 billion in federal spending per year – as biomedical research, right? After all, this is the crucial work that hunts for explanations for diseases so they can better be treated or even cured.

Wrong. The rate of what is referred to as "irreproducible research" – more on what that means in a moment – exceeds 50%, according to a recent paper. Some estimates are even higher. In one analysis, just 11% of preclinical cancer research studies could be confirmed. That means that an awful lot of "promising" results aren't very promising at all, and that a lot of researchers who could be solving critical problems based on previously published work end up just spinning their wheels.

So what gives? And how can we fix this problem?

What worms tell us about reproducibility

Although definitions of reproducibility and replicability vary somewhat, for a study to be reproducible, another researcher needs to be able to replicate it, meaning use the same data and analysis to come to the same conclusions. There are lots of reasons why a study may not pass the replication test, from flat-out errors to a failure to adequately describe the methodology used. A researcher may have forgotten a step when writing up the methodology, for example, counted data in the wrong category, or written the wrong code for a statistics program.
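The "wrong code" failure mode is easy to underestimate. As a purely hypothetical sketch (the data and the age threshold here are invented for illustration, not drawn from any study), a single mistyped character can silently change which subjects get counted and therefore what a paper reports:

```python
# Hypothetical illustration: how a one-character coding slip can alter
# a study's numbers. All values below are invented for this example.

ages = [17, 18, 18, 19, 25, 31, 44, 52]

# Intended rule: "adult" means age >= 18.
adults_correct = [a for a in ages if a >= 18]

# Off-by-one bug: the analyst types > instead of >=, so 18-year-olds
# are silently dropped from the adult group.
adults_buggy = [a for a in ages if a > 18]

print(len(adults_correct))  # 7 subjects in the intended analysis
print(len(adults_buggy))    # 5 subjects in the buggy one
```

Nothing crashes and no warning appears; the buggy analysis simply produces different statistics, which is exactly why a replication attempt using the same data and the stated method can fail to match the published result.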

Hmmm, I didn’t expect those results. Credit: www.shutterstock.com

Faking results is another reason, but it's not nearly as common as others. Out-and-out fraud like that, or suspected fraud, is the reason for a bit fewer than half of the 400-plus retractions per year. But there are something like two million papers published annually, so the vast majority of studies containing irreproducible data are never retracted. And most scientists would agree that they shouldn't be; after all, most science is overturned one way or another over time. Retraction should be reserved for the most severe cases. That doesn't mean irreproducible papers shouldn't be somehow marked, though.

Here's a fresh example of a study that turned out not to be reproducible, because the results couldn't be replicated: as Ben Goldacre relates in BuzzFeed, two economists published a massive study in 2004 claiming that a "deworm everyone" approach in Kenya "improved children's health, school performance, and school attendance," even among children several miles away who didn't get deworming pills. Endorsed by the World Health Organization, it helped set policy that affects hundreds of millions of children annually in the developing world.

But now researchers have published papers describing two failures to replicate the original findings. Many of the original conclusions just didn't hold up, although some did.

That, as Goldacre explains, "is definitely problematic." But the reanalyses were possible only because the original authors "had the decency, generosity, strength of character, and intellectual confidence to let someone else peer under the bonnet" – a rare situation indeed.

The fixes

A girl takes her deworming tablet. Credit: Save the Children, CC BY-NC-ND

Researchers are aware of the reproducibility problem, and some are trying to fix it. In response to alarming findings about the reproducibility of basic cancer research, a program called the Reproducibility Initiative has started providing "both a mechanism for scientists to independently replicate their findings and a reward for doing so." It's chosen 50 studies for independent validation – or not, since there's certainly a chance the initial results won't be reproducible. Those working on the project will perform the same kind of analyses that researchers did in the worm study replications. A similar effort has been ongoing in psychology, and other projects are under way in the social sciences.

All of these efforts will require scientists to share data, as the authors of the deworming study did. That has been a requirement in human studies for some years now, by many funders, and it's encouraged by many journal editors. And while it's not met 100% of the time, compliance is growing. Some basic science journals are moving to make it a requirement, too.

Research data need to be an open book. Credit: Brenda Clarke, CC BY

Perhaps more important, however, is that researchers – and the public that funds many of them – realize that science is a process, and that all knowledge is provisional. "It's not just naive to expect that all research will be perfectly free from errors," writes Goldacre, "it's actively harmful." Journalists, take note.

Translated into policy, that means valuing replication efforts, which right now are essentially unfunded and hardly ever published. If we want scientists to validate others' work, we'll need to create grants to do that. That means digging up additional funding, but replicating a study costs a tiny fraction of what the original work does. Funding new studies based on those that turn out to be irreproducible…well, now that's expensive.


This story is published courtesy of The Conversation (under Creative Commons-Attribution/No derivatives).

Citation: Half of biomedical research studies don't stand up to scrutiny – and what we need to do about that (2015, July 28) retrieved 18 August 2019 from https://phys.org/news/2015-07-biomedical-dont-scrutiny.html