Beware those scientific studies—most are wrong, researcher warns

July 5, 2018 by Ivan Couronne
Seafood is one of many food types that have been linked with lower cancer risks

A few years ago, two researchers took the 50 most-used ingredients in a cookbook and studied how many had been linked with a cancer risk or benefit, based on a variety of studies published in scientific journals.

The result? Forty out of 50, including salt, flour, parsley and sugar. "Is everything we eat associated with cancer?" they wondered in a 2013 article based on their findings.

Their investigation touched on a known but persistent problem in the research world: too few studies have large enough samples to support generalized conclusions.

But pressure on researchers, competition between journals and the media's insatiable appetite for new studies announcing revolutionary breakthroughs have meant that such articles continue to be published.

"The majority of papers that get published, even in serious journals, are pretty sloppy," said John Ioannidis, professor of medicine at Stanford University, who specializes in the study of scientific studies.

This sworn enemy of bad research published a widely cited article in 2005 entitled: "Why Most Published Research Findings Are False."

Since then, he says, only limited progress has been made.

Some journals now insist that authors pre-register their research protocol and supply their raw data, which makes it harder for researchers to manipulate findings in order to reach a certain conclusion. It also allows others to verify or replicate the studies.

And when studies are replicated, they rarely come up with the same results: only a third of the 100 studies published in three top psychology journals could be successfully replicated in a large 2015 test.

Medicine, epidemiology, population science and nutritional studies fare no better, Ioannidis said, when attempts are made to replicate them.

"Across biomedical science and beyond, scientists do not get trained sufficiently on statistics and on methodology," Ioannidis said.

Too many studies are based on just a handful of individuals, making it difficult to draw wider conclusions because such small samples have little hope of being representative.
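As a rough illustration of why small samples mislead, the sketch below draws repeated samples of different sizes from the same made-up population and compares how far their averages stray from the true value. The population mean, spread and sample sizes are illustrative assumptions, not figures from any study cited here.

```python
# Illustrative only: sample averages drawn from the SAME hypothetical
# population wander far more when the sample is small than when it is large.
import numpy as np

rng = np.random.default_rng(0)
true_mean = 100.0       # assumed population average (made up)
population_sd = 15.0    # assumed population spread (made up)

for n in (10, 100, 10_000):  # a tiny study versus progressively larger ones
    estimates = [rng.normal(true_mean, population_sd, n).mean()
                 for _ in range(1_000)]
    print(f"n = {n:>6}: sample means typically stray by about "
          f"{np.std(estimates):.2f} from the true value of {true_mean}")
```

The typical error shrinks roughly with the square root of the sample size, which is why a study of ten people can point almost anywhere while a study of ten thousand rarely does.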

The wine museum in Bolgheri, Italy: a famous 2013 study on the benefits of the Mediterranean diet against heart disease had to be retracted in June because not all participants were picked at random

Coffee and Red Wine

"Diet is one of the most horrible areas of biomedical investigation," professor Ioannidis added—and not just due to conflicts of interest with various food industries.

"Measuring diet is extremely difficult," he stressed. How can we precisely quantify what people eat?

In this field, researchers often trawl huge databases in search of correlations, without so much as a starting hypothesis.
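As a rough sketch of why hypothesis-free trawling produces so many spurious links, the simulation below tests 50 hypothetical "ingredients" that have no real effect on a health outcome and still flags a few of them as "significant" at the conventional p < 0.05 threshold. The cohort size, the number of exposures and the cutoff are all illustrative assumptions, not drawn from the studies mentioned above.

```python
# Illustrative only: 50 hypothetical exposures with NO true effect on the
# outcome, tested one by one. Purely by chance, a few come out
# "statistically significant" at p < 0.05 -- the multiple-comparisons trap.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_people = 500      # assumed cohort size (made up)
n_exposures = 50    # assumed number of "ingredients", none of which matter

consumption = rng.normal(size=(n_people, n_exposures))  # random "diet" data
outcome = rng.normal(size=n_people)                     # random, unrelated outcome

hits = []
for i in range(n_exposures):
    r, p = stats.pearsonr(consumption[:, i], outcome)
    if p < 0.05:                    # the conventional significance cutoff
        hits.append((i, round(r, 3), round(p, 4)))

print(f"{len(hits)} of {n_exposures} null 'ingredients' look significant:")
for idx, r, p in hits:
    print(f"  exposure {idx}: r = {r}, p = {p}")
```

With 50 independent tests at the 0.05 level, two or three false alarms are expected on average even when nothing is going on, which is one reason a lone headline-friendly correlation deserves skepticism.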

Even when the methodology is good, with the gold standard being a study where participants are chosen at random, the execution can fall short.

A famous 2013 study on the benefits of the Mediterranean diet against heart disease had to be retracted in June by the most prestigious of medical journals, the New England Journal of Medicine, because not all participants had been recruited at random; the results were subsequently revised downwards.

So what should we take away from the flood of studies published every day?

Ioannidis recommends asking the following questions: is this something that has been seen just once, or in multiple studies? Is it a small or a large study? Is this a randomized experiment? Who funded it? Are the researchers transparent?

These precautions are fundamental in medicine, where bad studies have contributed to the adoption of treatments that are at best ineffective, and at worst harmful.

In their book "Ending Medical Reversal," Vinayak Prasad and Adam Cifu offer terrifying examples of practices adopted on the basis of studies that went on to be invalidated, such as opening a brain artery with stents to reduce the risk of a new stroke.

Studies regularly single out the consumption of red wine as either a cancer risk or a way to fend off the disease

It was only after 10 years that a robust, randomized study showed that the practice actually increased the risk of stroke.

The solution lies in a collective tightening of standards by all players in the research world: not just journals but also universities and public funding agencies. But these institutions all operate in competitive environments.

"The incentives for everyone in the system are pointed in the wrong direction," Ivan Oransky, co-founder of Retraction Watch, which covers the withdrawal of scientific articles, tells AFP. "We try to encourage a culture, an atmosphere where you are rewarded for being transparent."

The problem also comes from the media, which according to Oransky needs to better explain the uncertainties inherent in scientific research, and resist sensationalism.

"We're talking mostly about the endless terrible studies on coffee, chocolate and red wine," he said.

"Why are we still writing about those? We have to stop with that."
