Call for better social science research transparency

January 2, 2014, University of California - Berkeley

In the Friday (Jan. 3) edition of the journal Science, an interdisciplinary group is calling on scholars, funders, journal editors and reviewers to adopt more stringent and transparent standards to give social science research more credibility, substance and impact.

The authors, led by a University of California, Berkeley, economist, hope to change a set of practices that they contend has contributed to a distorted body of research that tends to exaggerate the effectiveness of programs that deal with important issues affecting millions of people including health, agriculture, education and environmental policy.

They cite as an example of flawed research a 2010 paper by Harvard University economists Carmen Reinhart and Kenneth Rogoff. It concluded that when gross external debt hits 60 percent of gross domestic product, a country's annual growth declines 2 percent, and that when debt exceeds 90 percent, growth is roughly cut in half. Budget hawks seized on its conclusions until other economists reviewing the work found coding errors, selective exclusion of available data, and unconventional weighting of summary statistics.
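The kind of analytic choice at issue can be sketched with a toy calculation (hypothetical numbers, not Reinhart and Rogoff's actual data): whether one pools all country-year observations or averages within each country first can flip the sign of the apparent result, which is why disclosure of such weighting decisions matters.

```python
# Hypothetical annual growth rates (percent) for three countries in a
# high-debt category -- illustrative numbers only, not real data.
growth = {
    "A": [2.0, 2.5, 2.2],   # several country-years of modest growth
    "B": [-7.9],            # a single outlier year
    "C": [1.0, 1.5],
}

# Pooled mean: every country-year observation counts equally.
all_years = [g for years in growth.values() for g in years]
pooled_mean = sum(all_years) / len(all_years)

# Country-weighted mean: average each country first, then average the
# country averages -- one unusual year can dominate a whole country.
country_means = [sum(y) / len(y) for y in growth.values()]
weighted_mean = sum(country_means) / len(country_means)

print(f"pooled mean:           {pooled_mean:+.2f}%")   # positive
print(f"country-weighted mean: {weighted_mean:+.2f}%") # negative
```

With these invented figures, pooling yields slightly positive average growth while country-weighting yields negative growth, even though the underlying data are identical.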

To prevent such errors from working their way into public policymaking, the 19 authors of the Science paper recommend key changes that include:

  • Documenting and disclosing primary information about data collection and analysis.
  • Preparing and registering pre-analysis plans to distinguish hypothesis testing from exploratory research. These written plans provide a step-by-step account of how a researcher will analyze data before they have seen the data.
  • Archiving and sharing research materials, plans and data through open channels that enable independent researchers to test and extend reported results. The materials can be made public when research is completed.

"Even if we believe that we already have enough self-correction in place, our research will be more credible if we institute these three approaches. We need to raise our game," said UC Berkeley economist Edward Miguel, the lead author of the Science article, "Promoting Transparency in Social Science Research."

"Publish or Perish"

The authors largely blame a "publish or perish" reward structure for academics, saying the odds of getting a study published improve if a scholar can promise novel, "theoretically tidy" or statistically significant results rather than more nuanced, mixed or perplexing findings. They also blame scholarly journals and funders for lax oversight of mistakes and confusing information.

"The program suggested by our article would not fundamentally change the nature of our research," said David Laitin, a political science professor at Stanford University and a co-author of the article. "Rather, it would have small implications in the way we collect and report our data. It would also invite us to put more attention to the replication of well-known findings rather than investigations of new factors. But these small changes in our practices could have large effects in revising what we thought were well-established findings."

Miguel said the push toward greater transparency also has been influenced by increasing use of better research designs, new "big data" software tools, growing interest by governments and advocacy groups for evidence-based policy making, and public insistence on greater transparency.

Changes underway

A few institutions are already taking similar steps to those proposed in the Science article. The American Political Science Association adopted guidelines in 2012 making it an ethical obligation for researchers to back up their claims by making their data accessible and being clear about how they produce their results. Related measures have been taken by several psychological journals, the U.S. Office of Management and Budget, and the American Economic Association, with its design registry of randomized trials.

Meanwhile, the Center for Open Science has established an online collaboration tool called the Open Science Framework that enables research teams to easily register their hypotheses and pre-analysis plans, and to make public their data.

Several of the Science authors are members of the Berkeley Initiative for Transparency in the Social Sciences (BITSS), a network launched at UC Berkeley a year ago to promote discussion and research on transparency.

"Discover and disseminate the truth"

BITSS members and Science co-authors Leif Nelson of UC Berkeley's Haas School of Business, and Joseph Simmons and Uri Simonsohn of the Wharton School of the University of Pennsylvania wrote a related 2011 paper, "False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant."

"Our goal as scientists is not to publish as many articles as we can, but to discover and disseminate truth. Many of us—and this includes the three authors of this article—often lose sight of this goal, yielding to the pressure to do whatever is justifiable to compile a set of studies that we can publish," they wrote, adding that too often researchers convince themselves that the most publishable outcome must be the best.

The Science article authors maintain that rather than stifle creativity or create undue burdens for researchers, better transparency practices will improve the scientific integrity and impact of their work.

"An open science is a more credible science," said Brian Nosek, a professor of psychology at the University of Virginia, a founder of the Open Science Framework and another of the Science article co-authors.
