Predicting the future with the wisdom of crowds

June 23, 2017 by Pamela Tom

Forecasters often overestimate how good they are at predicting geopolitical events—everything from who will become the next pope to who will win the next national election in Taiwan.

But UC Berkeley Haas management professor Don Moore and a team of researchers found a way to improve those forecasts: training ordinary people to become "superforecasters" who make more accurate and better-calibrated predictions over time.

The team, working on the Good Judgment Project, had the perfect opportunity to test its forecasting methods during a four-year geopolitical forecasting tournament sponsored by the U.S. Intelligence Advanced Research Projects Activity (IARPA). The tournament, which began in 2011, aimed to improve geopolitical forecasting and intelligence analysis by tapping the wisdom of the crowd. Moore's team proved so successful in the tournament's first years that the other four teams were dropped, leaving the Good Judgment Project as the only funded team remaining.

Some of the results are published in the Management Science article "Confidence Calibration in a Multiyear Geopolitical Forecasting Competition." Moore's co-authors, who combine best practices from psychology, economics, and behavioral science, include the husband-and-wife team of Barbara Mellers and Philip Tetlock of the University of Pennsylvania, who co-lead the Good Judgment Project with Moore; Lyle Ungar and Angela Minster, also of the University of Pennsylvania; Samuel A. Swift, a data scientist at investment strategy firm Betterment; Heather Yang of MIT; and Elizabeth Tenney of the University of Utah.

The study differs from previous research on overconfidence in forecasting because it examines accuracy in forecasting over time, using a huge and unique data set gathered during the tournament: 494,552 forecasts by 2,860 forecasters predicting the outcomes of hundreds of events.

Wisdom of the crowd

Study participants, a mix of scientists, researchers, academics, and other professionals, weren't experts on what they were forecasting but rather educated citizens who stayed current on the news.

Their training included four components:

  • Considering how often, and under what circumstances, events similar to the one in question had occurred.
  • Averaging across opinions to exploit the wisdom of the crowd (a minimal sketch of this follows the list).
  • Using mathematical and statistical models when applicable.
  • Reviewing biases in forecasting—in particular the risk of both overconfidence and excess caution in estimating probabilities.

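As a rough illustration of the second training component, here is a minimal Python sketch of averaging individual probability estimates into a single crowd forecast. The forecaster numbers are hypothetical, and the simple unweighted mean is an assumption made for illustration, not necessarily the aggregation rule the project actually used.

    # Combine several forecasters' probability estimates for the same
    # yes/no question into one "wisdom of the crowd" forecast.
    def crowd_forecast(probabilities):
        """Return the unweighted mean of individual probability estimates."""
        if not probabilities:
            raise ValueError("need at least one forecast")
        return sum(probabilities) / len(probabilities)

    # Five hypothetical forecasters weighing in on the same question.
    estimates = [0.55, 0.70, 0.62, 0.48, 0.66]
    print(f"Crowd forecast: {crowd_forecast(estimates):.2f}")  # prints 0.60
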
Over time, this group answered a total of 344 specific questions about geopolitical events. All of the questions had clear resolutions, needed to be resolved within a reasonable time frame, and had to be relatively difficult to predict, "tough calls," as the researchers put it. Events judged to have less than a 10 percent or more than a 90 percent chance of occurring were deemed too easy for the forecasters.

The majority of the questions targeted a specific outcome, such as "Will the United Nations General Assembly recognize a Palestinian state by September 30, 2011?" or "Will Cardinal Peter Turkson be the next pope?"

The researchers also wanted to measure whether participants considered themselves experts on the questions, so they asked for self-assessments. In the first year, participants rated their expertise on each question on a 1–5 scale; in the second year, they placed themselves in "expertise quintiles" relative to others answering the same questions; and in the final year, they indicated their confidence level, from "not at all" to "extremely," for each forecast.

Training: Astoundingly effective

By the end of the tournament, researchers found something surprising. On average, the group members reported that they were 65.4 percent sure that they had correctly predicted what would happen. In fact, they were correct 63.3 percent of the time, an overall overconfidence of just 2.1 percentage points. "Our results find a remarkable balance between people's confidence and accuracy," Moore said.
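
The calibration arithmetic behind those figures is straightforward: overconfidence is the gap between average stated confidence and the observed hit rate. A minimal sketch, using the averages quoted above:

    # Overconfidence = average stated confidence - observed hit rate.
    # The two inputs are the averages reported in the article.
    mean_confidence = 0.654  # forecasters' average stated confidence
    hit_rate = 0.633         # fraction of predictions that were correct

    overconfidence = mean_confidence - hit_rate
    print(f"Overconfidence: {overconfidence:+.1%}")  # prints +2.1%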

In addition, as participants gathered more information, both their confidence and their accuracy improved.

In the first month of forecasting during the first year, confidence was 59 percent and accuracy was 57 percent. By the final month of the third year, confidence had increased to 76.4 percent and accuracy reached 76.1 percent.
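
One way to see that trend is to group forecasts by period and compare average confidence with accuracy in each one. The sketch below does this over invented records; the (month, confidence, correct) layout and the data are assumptions for illustration, not the tournament's actual format.

    from collections import defaultdict

    # Each record: (month_index, stated_confidence, was_correct).
    # These records are invented solely to illustrate the computation.
    forecasts = [
        (1, 0.60, True), (1, 0.55, False), (1, 0.62, True),
        (36, 0.78, True), (36, 0.75, True), (36, 0.76, False),
    ]

    by_month = defaultdict(list)
    for month, confidence, correct in forecasts:
        by_month[month].append((confidence, correct))

    # Average stated confidence vs. fraction correct, per month.
    for month in sorted(by_month):
        records = by_month[month]
        mean_conf = sum(c for c, _ in records) / len(records)
        accuracy = sum(ok for _, ok in records) / len(records)
        print(f"month {month:2d}: confidence {mean_conf:.0%}, accuracy {accuracy:.0%}")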

The researchers called the training the group received "astoundingly effective."

"What made our forecasters good was not so much that they always knew what would happen, but that they had an accurate sense of how much they knew," the study concluded.

The research also broke new ground by being quantitative in a field that generally produces qualitative studies.

"We see potential value not only in forecasting world events for intelligence agencies and governmental policy-makers, but innumerable private organizations that must make important strategic decisions based on forecasts of future states of the world," the researchers concluded.

More information: Don A. Moore et al., "Confidence Calibration in a Multiyear Geopolitical Forecasting Competition," Management Science (2016). DOI: 10.1287/mnsc.2016.2525

Comments

Chris_Reeve
3 / 5 (2) Jun 23, 2017
This will prove important ...

"What made our forecasters good was not so much that they always knew what would happen, but that they had an accurate sense of how much they knew"

... because we have a very similar problem with scientific controversies.
marcelof01
not rated yet Jun 24, 2017
Geopolitical events are decided by the mindset of voters, which is shaped by "public opinion" (the media), so there's nothing otherworldly about crowds accurately predicting those events. The media also manufactures international public opinion, and this pretty much links the mindsets of people from different parts of the world, so again there's no magic there either.
Lex Talonis
not rated yet Jun 24, 2017
Jesus the reanimated Jewish zombie gives his decrees by telepathy - guides millions.

Mahomed the holy profit time travels by magic flying donkey - guides millions through fairy tales.

And Yahwee the magical deity tells his "chosen special people" from behind the curtain that they must murder everyone who doesn't do what he tells them to do...

Yes - divine guidance first, second or even multivarianted multihanded versions of it are truly profound.

c0y0te
not rated yet Jun 24, 2017
Why train people? Isn't machine learning a perfect match for this kind of thing?
Chris_Reeve
1 / 5 (1) Jun 25, 2017
Re: "The wisdom of crowds also said, that the evolution and global warming are bogus and that Trump will be a good president."

What the crowd determined was that Trump would be better than Hillary; that's what the question was, after all.

On global warming, the crowd clearly believes that the discipline is rife with politics. The public's sentiment was heavily influenced by the release of the ClimateGate emails, even though journalism and academia have consistently tried to downplay the importance of that event.

The evolution question may in fact be influenced by religion, but the crowd has likely noticed that evolution is an incomplete concept for explaining some things. And it is right, as epigenetics has also been mired in politics from its inception, and "junk" DNA is not in fact junk at all.
