A simple reward system could make crowds a whole lot wiser
There's a problem with the wisdom of crowds.
Market economies and democracies rely on the idea that whole populations know more about what is best for them than a small elite group. This knowledge is potentially so powerful it can even predict the future through stock markets, betting exchanges and special investment vehicles called prediction markets.
These markets allow people to trade "shares" in possible future outcomes, such as the winner of an upcoming election. Anyone with new information about the future has a financial incentive to spread it by buying these shares. Prediction markets now routinely inform bookmakers' odds and are quoted in news coverage of elections alongside more traditional opinion polls.
But prediction markets are having a crisis of confidence in the abilities of the crowd. They have been systematically wrong about a series of high profile political decisions, including the UK general election of 2015, the Brexit referendum and the US presidential election of 2016.
We shouldn't expect perfect accuracy on every occasion, just as we know opinion polls are often flawed. But to be wrong so consistently about such prominent events points to possible flaws in the assumptions we make about crowd intelligence. For example, people don't always act on the information they have and so it might never become part of the crowd's decision. The dynamics of crowds and markets might also stop people from paying attention to some sources of information at all.
However, there might be a way forward. My colleagues and I have come up with a model that overcomes this problem by giving people an incentive to seek out new sources of information, and an extra reason to share it.
An important question for markets is "where do individuals get their information?" Research shows that our opinions and activities very often match those of our peers. We also tend to look for information in the most obvious places, in line with everyone else.
To give an example, if you look around on any public transport in the City of London you'll probably see people holding copies of the Financial Times. This is a problem because if everyone has the same information, the crowd is no smarter than a single individual. Studies show that having a diverse collection of opinions, especially including minority views, is crucial for creating a smart group.
So why do we tend to narrow the sources of our opinions? One reason is because we have an innate desire to imitate our peers, to behave in ways that are safe and acceptable within our community. But it may also be because of a rational, profit-seeking motivation.
We studied how theoretical profit-motivated people behave when faced with the types of rewards seen in market-like situations. To do this, we created a computer simulation of a prediction market, where people received a reward for making correct predictions. Rewards were larger when fewer people guessed the right answer, just like in a prediction market or a betting exchange.
The reward an individual received was a fixed amount divided by the number of other people who made a correct prediction. This was supposed to give people an incentive to look for right answers that other people wouldn't find. But we found that people still gravitated towards a very small subset of the available information – just like London bankers with their copies of the Financial Times.
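As a minimal sketch of this payout rule (the pot size of 100 is a hypothetical value, not taken from the study): each correct predictor splits a fixed pot, so the individual reward shrinks as more people find the right answer.

```python
def market_reward(correct: bool, n_correct: int, pot: float = 100.0) -> float:
    """Reward for one participant: a fixed pot split among all correct predictors."""
    if not correct or n_correct == 0:
        return 0.0
    return pot / n_correct

# If 50 people guess correctly, each earns 2.0; if only 2 do, each earns 50.0.
```

Note that even under this rule, being correct alongside the crowd still pays something, which is why the simulated individuals kept crowding onto the same few information sources.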
The more complex the situation was, the smaller the percentage of available information people actually used. The problem was that niche, unused information, though potentially useful to the group, was so rarely profitable to the individual who possessed it that there was no incentive to seek it out.
New reward system
To counter this, we created a theoretical new prediction market system, where people would only be rewarded if they expressed accurate views but were also in the minority. For example, someone who predicted, against the consensus view, that Donald Trump would win the US election would have received a reward once the result was known. Conversely, if most people accurately predicted that the Conservative Party would win the upcoming UK election, none of them would receive any reward.
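The change to the payout rule can be sketched as follows. This is an illustrative reading, not the study's exact mechanism: here "minority" is taken to mean fewer than half of the group made the same correct prediction, and the threshold and pot size are assumptions.

```python
def minority_reward(correct: bool, n_correct: int, group_size: int,
                    pot: float = 100.0) -> float:
    """Reward paid only to correct predictors who were also in the minority."""
    if not correct or n_correct == 0:
        return 0.0
    if n_correct >= group_size / 2:
        return 0.0  # correct, but part of the majority: no payout
    return pot / n_correct  # the minority of correct predictors split the pot

# A lone correct contrarian in a group of 100 receives the whole pot,
# while a correct majority receives nothing.
```

The key difference from the standard rule is that agreeing with the crowd, even correctly, now earns nothing, so the only path to a reward runs through information the rest of the group has overlooked.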
We found that this "minority reward" system, which explicitly favours those who go against popular opinion if they turn out to be correct, produced much more accurate collective decisions. This was especially the case when the situations were complex, influenced by many factors.
Intuitively, this makes sense. If your opinion supports the existing popular view, you can't change whether the group will be correct or not. In our model, people have an incentive to go hunting for more esoteric sources of information about possible future outcomes. For example, rather than reading the Financial Times, they might follow obscure blogs, or read local newspapers looking for information on companies in the area.
They know that only by finding information that very few have access to will they have a chance to correctly go against the prevailing wisdom. This encourages the whole group to bring together a much wider set of information, leading to more accurate collective decisions.
Our results are so far confined to a theoretical model, but they give us an insight into why current forms of prediction markets may be prone to failure, and how we might try to improve them in future. We hope that these insights will be used to create more accurate prediction markets, as we could all benefit from better collective foresight.
Better predictions and collective decision making could help society decide which political ideas will or won't work. Improving the ability of stock markets to predict which companies and ideas will do well could improve the return on investment and generate greater economic growth. Even academia is a large-scale exercise in collective wisdom. If changing the way that researchers are rewarded can improve the wisdom of this crowd, it could lead to more important scientific discoveries.