Framework provides new tools to systematically build cooperation; scientists extend theory of repeated games
Rewriting the rules for understanding the evolution of cooperation took little more than a series of chance encounters between Martin Nowak, Krishnendu Chatterjee and Christian Hilbe.
Chatterjee, a computer science professor at IST Austria, mentioned the idea of stochastic games—games which can change based on players' actions—during his first visit to Harvard in 2008, and the idea sent Nowak down a years-long path to merge the concept with evolutionary dynamics.
"People who study evolution of cooperation do not use stochastic games," said Nowak, who developed the new framework in collaboration with Chatterjee, Hilbe, a post-doctoral fellow in Chatterjee's group at IST, and Stepan Simsa of the Charles University in Prague. "Instead in a sequence of repeated encounters, it is assumed that the same game with the same payoff matrix is played again and again. In a stochastic game, the game itself can change probabilistically depending on the player's actions."
The new approach, described in a July 4 paper published in Nature, models the evolution of cooperation through repeated stochastic games. In its simplest form, only a pair of games is used, one with a larger reward and the other with a smaller reward, in which players choose whether to cooperate or to defect.
Both games work in the same way: If one player defects while the other cooperates, the defector collects a larger reward, while the other person gets nothing. If both cooperate, both receive a reward, albeit one that is smaller than the reward for defecting. Under normal conditions, cooperation rarely emerges in such games, because the individually rational response is for players to defect in an effort to maximize their own reward.
The innovation is that whether or not players cooperate affects which game they subsequently play.
At the start, the players begin with the higher-value game. As long as both players cooperate, they continue to play that game, but a defection from either player results in them moving to the lower-value game. Once both players again cooperate, they might return to the higher-value game.
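The transition rule above can be sketched in a few lines of code. This is a minimal illustration, not the paper's model: the payoff numbers and the strategies are assumptions chosen so that each matrix satisfies the usual prisoner's-dilemma ordering (temptation > reward > punishment > sucker's payoff), meaning defection dominates in either game taken alone.

```python
# Illustrative payoff matrices (values are assumptions, not the paper's):
# entry (my_move, their_move) -> my payoff. In each game,
# T > R > P > S, so each game on its own favors defection.
GAME_HIGH = {("C", "C"): 3.0, ("C", "D"): 0.0,
             ("D", "C"): 5.0, ("D", "D"): 1.0}
GAME_LOW = {("C", "C"): 1.5, ("C", "D"): 0.0,
            ("D", "C"): 2.5, ("D", "D"): 0.5}

def play(strategy1, strategy2, rounds=10):
    """Play the two-state stochastic game: start in the high-value
    game, drop to the low-value game after any defection, and return
    to the high-value game after mutual cooperation."""
    game = GAME_HIGH
    hist1, hist2 = [], []
    score1 = score2 = 0.0
    for _ in range(rounds):
        m1, m2 = strategy1(hist2), strategy2(hist1)  # react to co-player
        score1 += game[(m1, m2)]
        score2 += game[(m2, m1)]
        # Transition rule: mutual cooperation -> high-value game,
        # any defection -> low-value game.
        game = GAME_HIGH if (m1, m2) == ("C", "C") else GAME_LOW
        hist1.append(m1)
        hist2.append(m2)
    return score1, score2

always_cooperate = lambda opp_history: "C"
always_defect = lambda opp_history: "D"

# Mutual cooperators stay in the high-value game every round;
# mutual defectors are stuck in the low-value game after round one.
print(play(always_cooperate, always_cooperate))  # (30.0, 30.0)
print(play(always_defect, always_defect))        # (5.5, 5.5)
```

The defectors' loss comes not only from the lower mutual-defection payoff but from being trapped in the less valuable game, which is the doubled penalty Nowak describes.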
"Amazingly even if both games are set up so that cooperation does not evolve, when we put them together, we get cooperation," Nowak said. "It's almost like a paradox."
The key to making the system work, Nowak said, is the difference in value between the two games.
"If we defect, we are destroying something, but if we cooperate we are building something," he said. "So when we cooperate, we play subsequently for something that is more valuable, and if we defect, we play subsequently for something that is less valuable.
"That's what makes the new approach exciting," Nowak added. "The idea is so simple and yet it changes everything. If you defect in the first game, you lose out twice, because your opponent will retaliate and you have to play a less valuable game."
The stochastic framework can also be applied to multi-player games, including public goods games, where it can resolve what is known as the "tragedy of the commons."
"Humans are exploiting the environment in a public goods game," Nowak said. "In the old framework, we decide to cooperate or defect, but the next day we play the same game again, and the state of the environment is always the same. But in our new theory, if we exploit the environment badly, in the next round it may be deteriorated, and then we face a less valuable public good," he added.
If the environment deteriorates sharply in response to defection, then there is a strong incentive to maintain cooperation. If the environment deteriorates slowly or not at all, then achieving cooperation is, paradoxically, more difficult.
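The public-goods version of the idea can be sketched the same way. This is an illustration under assumed parameters, not the paper's model: each player may contribute an endowment to a common pot, the pot is multiplied and shared equally, and the multiplier (standing in for the state of the environment) degrades after any defection and recovers only under full cooperation.

```python
# Sketch of a stochastic public goods game (parameters are assumptions):
# contributions are multiplied and shared; defection degrades the
# multiplier for the next round, full cooperation restores it.

def public_goods(contributions_per_round, r_high=1.8, r_low=1.2,
                 endowment=1.0):
    """`contributions_per_round`: one tuple of booleans per round,
    one entry per player (True = contribute). Returns each player's
    total payoff."""
    n_players = len(contributions_per_round[0])
    multiplier = r_high
    totals = [0.0] * n_players
    for contribs in contributions_per_round:
        pot = multiplier * endowment * sum(contribs)
        share = pot / n_players
        for i, gave in enumerate(contribs):
            totals[i] += share - (endowment if gave else 0.0)
        # The environment deteriorates unless everyone cooperated.
        multiplier = r_high if all(contribs) else r_low
    return totals

# Four players, three rounds, everyone contributes: the multiplier
# stays high and every player ends with a positive payoff.
print(public_goods([(True, True, True, True)] * 3))
# One persistent free-rider: the pot shrinks for everyone from
# round two onward.
print(public_goods([(False, True, True, True)] * 2))
```

The degraded multiplier is itself the punishment for exploitation: defecting today literally makes tomorrow's public good less valuable, which is what restores the incentive to cooperate.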
"This has interesting implications for some of the major problems humans are facing, such as climate change, environmental destruction and migration," Nowak said. "If people understand that defection today means that we will play a game with a lower payoff tomorrow, then cooperation becomes a winning strategy."
If we realize that we are at a breaking point, there is a strong rationale to cooperate.
The concept can also be used by public officials and policy makers to design programs that promote cooperation: if players cooperate, they have the chance to move to more valuable games.
More information: Christian Hilbe et al, Evolution of cooperation in stochastic games, Nature (2018). DOI: 10.1038/s41586-018-0277-x
Journal information: Nature
Provided by Harvard University