People put greater trust in news that leads them to be more politically extreme, says study


People not only think political news is likelier to be true if it reinforces their ideological biases, but also tend to trust news more if it leads them to adopt more extreme (and even incorrect) beliefs, finds a new study by a UCL researcher.

The study, published in American Economic Journal: Microeconomics, found that when people were presented with new information on politically sensitive topics, individuals on both sides of the political spectrum struggled to detect whether the information was true or not, and were biased towards trusting news that aligned with their political beliefs.

It also found that when given news that could plausibly be true or false, people trusted news that drove them to become even more extreme than they already were, which could fuel greater political polarization.

Study author Dr. Michael Thaler (UCL Economics) said, "In situations where people are uncertain about whether news is true or not, they often decide its veracity based on whether they want it to be true rather than whether it is actually true, driven by a bias called motivated reasoning. In the context of political beliefs, motivated reasoning leads people to disagree not just on policies or interpretation, but on basic facts about the world."

To measure how much people's political beliefs affected their perception of news veracity, Dr. Thaler devised an experiment examining how people interpreted new information after answering a series of factual questions. He recruited an online sample of 1,300 people in the United States representing a range of political beliefs.

He asked them a series of questions with numerical answers, such as "By what percent did the murder rate go up or down during Obama's presidency?" These questions were chosen because the answers were expected to reflect deeply held political beliefs. For the question above, those with pro-Republican beliefs tended to say that the murder rate had increased, while those with pro-Democrat beliefs said the opposite.

After they answered, Dr. Thaler presented participants with a new piece of "information" related to the question that could be either true or false. The information was very simple, stating only whether their initial numerical answer was too high or too low. He then asked the participants to assess whether that new piece of information was true or false.

After running the experiment, Dr. Thaler found not only that Democrats and Republicans disagreed about the answers to these questions, but also that people were nine percentage points more likely to say that news was true if it made their answers more politically extreme than their initial answer, even though this news was less likely to be true. In the example above, Democrats were more likely to over-trust information that drove them to further underestimate the murder rate during Obama's presidency, while Republicans tended to over-trust information that led them to further overestimate it.
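To make the design concrete, below is a minimal simulation sketch in Python of the experiment's structure as described above. The specific numbers are illustrative assumptions rather than the paper's parameters or estimates: a 50% chance that the message shown is true, and a flat 9-point bump in stated trust when the message pushes a participant toward their politically preferred extreme. The function names and values are hypothetical and only meant to show how motivated reasoning would surface as asymmetric trust in this setup.

```python
import random

# Illustrative sketch of the experiment's structure, under assumed parameters
# (not taken from the paper): each participant guesses a number, is shown a
# binary message ("your guess was too low" / "too high") that is true with an
# assumed probability of 0.5, and then rates how likely the message is true.
# A motivated reasoner inflates that rating when believing the message would
# push their answer in their politically preferred direction.

random.seed(0)

TRUE_VALUE = 0.0          # e.g., the actual percent change in the murder rate
MOTIVATED_SHIFT = 0.09    # assumed 9-point bump in stated trust (illustrative)

def run_participant(preferred_direction):
    """preferred_direction: +1 if higher numbers fit the participant's
    politics, -1 if lower numbers do."""
    guess = random.gauss(TRUE_VALUE + 5 * preferred_direction, 10)

    # The message shown is true half the time (assumed here).
    message_is_true = random.random() < 0.5
    true_message = "too high" if guess > TRUE_VALUE else "too low"
    false_message = "too low" if true_message == "too high" else "too high"
    shown = true_message if message_is_true else false_message

    # Believing a "too low" message means revising one's answer upward, so it
    # pushes toward the preferred extreme only if higher numbers are preferred.
    pushes_preferred = (shown == "too low") == (preferred_direction > 0)

    stated_trust = 0.5 + (MOTIVATED_SHIFT if pushes_preferred else 0.0)
    return stated_trust, pushes_preferred

# Average stated trust, split by whether the message pushes the participant
# toward their preferred extreme.
trust_toward, trust_away = [], []
for _ in range(10_000):
    direction = random.choice([+1, -1])
    trust, toward = run_participant(direction)
    (trust_toward if toward else trust_away).append(trust)

print(f"mean trust when news pushes toward preferred extreme: "
      f"{sum(trust_toward) / len(trust_toward):.2f}")
print(f"mean trust otherwise: {sum(trust_away) / len(trust_away):.2f}")
```

Because the messages are true only half the time regardless of direction, any gap between the two printed averages reflects the assumed bias rather than the actual accuracy of the news, which mirrors the pattern the study reports.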

Dr. Thaler said, "The participants trusted 'fake news' that reinforced and exacerbated their biases more than 'true news' that brought them closer to the correct answer, even though they had monetary incentives to get the answer right. This tendency shows that people are prone to take up even more extreme and polarized positions if given the opportunity."

The results suggest that when news veracity is ambiguous, people judge unfamiliar information as more likely to be true if it aligns with their preexisting political beliefs. Moreover, when resolving this ambiguity, they are more likely to move towards more extreme positions that align with those beliefs.

Dr. Thaler found that a wide array of politically sensitive topics prompted this motivated reasoning, including immigration, income mobility, crime, gender, climate change, and gun laws. He also found that these effects held across demographic groups, including gender, age, education and religious affiliation.

Dr. Thaler said, "One of the more surprising findings from my study is that it's not just that people are more inclined to believe false information that they want to believe, they also tend to want to go even further. Something is tethering them to the center, but if you give them the flexibility to interpret news as 'true' or 'fake,' they tend to move even further to the extremes."

More information: Michael Thaler, The Fake News Effect: Experimentally Identifying Motivated Reasoning Using Trust in News, American Economic Journal: Microeconomics (2024). DOI: 10.1257/mic.20220146
