
Most people trust accurate search results when the stakes are high, study finds

Rank (X-axis) does not affect the evaluation of trustworthiness (Y-axis, mean-centered) of accurate results. This lack of relationship is robust across experiments (columns) and for clicked results (top row, red) as well as non-clicked results (bottom row, blue). The trend lines represent the predicted change in trustworthiness ratings per unit decrease in rank fitted by the linear regression models. Credit: Scientific Reports (2024). DOI: 10.1038/s41598-024-61645-8

Using experiments with COVID-19-related queries, Cornell sociology and information science researchers found that in a public health emergency, most people pick out and click on accurate information.

Although higher-ranked results are clicked more often, they are not more trusted, and highly ranked misinformation does not damage trust in accurate results that appear on the same page.

In fact, banners warning about misinformation decrease trust in misinformation somewhat but decrease trust in accurate information even more, according to the study, "Misinformation does not reduce trust in accurate search results, but warning banners may backfire," published in Scientific Reports on May 14.

Internet users searching for health information might be vulnerable to believing, incorrectly, that the rank of a search result indicates its authority, said co-author Michael Macy, Distinguished Professor of Arts and Sciences in Sociology and director of the Social Dynamics Laboratory in the College of Arts and Sciences (A&S). "When COVID hit, we thought this problem was worth investigating."

The relationship between search result rank and misinformation is particularly important during a global pandemic because medical misinformation could be fatal, said Sterling Williams-Ceci '21, a doctoral student in information science and the paper's first author.

"Misinformation has been found to be highly ranked in audit studies of health searches, meaning accurate information inevitably gets pushed below it. So we tested whether exposure to highly ranked misinformation taints people's trust in accurate information on the page, and especially in accurate results when they are ranked below the misinformation," Williams-Ceci said.

"Our study provided hopeful evidence that people do not lose faith in everything else they see in searches when they see misinformation at the very top of the list."

Mor Naaman, professor of information science at Cornell Tech and the Cornell Ann S. Bowers College of Computing and Information Science, also contributed to the study.

Williams-Ceci designed a series of online experiments to measure how search result rank, the presence of misinformation, and the use of warning banners affect people's trust in search results related to COVID-19.

The researchers built an online interface that showed participants a page with a question about COVID-19. The researchers randomized the rank of results that contained accurate information and manipulated whether one of the top three results contained misinformation. Participants were asked to choose one result that they would click, then to rate some of the individual results they had seen on a trustworthiness scale.
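To make the design concrete, here is a minimal Python sketch of that kind of randomization; the result names, the 10-result page layout, and the 50/50 condition split are illustrative assumptions, not the authors' actual materials or code.

```python
import random

# Hypothetical stand-ins for the search results shown to participants.
ACCURATE_RESULTS = [f"accurate_{i}" for i in range(10)]
MISINFO_RESULT = "misinfo_0"

def build_results_page(include_misinfo: bool) -> list[str]:
    """Return a shuffled results page; when misinformation is included,
    it is placed at a random position among the top three results."""
    page = random.sample(ACCURATE_RESULTS, k=9 if include_misinfo else 10)
    if include_misinfo:
        page.insert(random.randrange(3), MISINFO_RESULT)
    return page

# Each participant sees one randomized page, chooses a result to click,
# and then rates some of the results they saw on a trustworthiness scale.
page = build_results_page(include_misinfo=random.random() < 0.5)
clicked = random.choice(page)          # stand-in for the participant's choice
ratings = {result: None for result in page[:5]}  # ratings collected afterward
```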

The experiments showed that misinformation was highly distrusted in comparison with accurate information, even when shown at or near the top of the results list. In fact, contrary to assumptions in prior work, there was no general relationship between results' ranking on the page and how trustworthy people considered them to be.
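The figure's trend lines come from linear regressions of trustworthiness ratings on rank. The sketch below, which is not the authors' analysis code and uses simulated toy data, shows the kind of model that finding refers to: when the estimated slope is near zero, rank does not predict how trustworthy accurate results are judged to be.

```python
import numpy as np

# Simulated stand-ins for the observed data (assumptions for illustration):
# ranks 1-10 on the page and mean-centered trustworthiness ratings.
rng = np.random.default_rng(0)
rank = rng.integers(1, 11, size=500)
trust = rng.normal(0.0, 1.0, size=500)

# Fit the trend line trust ~ rank; the slope is the predicted change in
# trustworthiness per unit change in rank, as in the figure caption.
slope, intercept = np.polyfit(rank, trust, deg=1)
print(f"estimated change in trust per unit of rank: {slope:.3f}")
```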

"Misinformation was rarely clicked and highly distrusted: Only 2.6% of participants who were exposed to inaccurate results clicked on these results," the researchers wrote.

Further, the presence of misinformation, even when it showed up near the top of the results, did not cause people to distrust the accurate results they had seen below it.

Another experiment introduced warning banners on the search pages. These banners appeared at the top of the page for some participants and warned that unreliable information may be present in the results without identifying what this information said.

Google currently uses banners like these, but few studies have explored how they affect decisions about what information to trust in online searches, Williams-Ceci said.

The researchers found that one of these banners had an unanticipated backfire effect: It significantly decreased people's trust in accurate results, while failing to decrease their trust in misinformation results to the same degree.

Overall, the results assuage fears that search engines diminish people's trust in authoritative sources, such as the Centers for Disease Control and Prevention, even if these sources' information is not at the top of the page, the researchers concluded. Macy said this is among the first studies to show that combating misinformation with warning banners in search engines has mixed outcomes, potentially harmful to getting accurate results in front of users.

"The backfire effect of warning labels is very alarming, and further research is needed to learn more about why the labels backfire and how misinformation can be more effectively combatted, not only on Google but on other platforms as well," Macy said.

More information: Sterling Williams-Ceci et al, Misinformation does not reduce trust in accurate search results, but warning banners may backfire, Scientific Reports (2024). DOI: 10.1038/s41598-024-61645-8

Journal information: Scientific Reports

Provided by Cornell University

