
Are search engines bursting the filter bubble? Study finds political ideology plays bigger role than algorithms


Political ideology and user choice—not algorithmic curation—are the biggest drivers of engagement with partisan and unreliable news provided by Google Search, according to a study coauthored by Rutgers faculty and published in the journal Nature.

The study addressed a long-standing concern that digital algorithms learn from user preferences and surface information that largely agrees with users' attitudes and biases. However, search results shown to Democrats differ little in ideology from those shown to Republicans, the researchers found. The ideological differences emerge when people decide which search results to click, or which websites to visit on their own.

Results suggest the same is true about the proportion of low-quality content shown to users. The quantity doesn't differ considerably among partisans, though some groups—particularly older participants who identify as "strong Republicans"—are more likely to engage with it.

"Google's algorithms do sometimes generate results that are polarizing and potentially dangerous," said Katherine Ognyanova, an associate professor of communication at the Rutgers School of Communication and Information and a co-author of the study. "But what our findings suggest is that Google is surfacing this content evenly among users with different political views. To the extent that people are engaging with those websites, that's based largely on personal political outlook."

Despite the crucial role algorithms play in the news people consume, few studies have focused on search engines—and even fewer have compared exposure (defined as the links users see in search results), follows (the links from search results people choose to visit), and engagement (all the websites a user visits while browsing the web).

Part of the challenge has been measuring user activity. Tracking website visits requires access to people's computers, and researchers have generally relied on more theoretical approaches to speculate how algorithms affect polarization or push people into "filter bubbles" and "echo chambers" of political extremes.

To address these knowledge gaps, researchers at Rutgers, Stanford and Northeastern universities conducted a two-wave study, pairing survey responses with browsing data collected from a custom-built browser extension to measure exposure to, and engagement with, online content during the 2018 and 2020 U.S. elections.

Researchers recruited 1,021 participants to voluntarily install the browser extension for Chrome and Firefox. The software recorded the URLs of Google Search results, as well as Google and browser histories, giving researchers precise information on the content users were engaging with, and for how long.

Participants also completed a survey and self-reported their political identification on a seven-point scale that ranged from "strong Democrat" to "strong Republican."
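To make the distinction between exposure and engagement concrete, the following is a minimal, hypothetical sketch (not the authors' code or data). It assumes made-up extension logs in which each record carries a per-domain partisanship score, plus made-up survey responses on the seven-point party-identification scale described above, and then compares the average partisanship of links shown in search results with that of sites actually visited, separately for Democrats and Republicans.

# Hypothetical illustration only: compares the partisanship of links a user was
# shown ("exposure") with the sites they actually visited ("engagement").
from statistics import mean

# Made-up log entries: participant id, event type, and a partisanship score
# for the linked domain (-1 = left-leaning, +1 = right-leaning).
logs = [
    {"user": 1, "event": "exposure",   "partisanship": -0.2},
    {"user": 1, "event": "engagement", "partisanship": -0.7},
    {"user": 2, "event": "exposure",   "partisanship":  0.1},
    {"user": 2, "event": "engagement", "partisanship":  0.8},
]

# Made-up survey responses on the seven-point scale
# (1 = "strong Democrat" ... 7 = "strong Republican").
party_id = {1: 2, 2: 6}

def mean_partisanship(event_type, users):
    """Average domain partisanship over all logged events of one type."""
    scores = [r["partisanship"] for r in logs
              if r["event"] == event_type and r["user"] in users]
    return mean(scores) if scores else float("nan")

# Group participants by party identification and compare exposure vs. engagement.
dems = {u for u, p in party_id.items() if p <= 3}
reps = {u for u, p in party_id.items() if p >= 5}
for label, group in [("Democrats", dems), ("Republicans", reps)]:
    print(label,
          "exposure:", round(mean_partisanship("exposure", group), 2),
          "engagement:", round(mean_partisanship("engagement", group), 2))

In this toy example, exposure scores for the two groups sit close together while engagement scores diverge by party, which mirrors the pattern the study reports: similar search results, different choices about what to click.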

Results from both study waves showed that a participant's political identification did little to influence the amount of partisan and unreliable news they were exposed to on Google Search. By contrast, there was a clear relationship between political identification and engagement with polarizing content.

Platforms such as Google, Facebook and Twitter are technological black boxes: Researchers know what information goes in and can measure what comes out, but the algorithms that curate results are proprietary and rarely receive public scrutiny. Because of this, many blame the technology of these platforms for creating echo chambers and filter bubbles by systematically exposing users to content that conforms to and reinforces personal beliefs.

Ognyanova suggests the findings paint a more nuanced picture of search behavior.

"This doesn't let platforms like Google off the hook," she said. "They're still showing people information that's partisan and unreliable. But our study underscores that it is content consumers who are in the driver's seat."

More information: Ronald E. Robertson et al, Users choose to engage with more partisan news than they are exposed to on Google Search, Nature (2023). DOI: 10.1038/s41586-023-06078-5. www.nature.com/articles/s41586-023-06078-5

Journal information: Nature

Provided by Rutgers University

Citation: Are search engines bursting the filter bubble? Study finds political ideology plays bigger role than algorithms (2023, May 24) retrieved 23 April 2024 from https://phys.org/news/2023-05-filter-political-ideology-plays-bigger.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
