
How people get sucked into misinformation rabbit holes, and how to get them out

[Image: an iPhone at night. Credit: Unsplash/CC0 Public Domain]

As misinformation and radicalization rise, it's tempting to look for something to blame: the internet, social media personalities, sensationalized political campaigns, religion, or conspiracy theories. And once we've settled on a cause, solutions usually follow: do more fact-checking, regulate advertising, ban YouTubers deemed to have "gone too far."

However, if these strategies were the whole answer, we should already be seeing a decrease in people being drawn into fringe communities and beliefs, and less misinformation in the online environment. We're not.

In new research published in the Journal of Sociology, we and our colleagues found radicalization is a process of increasingly intense stages, and only a small number of people progress to the point where they commit violent acts.

Our work shows the misinformation radicalization process is a pathway driven by emotional needs rather than the information itself—and this understanding may be a first step in finding solutions.

A feeling of control

We analyzed dozens of public statements from newspapers and online in which former radicalized people described their experiences. We identified different levels of intensity in misinformation and its online communities, associated with common recurring behaviors.

In the early stages, we found people either encountered misinformation about an anxiety-inducing topic through algorithms or friends, or they went looking for an explanation for something that gave them a "bad feeling."

Regardless, they often reported finding the same things: a new sense of certainty, a new community they could talk to, and feeling they had regained some control of their lives.

Once people reached the middle stages of our proposed radicalization pathway, we considered them to be invested in the new community, its goals, and its values.

Growing intensity

It was during these more intense stages that people began to report more negative impacts on their own lives. This could include the loss of friends and family, health issues caused by too much time spent on screens and too little sleep, and feelings of stress and paranoia. To soothe these pains, they turned again to their fringe communities for support.

Most people in our dataset didn't progress past these middle stages. However, their continued activity in these spaces kept the misinformation ecosystem alive.

When people did move further and reach the extreme final stages in our model, they were doing active harm.

In their recounting of their experiences at these high levels of intensity, individuals spoke of choosing to break ties with loved ones, participating in public acts of disruption and, in some cases, engaging in violence against other people in the name of their cause.

Once people reached this stage, it took very strong interventions to get them out of it. The challenge, then, is how to intervene safely and effectively when people are in the earlier stages of being drawn into a fringe community.

Respond with empathy, not shame

We have a few suggestions. For people who are still in the earlier stages, friends and trusted advisers, like a doctor or a nurse, can have a big impact by simply responding with empathy.

If a loved one starts voicing possible fringe views, like a fear of vaccines, or animosity against women or other marginalized groups, a calm response that seeks to understand the person's underlying concern can go a long way.

The worst response is one that might leave them feeling ashamed or upset. It may drive them back to their fringe community and accelerate their radicalization.

Even if the person's views intensify, maintaining your connection with them can turn you into a lifeline that will see them get out sooner rather than later.

Once people reached the middle stages, we found third-party online content—produced not by governments but by regular users—could reach people without backfiring. Considering that many people in our research sample had their radicalization instigated by platform algorithms, we also suggest the private companies behind such platforms should be held responsible for the effects of their automated tools on society.

By the middle stages, arguments on the basis of logic or fact are ineffective. It doesn't matter whether they are delivered by a friend, a news anchor, or a platform-affiliated fact-checking tool.

At the most extreme final stages, we found that only heavy-handed interventions worked, such as family members forcibly hospitalizing their radicalized relative, or individuals undergoing government-supported deradicalization programs.

How not to be radicalized

After all this, you might be wondering: how do you protect yourself from being radicalized?

As much of society becomes more dependent on digital technologies, we're going to get exposed to even more misinformation, and our world is likely going to get smaller through online echo chambers.

One strategy is to foster your critical thinking skills by reading long-form texts such as printed books.

Another is to protect yourself from the emotional manipulation of platform algorithms by limiting your social media use to small, infrequent, purposefully directed pockets of time.

And a third is to sustain connections with other humans, and lead a more analog life—which has other benefits as well.

So in short: log off, read a book, and spend time with people you care about.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: How people get sucked into misinformation rabbit holes, and how to get them out (2024, February 23) retrieved 2 May 2024 from https://phys.org/news/2024-02-people-misinformation-rabbit-holes.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
