The extremist watchers: How a network of researchers is searching for the next hate-fueled attack

One evening in early June, a week after 19 students and two teachers were shot and killed at Robb Elementary School in Uvalde, Texas, extremism researcher Kesa White started seeing media reports of another mass shooting, this time at a medical center in Tulsa, Oklahoma.

One by one, she picked up each of the three phones she uses for research and got to work, scrolling fast: apps, news sites, Twitter and other, more obscure social media platforms. In the minutes after a shooting, she would need to learn as much as she could.

The news of another attack, so soon after the racist attack in Buffalo and then the Uvalde shooting, was shocking, as always. But for White it also crystallized another emotion: dread. This, she feared, could be the one she had been waiting for.

White had spent the last week watching for what extremism researchers call a "copycat" shooting: the possibility that another disturbed young man would try to kill more people than the 18-year-old shooter had in Texas. Her research showed her all the signs: increasing rhetoric and violent chatter on encrypted messaging apps; a blooming of support for the Uvalde shooter in dark spaces online; loose cannons increasingly firing shots across cyberspace.

"It made me think, somebody's going to try to do something similar pretty soon," White said. "I think there's going to be some sort of copycat attack."

Before long, details started to emerge about the Tulsa shooter: His race and name were revealed, and researchers and journalists concluded that the attack had not been ideologically driven. Tragic, but not a copycat.

But White's sense of dread never truly goes away. It is a part of the job for people like her, an informal team of researchers, academics and professional intelligence-gatherers who have taken it upon themselves to monitor the poisonous melting pot of American extremism.

Connected by Twitter and encrypted messaging apps, these researchers scour the nastiest corners of the internet, watching for new trends, tricks and terminology. Some create "sock puppet" accounts to get inside secret chat rooms and eavesdrop on hateful groups. Others pore over data and messages scraped from social media or hacked from extremist groups, searching for clues and the identities of modern racists and bigots.

They are a diverse collection of personalities, but on the whole, they shun publicity, balking at the suggestion their work is exciting or dramatic, and sometimes rejecting the idea that they are "hunting" for extremists. What unites them is the desire to do thankless, often boring, research they hope will shed more light on the country's dark underbelly.

"It's kind of embarrassing, the whole 'Extremism Hunter,' 'Antifa's Secret Weapon' stuff," said Megan Squire, a research fellow with the Southern Poverty Law Center, who was bestowed the "Secret Weapon" moniker in a 2018 WIRED Magazine article. "It feels like people want you to be more amazing or more spy-like or something, when what it actually is is just a lot of deliberative plodding—writing stuff down and being super-organized."

Through this methodical work, this network of extremism-watchers has become an invaluable resource for journalists, law enforcement agencies and the general public. Their sleuthing is responsible for much of what eventually becomes known about mass shooters, extremist groups and other domestic terrorists. Their monitoring occasionally sparks investigations and arrests, and their willingness to put themselves online to face harassment or worse from extremists fills a vital gap in the nation's understanding of a growing threat, said Daryl Johnson, a security consultant and former senior analyst for domestic terrorism at the Department of Homeland Security.

"I'm glad that network exists," Johnson said. "I'm glad it's been expanded. I'm glad there's more and more analysts and resources being brought to bear on this problem, because when we have more people looking at it, you get answers, and then the picture becomes clearer."

How it begins: 'Trying to save the world'

Sara Aniano, a far-right researcher who recently completed a Master's thesis at Monmouth University on Instagram comments in the lead-up to the Jan. 6 insurrection, began focusing on extremism during the COVID-19 pandemic.

Like many people, Aniano was furloughed from her job and found herself with a lot of free time.

"One day I was on the beach, and a friend of mine texted our group chat, talking about how Ellen DeGeneres was on house arrest, and adrenochrome, and 'save the children,' and I was like, 'Hold on, is she joking? Surely she's joking," Aniano said.

Aniano's friend, like millions of Americans, had fallen into the trap of nonsense and disinformation sold by people like Alex Jones and spread on fake news sites like Infowars and by some Fox News anchors. That someone so close to her was spreading QAnon-related conspiracy theories was a wake-up call, she said.

Almost overnight, Aniano said, she started digging into disinformation networks and conspiracy theories on Instagram and other platforms. Before long, she realized she could apply what she was learning to her Master's thesis, and within a year she was in regular contact with other extremism and disinformation researchers, sharing what she knew, searching for new leads, and falling down ever-more complicated online rabbit holes.

"I'm basically always online," Aniano said. "It looks like I'm just on my phone, but I'm actually trying to save the world."

Several researchers who spoke with USA TODAY described similar personal experiences that led them to full-time roles monitoring extremists. For Squire, it was tracking and reporting a local neo-Confederate hate group in her home town in North Carolina.

For White, who now works at the Polarization and Extremism Research Innovation Lab, or PERIL, at American University, it was a confrontation with a racist hate crime on her university campus, where somebody scrawled the initials of a predominantly Black sorority onto bananas and tied them up with string made to look like nooses.

Aniano, who was recently hired by the Anti-Defamation League to continue her work, said she eventually grew apart from her friend, who fell deeper into disinformation. But she and other researchers made the decision to run straight at a phenomenon that was unfolding across America after the election of Donald Trump. As domestic extremist groups flourished and hate crimes spiked in the late twenty-teens, an informal network of individuals determined to understand, monitor and chronicle that movement was also flourishing.

How they do it: Building a puzzle one piece at a time

A typical day for an extremism watcher often involves hours of scrolling through hateful content online.

As extremists have been pushed off mainstream social media sites like Facebook and Twitter for violating their terms of service, these experts constantly readjust their tracking to new platforms, networks and messaging services.

A lot of their time is now spent on the encrypted messaging and social media app Telegram. Often dubbed "Terrorgram" by researchers, the app, founded by a 37-year-old Russian billionaire, has become the go-to communication platform for many extremist groups and conspiracy mongers thanks to its laissez-faire attitude toward extremists.

Researchers monitor groups' public and private "channels" on Telegram, where users cross-post content from different channels into their own, creating a daisy-chain of hate that is trackable across the platform.

White describes her often-monotonous work as trying to piece together a never-ending jigsaw puzzle without knowing what image she's ultimately building. She said she's essentially just watching and learning every day, trying to keep up with the latest hateful language and memes, learning about up-and-coming hate groups and new conspiracy theories.

"I'm just, like, falling into rabbit holes all day, every day," White said. "Because you're always learning something new, and what I learned today is going to be different from what I learn tomorrow, and sometimes something that I learned yesterday is no longer relevant."

Along with the day-to-day monitoring, there are also periods of frantic action.

In the hours after a mass shooting, for example, researchers scramble to learn as much as possible about a suspect before the person's online life disappears.

"You're racing against the clock to collect as much information as you can get before social media companies remove it," White said. "It's going to help paint a story of them, because you always have that one person saying, 'But they were such a nice kid, we didn't see the warning signs.' But you go on their social media account and you see them posing with guns and saying racist things online."

How they specialize: Working together

Several of the researchers have formed sub-groups that focus on a particular topic, group or conspiracy theory. The Q Origins Project, for example, is a small collective of researchers focusing primarily on the early days of the QAnon conspiracy theory and the ensuing community that grew from it.

Another collective, the Accelerationism Research Consortium, focuses on the white supremacist concept of accelerationism: the idea of fomenting a race war and the ensuing dystopia to bring about a race-based new global order.

The researchers are always trying to understand the relationships between these groups, said Alex Mendela, an extremism researcher and member of the Q Origins Project.

"Our work focuses on how QAnon relates to the broader conspiratorial far-right, and the pathways that that individuals could take to more programmatic extremist movements and eventually violence," Mendela said.

Adding to the complexity is the ever-growing network of websites and social media platforms dedicated to hosting extremists. Researchers who spoke with USA TODAY said they monitor accounts on Gab, Gettr, Parler, DLive, Rumble, Cozy and former President Donald Trump's social media site Truth Social, to name just a few.

This process of watching and learning is a big part of an extremism researcher's job. And occasionally that work surfaces real, actionable leads, or what researchers consider "credible threats" of violence in the real world, not just online.

And that's where the watchers sometimes become more than just passive observers.

How they respond: To report or not report?

One day in August 2019, Squire was folding laundry while "flipping through Telegram channels" when she noticed one conversation in a channel that was more stark than the typical flow of hate.

"One White man with a gun walked into two mosques and killed 50 invaders, Another walked into a mall and killed 20. Another walked into a church and ended more," wrote a user named "Anti-Kosmik 2182," who Squire had already identified as Jarrett William Smith, an ex-soldier formerly stationed in Kansas. "Have you not seen the impact of 3 amateur shootings?"

"I thought, 'I'm gonna make a copy of this real quick, because this doesn't seem right and also, this guy is using his real photo as his profile picture. I thought, well, that's kind of unusual," Squire said. "So I made a copy of the chat."

Researchers like Squire sometimes face ethical dilemmas when they come across this sort of information: Should they report these individuals to law enforcement? Or just keep monitoring them until they announce actual plans of violence?

In this case, Squire didn't have long to debate. "About 14 days later, the guy was arrested for plotting to bomb some houses," she said.

Smith was later sentenced to two and a half years in prison for distributing information about building a bomb.

The researchers who spoke to USA TODAY were split on the question of whether they should act as a conduit to the police. Some said they are quick to report open threats as soon as possible. Others said they see their work more as journalism: Watching threats and writing about them, but not contacting the cops directly.

Most full-time extremism researchers are aligned with institutions of higher education, which have their own rules about ethical responsibility and invasion of privacy.

"There are boundaries on what we can and can't do, and I think we follow a very ethical standard," said Matt Kriner, a senior researcher at the Center on Terrorism, Extremism, and Counterterrorism at the Middlebury Institute of International Studies at Monterey. "If it's available—if anybody can get to it—we'll look at it."

Kriner said much of the work he and other researchers do isn't really focused on tracking individuals anyway.

"I think ultimately, everybody wants to say we're trying to stop a shooter from shooting, right?" he said. "It's a romantic notion. It's one that can occur, but it's not typically the one that we're trying to accomplish."

Instead, Kriner said, extremism researchers focus more on movements and trends. Their job is to help the public understand the connections between the Unite The Right rally in Charlottesville in 2017 and the Jan. 6 insurrection, for example, he said.

"What law enforcement does is look at islands based on the critical thresholds that you need for investigations," Kriner said. "What we're doing is we're looking at the broader landscape."

How they survive: Support in a 'never-ending cycle'

The work this network of researchers does can be physically and emotionally draining. And there's always more work than they can possibly do.

"It's completely overwhelming," Squire said. "I could clone myself six, eight times over, and it wouldn't be enough people."

Rather than compete, extremism researchers tend to collaborate, reaching out to one another to swap ideas and share tips. In the wake of a domestic terrorism incident, or after a big leak of hacked data from an extremist group, the network swings into action to try to learn as much as possible.

"A lot of what we've tried to do is put people in contact with one another, to have really granular discussions about 'Why does it matter that this person put X symbol on their gun?' 'What are we seeing across the table?'" Kriner said. "We share tips and tricks, we talk to each other about what works and what doesn't, and why it's important for us to be shifting to this topic area versus that topic area."

But the collaboration isn't just academic, Mendela said. "These guys are my rock," he said. "We often discuss things from our personal life, personal achievements. We hold Dungeons & Dragons games as a good way for us to connect off the clock."

The extremism watchers know they're never going to "win." Their work will never eradicate hate and prejudice. They will never know or understand all the extremists, or even all the extremist groups, in America. But at the end of the day, at least they're doing something.

"It's a never-ending battle and we're pretty much just pawns in their little game," White said. "In terms of end goals, of course, you want world peace and everything like that, but it's just a never-ending cycle."

(c)2022 USA Today

Distributed by Tribune Content Agency, LLC.
