Online content moderators likely to experience burnout, study suggests

Credit: Pixabay/CC0 Public Domain

Online communities play an important role in creating a sense of connection between strangers. But what happens when the people moderating our favorite online communities quit?

Why do they quit, and what strategies can companies like Meta and Reddit implement to help prevent burnout?

A study by University of Michigan School of Information researchers, led by doctoral candidate Angela Schöpke-Gonzalez, finds that volunteer content moderators, or VCMs, experience burnout stemming from interpersonal conflict between moderators and from daily exposure to toxic online behavior.

"It's the unpaid labor of volunteer content moderators that make it possible for us, in many cases, to enjoy environments that support our well-being," Schöpke-Gonzalez said. "We browse the internet every day and many people are on , but we often forget that it's people that are responsible for keeping our information ecosystems alive."

The study aims to draw attention to the critical role of VCMs, to explore what causes burnout, and to help companies understand how they can better support VCMs and prevent burnout.

"VCMs experience many of the same psychological distress challenges as crisis hotline volunteer responders, caregivers and volunteer support providers for persons who have experienced violence," Schöpke-Gonzalez said. "Researchers, platforms and moderators can learn from work addressing psychological distress among similar volunteer groups to craft interventions that support VCMs."

Schöpke-Gonzalez's research focuses on how computational social science research processes can avoid perpetuating social harms. She cited the example of an African American man wrongfully accused of stealing watches at a Shinola store in Detroit in 2020.

"What steps can computational social science research take to mitigate the risk of facial recognition algorithms' use in leading to wrongful detentions like that of Michigander Robert Julian-Borchak Williams?" she said.

Schöpke-Gonzalez is working on her dissertation with Libby Hemphill, U-M associate professor of information, who co-authored the study along with doctoral candidate Shubham Atreja and former research assistants Han Na Shin and Najmin Ahmed.

More information: Angela M. Schöpke-Gonzalez et al, Why do volunteer content moderators quit? Burnout, conflict, and harmful behaviors, New Media & Society (2022). DOI: 10.1177/14614448221138529

