
An avatar will never lie, or will it? Scientists investigate how often we change our minds in virtual environments

Credit: Pixabay/CC0 Public Domain

How confident are you in your judgments and how well can you defend your opinions? Chances are that they will change under the influence of a group of avatars in a virtual environment. Scientists from SWPS University have investigated the human tendency to be influenced by the opinions of others, including virtual characters.

"We usually conform to the views of others for two reasons. First, we succumb to group pressure and want to gain . Second, we lack sufficient knowledge and perceive the group as a source of a better interpretation of the current situation," explains Dr. Konrad Bocian from the Institute of Psychology at SWPS University.

So far, only a few studies have investigated whether moral judgments, or evaluations of another person's behavior in a given situation, are subject to group pressure. This issue was examined by scientists from SWPS University in collaboration with researchers from the University of Sussex and the University of Kent. The scientists also investigated how views about the behavior of others changed under the influence of pressure in a virtual environment. A paper on this topic is published in PLOS ONE.

"Today, is increasingly as potent in the as in the . Therefore, it is necessary to determine how our judgments are shaped in the digital reality, where interactions take place online and some participants are avatars, not real humans," points out Dr. Bocian.

Do others know better?

In the first study, the researchers tested the extent to which participants (103 people in total) would change their private moral judgments to conform with the judgments of others. First, participants independently judged specific behaviors, such as a woman punishing her child for getting bad grades in school or a man answering the phone and talking loudly in a cinema. Then, participants judged the same behaviors in groups with three other people who responded in a completely different way than the participant did in the first part of the study.

"Participants adjusted their opinions to conform with others in 43% cases. However, they did it less often when the judgments concerned situations in which other people were harmed," says Dr. Bocian.

Under pressure from avatars

The second study repeated the experiment with 138 participants in a virtual environment. Each participant first judged the behavior of other people in a given situation, and then, after putting on a VR headset, did it again in the presence of three avatars.

Some of the avatars were said to be controlled by humans; the remaining avatars were controlled by AI. In the latter case, participants were told that the Kent School of Engineering and Digital Arts wanted to test its three new algorithms, which were implemented in the virtual avatars.

"It turned out that participants changed their judgments to align them with judgments of human-controlled avatars in 30% cases, and in 26 percent cases when avatars were controlled by AI. The results suggest that judgments about moral , like other judgments we make, are subject to pressure from both real and virtual groups," says Dr. Bocian.

The researchers emphasize that further research is needed to determine the extent to which groups can influence the judgments of others in a digital setting, and in particular to understand the social consequences of such influence in the era of rapidly growing digital communication, which may soon move into various metaverses.

"Group to influence private moral judgments of individuals in a virtual world can be used for both good and malicious purposes. This is why understanding the mechanisms of this influence is so important. Only with in-depth knowledge can we increase the awareness of virtual world participants about the influence that others can have on them," the researcher concludes.

More information: Konrad Bocian et al, Moral conformity in a digital world: Human and nonhuman agents as a source of social pressure for judgments of moral character, PLOS ONE (2024). DOI: 10.1371/journal.pone.0298293

Provided by SWPS University

