
People are less morally outraged when gender discrimination occurs because of an algorithm rather than direct human involvement, according to research published by the American Psychological Association.

In the study, researchers coined the phrase "algorithmic outrage deficit" to describe their findings from eight experiments conducted with a total of more than 3,900 participants from the United States, Canada and Norway.

When presented with various scenarios about gender discrimination in hiring decisions caused by algorithms and humans, participants were less morally outraged about those caused by algorithms. Participants also believed companies were less legally liable for discrimination when it was due to an algorithm.

"It's concerning that companies could use algorithms to shield themselves from blame and public scrutiny over discriminatory practices," said lead researcher Yochanan Bigman, Ph.D., a post-doctoral research fellow at Yale University and incoming assistant professor at Hebrew University. The findings could have broader implications and affect efforts to combat discrimination, Bigman said. The research was published online in the Journal of Experimental Psychology: General.

"People see humans who discriminate as motivated by prejudice, such as racism or sexism, but they see algorithms that discriminate as motivated by data, so they are less morally outraged," Bigman said. "Moral outrage is an important societal mechanism to motivate people to address injustices. If people are less morally outraged about discrimination, then they might be less motivated to do something about it."

Some of the experiments used a scenario based on a real-life example of alleged algorithm-based gender discrimination by Amazon that penalized female job applicants. While the research focused on gender discrimination, one of the eight experiments was replicated to examine racial and age discrimination and had similar findings.

Knowledge about artificial intelligence didn't appear to make a difference. In one experiment with more than 150 tech workers in Norway, participants who reported greater knowledge about artificial intelligence were still less outraged by discrimination caused by algorithms.

When people learn more about a specific algorithm, it may affect their outlook, the researchers found. In another study, participants were more outraged when a hiring algorithm that caused gender discrimination was created by male programmers at a company known for sexist practices.

Programmers should be aware of the possibility of unintended discrimination when designing new algorithms, Bigman said. Public education campaigns also could stress that discrimination caused by algorithms may be a result of existing inequities, he said.

More information: Algorithmic Discrimination Causes Less Moral Outrage Than Human Discrimination, Journal of Experimental Psychology: General (2022).
