AI-powered hiring process perceived as more fair when blind to race or gender

Credit: Pixabay/CC0 Public Domain

Job applicants can be suspicious of the hiring process if a company uses artificial intelligence to pre-screen candidates and facilitate hiring decisions, a Northeastern University expert says, but their perception improves when they learn that an algorithm is "blind" to such characteristics as gender, race or age.

A group of researchers, including Yakov Bart, a professor of marketing at Northeastern, conducted a behavioral experiment to see how people's perception of fairness changes depending on what they are told about the algorithm used in the hiring process.

"Our findings indicate that people perceive hiring algorithms as procedurally fairest when companies adopt a 'fairness through unawareness' approach to mitigating bias," Bart says. "They are also likely to view companies who use this approach more positively and are more motivated to apply for open positions."

AI algorithms have enabled companies to automate some aspects of the hiring process, Bart says. They have become a part of the hiring decision-making that affects individuals and their lives.

Job applicants and related stakeholders usually feel great concern about potential bias and unfairness in the hiring process, he says, especially when it involves AI. Prospective job candidates are hesitant to engage with technologies like AI in their everyday lives unless they perceive the algorithms as behaving fairly.

This subjective perception of fairness is very important, Bart says, because candidates' opinions on this can negatively affect a company's reputation and ability to attract the best talent, even if it uses objectively fair hiring algorithms.

"At the end of the day, if something is objectively true, but people perceive it differently, they behave based on how they perceive things, not based on how things actually are," Bart says.

Bart and his co-authors, Lily Morse from the University of Denver and Mike Teodorescu from the University of Washington, tested three scenarios with different algorithmic fairness conditions.

First, the participants were told that the algorithm is designed to follow fairness-through-unawareness rules.

"We say that the algorithms do not consider applicants for employment based upon race, gender, age and other characteristics protected by law," Bart says. "So they're 'blind,' and we explained that by remaining blind, the algorithms ensure that applicants are treated equally."

In the second scenario, candidates were told that the algorithm is based on demographic parity, or equality of outcomes. The algorithm continually screens for potential disparities based on protected characteristics to ensure equal outcomes. For example, the algorithm ensures that all applicants have similar rates of being selected for an interview regardless of their gender.
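As a rough sketch (not drawn from the study), demographic parity can be monitored by comparing per-group selection rates. The data are hypothetical, and the 0.8 tolerance below is one illustrative convention, echoing the EEOC "four-fifths" rule of thumb:

```python
import pandas as pd

# Hypothetical screening outcomes; columns are illustrative only.
results = pd.DataFrame({
    "gender":   ["F", "F", "F", "M", "M", "M"],
    "selected": [1,    0,   0,   1,   1,   0],
})

# Demographic parity: selection rates should be (near-)equal across groups,
# i.e. P(selected | group A) ~= P(selected | group B).
rates = results.groupby("gender")["selected"].mean()
print(rates)

# Ratio of the lowest to the highest selection rate; flag large disparities.
disparity = rates.min() / rates.max()
print(f"disparity ratio: {disparity:.2f}", "OK" if disparity >= 0.8 else "FLAG")
```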

"Fairness is in the eyes of the beholder. So, the company may think that implementing demographic parity is fair or equal," Bart says. "But an average job , they may disagree with it."

The third scenario that the researchers tested was based on the idea of equality of opportunity. Under equal opportunity, the algorithm ensures that applicants who are equally qualified for the position have the same chances of being selected for an interview regardless of their gender or other protected individual characteristics.
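A minimal sketch of this equal-opportunity check, under hypothetical data: among applicants deemed qualified (a ground-truth label the sketch assumes exists), interview-selection rates should match across groups.

```python
import pandas as pd

# Hypothetical data: "qualified" stands in for some ground-truth measure of
# fitness for the role; it and the other columns are illustrative only.
results = pd.DataFrame({
    "gender":    ["F", "F", "F", "M", "M", "M"],
    "qualified": [1,    1,   0,   1,   1,   0],
    "selected":  [1,    0,   0,   1,   1,   0],
})

# Equality of opportunity: among qualified applicants, selection rates should
# be equal across groups, i.e.
# P(selected | qualified, group A) ~= P(selected | qualified, group B).
qualified = results[results["qualified"] == 1]
tpr = qualified.groupby("gender")["selected"].mean()  # per-group selection rate
print(tpr)
print(f"equal-opportunity gap: {abs(tpr.max() - tpr.min()):.2f}")
```

Unlike demographic parity, this criterion conditions on qualification, so groups with different qualification rates can still have different overall selection rates.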

The fairness-through-unawareness approach produced the most positive result. The other scenarios were either viewed unfavorably or did not change participants' opinions compared with the control scenario.

"When we split the results across gender categories, women and non-binary-identifying individuals show the most positive effect and are driving most of the effects," Bart says.

For males, the effect was insignificant, he says, although the researchers have not yet identified why; they hope to shed more light on this in future studies.

Bart's recommendation for companies is to consider adopting this fairness-through-unawareness approach.

"What companies may want to do is to take note of this pattern that will show that the highest potential to attract people toward the company is to use fairness-through-unawareness explanation in their hiring algorithms," Bart says. "And, of course, it's not just an explanation, but what the company should be consistent with."

More information: Paper: www.brookings.edu/articles/per … procedural-fairness/

This story is republished courtesy of Northeastern Global News news.northeastern.edu.

