Mobile dating apps that allow users to filter their searches by race – or rely on algorithms that pair up people of the same race – reinforce racial divisions and biases, according to a new paper by Cornell researchers.
As more and more relationships begin online, dating and hookup apps should discourage discrimination by offering users categories other than race and ethnicity to describe themselves, posting inclusive community messages, and writing algorithms that don't discriminate, the authors said.
"Serendipity is lost when people are able to filter other people out," said Jevan Hutson '16, M.P.S. '17, lead author of "Debiasing Desire: Addressing Bias and Discrimination on Intimate Platforms," co-written with Jessie G. Taft '12, M.P.S. '18, a research coordinator at Cornell Tech, and Solon Barocas and Karen Levy, assistant professors of information science. "Dating platforms have the opportunity to disrupt particular social structures, but you lose those benefits when you have design features that allow you to remove people who are different than you."
The paper, which the authors will present at the ACM Conference on Computer-Supported Cooperative Work and Social Computing on Nov. 6, cites existing research on discrimination in dating apps to show how simple design decisions could decrease bias against people of all marginalized groups, including disabled or transgender people. Although partner preferences are extremely personal, the authors argue that culture shapes our preferences, and dating apps influence our decisions.
"It's really an unprecedented time for dating and meeting online. More people are using these apps, and they're critical infrastructures that don't get a lot of attention when it comes to bias and discrimination," said Hutson, now a student at the University of Washington School of Law. "Intimacy is very private, and rightly so, but our private lives have impacts on larger socioeconomic patterns that are systemic."
Fifteen percent of Americans report using dating sites, and some research estimates that a third of marriages – and 60 percent of same-sex relationships – started online. Tinder and Grindr have tens of millions of users, and Tinder says it has facilitated 20 billion connections since its launch.
Research shows racial inequities in online dating are widespread. For example, black men and women are 10 times more likely to message whites than white people are to message black people. Letting users search, sort and filter potential partners by race not only allows people to easily act on discriminatory preferences, it stops them from connecting with partners they may not have realized they'd like.
Apps may also create biases. The paper cites research showing that men who used the platforms heavily viewed multiculturalism less favorably, and sexual racism as more acceptable.
Users who get messages from people of other races are more likely to engage in interracial exchanges than they would have otherwise. This suggests that designing platforms to make it easier for people of different races to meet could overcome biases, the authors said.
The Japan-based gay hookup app 9Monsters groups users into nine categories of fictional monsters, "which may help users look past other forms of difference, such as race, ethnicity and ability," the paper says. Other apps use filters based on characteristics like political views, relationship history and education, rather than race.
"There's definitely a lot of room to come up with different ways for people to learn about each other," Hutson said.
Algorithms can introduce discrimination, intentionally or not. In 2016, a BuzzFeed reporter found that the dating app CoffeeMeetsBagel showed users only potential partners of their own race, even when the users said they had no preference. An experiment run by OkCupid, in which users were told they were "highly compatible" with people the algorithm actually considered bad matches, found that users were more likely to have successful interactions when told they were compatible – indicating the strong power of suggestion.
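To make the pattern concrete, here is a minimal, hypothetical sketch (not CoffeeMeetsBagel's actual code, whose internals are not public) of how a recommender can silently encode racial filtering: a default that falls back to the user's own ethnicity when no preference is set produces exactly the behavior the BuzzFeed report described. All names and fields below are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Profile:
    name: str
    ethnicity: str
    preference: Optional[str]  # None means the user stated "no preference"

def biased_matches(user: Profile, candidates: List[Profile]) -> List[Profile]:
    """A subtle bug: 'no preference' silently falls back to same-ethnicity."""
    target = user.preference or user.ethnicity  # the biased default
    return [c for c in candidates if c.ethnicity == target]

def debiased_matches(user: Profile, candidates: List[Profile]) -> List[Profile]:
    """Honors 'no preference' literally: every candidate is eligible."""
    if user.preference is None:
        return list(candidates)
    return [c for c in candidates if c.ethnicity == user.preference]

user = Profile("A", "white", preference=None)
candidates = [Profile("B", "white", None), Profile("C", "black", None)]
print([c.name for c in biased_matches(user, candidates)])    # only "B"
print([c.name for c in debiased_matches(user, candidates)])  # "B" and "C"
```

The point of the sketch is that the discriminatory outcome needs no explicit intent: one innocuous-looking default in the fallback logic is enough, which is why the authors argue design decisions deserve scrutiny.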
Beyond rethinking how searches are conducted, platforms could decrease bias against users from any marginalized group by posting policies or messages that encourage a more inclusive environment, or by explicitly prohibiting certain language. For example, Grindr published an article titled "14 Messages Trans People Want You to Stop Sending on Dating Apps" on its media site, and the gay dating app Hornet bars users from referring to race or racial preferences in their profiles.
Changes like these could have a big impact on society, the authors said, as the popularity of dating apps continues to grow and fewer relationships begin in places like bars, neighborhoods and workplaces. Yet while physical spaces are subject to laws against discrimination, online apps are not.
"A random bar in North Dakota with 10 customers a day is subject to more civil rights directives than a platform that has 9 million people visiting every day," Hutson said. "That's an imbalance that doesn't make sense."
Still, the authors said, courts and legislatures have shown reluctance to get involved in intimate relationships, and it's unlikely these apps will be regulated anytime soon.
"Given that these platforms are becoming increasingly aware of the impact they have on racial discrimination, we think it's not a big stretch for them to take a more justice-oriented approach in their own design," Taft said. "We're trying to raise awareness that this is something designers, and people in general, should be thinking more about."
Debiasing Desire: Addressing Bias and Discrimination on Intimate Platforms. Proc. ACM Hum.-Comput. Interact. DOI: 10.1145/3274342