Predictive policing is tainted by 'dirty data,' study finds

Law enforcement has come under scrutiny in recent years for practices that result in disproportionate aggression toward minority suspects, leading some to ask whether technology – specifically, predictive policing software – might reduce discriminatory policing.

However, a new study from New York University School of Law and NYU's AI Now Institute concludes that predictive policing systems, in fact, run the risk of exacerbating discrimination in the criminal justice system if they rely on "dirty data" – data created from flawed, racially biased, and sometimes unlawful practices.

The researchers illustrate this phenomenon with case study data from Chicago, New Orleans, and Arizona's Maricopa County. Their paper, "Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice," is available on SSRN.

"We chose these sites because we found an overlap between extensively documented evidence of corrupt or unlawful police practices and significant interest, development, and current or prior use of predictive policing systems. This led us to examine the risks that one would influence the other," explains Jason Schultz, a professor of clinical law and one of the paper's co-authors.

The authors, who include Rashida Richardson, director of policy research at the AI Now Institute, and Kate Crawford, co-director of the AI Now Institute, identified 13 jurisdictions – including the three case-study sites above – that have documented instances of unlawful or biased police practices and that explored or deployed predictive policing systems during the periods of unlawful activity.

The Chicago Police Department, for example, was under federal investigation for unlawful police practices when it implemented a computerized system that identifies people at risk of becoming a victim or offender in a shooting or homicide. The study found that the residents the Department of Justice had identified as targets of Chicago's biased policing overlapped with those flagged by the predictive system.
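The mechanism behind that kind of overlap is straightforward to illustrate. The toy Python simulation below is purely hypothetical – it is not drawn from the study or from any real system, and the districts, incident rates, and recording probabilities are invented. It shows how a naive hot-spot predictor that allocates patrols in proportion to recorded incidents reproduces recording bias: two districts with identical underlying incident rates receive very different forecasts simply because incidents in one district enter the database more often.

import random

random.seed(42)

TRUE_RATE = 0.1            # identical underlying incident rate in both districts
DETECTION = [0.9, 0.3]     # biased recording: incidents in district 0 are far
                           # more likely to end up in the database
recorded = [0, 0]          # the "dirty" historical record

for day in range(1000):
    for district in range(2):
        if random.random() < TRUE_RATE:            # an incident occurs...
            if random.random() < DETECTION[district]:
                recorded[district] += 1            # ...and may be recorded

# A naive hot-spot "predictor": patrol shares proportional to recorded incidents.
total = sum(recorded)
for district in range(2):
    share = recorded[district] / total
    print(f"district {district}: recorded={recorded[district]}, "
          f"predicted patrol share={share:.0%}")

Despite equal true rates, district 0 dominates the forecast, and the extra patrols sent there would record still more incidents, reinforcing the loop on the next round of training data.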

Other examples showed significant risks of similar overlap, but because government use of predictive policing systems is often secret and shielded from public oversight, the full extent of those risks remains unknown, according to the study.

"In jurisdictions that have well-established histories of corrupt police practices, there is a substantial risk that data generated from such practices could corrupt predictive computational systems. In such circumstances, robust public oversight and accountability are essential," Schultz said.

Lead author Richardson added, "Even though this study was limited to jurisdictions with well-established histories of police misconduct and discriminatory practices, we know that these concerns about policing practices and policies are not limited to these jurisdictions, so greater scrutiny regarding the data used in predictive policing technologies is necessary globally."

More information: Rashida Richardson et al. Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice (February 13, 2019). New York University Law Review Online, Forthcoming. Available at SSRN: ssrn.com/abstract=3333423
