Evidence reveals risk assessment algorithms show bias against Hispanic population


Automated risk assessment has become increasingly popular in the criminal justice system, but a new study published in the American Criminal Law Review assessed the accuracy, validity and predictive ability of a risk assessment algorithm and found evidence of algorithmic unfairness against Hispanics.

Risk assessment can be an objective way to reduce rates of imprisonment without jeopardising public safety, and officials are increasingly reliant on algorithmic processing to inform decisions on managing offenders according to their risk profiles. However, there is alarming evidence to suggest that risk algorithms are potentially biased against minority groups.

Reader in Law and Criminal Justice at the University of Surrey Dr. Melissa Hamilton used a large dataset of pre-trial defendants who were scored on COMPAS—a widely-used algorithmic risk assessment tool—soon after their arrests to evaluate the impact of this algorithmic tool specifically on the Hispanic minority group.

Dr. Hamilton said: "There is a misconception that algorithmic risk assessment tools automatically represent a transparent, consistent and logical method for classifying offenders. My research suggests that risk tools can deliver unequal results for minority groups if they fail to consider their cultural differences. Bias occurs when risk tools are normed largely on one group, for example White samples, as they then provide inaccurate predictions for other groups as a result.

"Cumulative evidence showed that COMPAS consistently exhibited unfair and biased algorithmic results for those of Hispanic ethnicity, with statistics showing differential validity and differential predictive ability. The tool fails to accurately predict actual outcomes, overpredicting the risk of reoffending for Hispanic pre-trial defendants."
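To make the idea of differential predictive ability concrete: one common way analysts probe a risk tool for this kind of bias is to compare its error rates across groups, for example the false positive rate, the share of defendants who were flagged high-risk but did not in fact reoffend. The sketch below is purely illustrative and is not the study's actual analysis; the groups, records and numbers are hypothetical.

```python
# Hypothetical illustration of differential predictive ability:
# compare false positive rates across two groups scored by a risk tool.

def false_positive_rate(records):
    """Share of defendants who did NOT reoffend but were scored high-risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["high_risk"]]
    return len(flagged) / len(non_reoffenders)

# Toy records: group label, the tool's high-risk flag, and the observed outcome.
data = [
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": False, "reoffended": False},
    {"group": "A", "high_risk": False, "reoffended": True},
    {"group": "B", "high_risk": True,  "reoffended": False},
    {"group": "B", "high_risk": True,  "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": True},
]

for group in ("A", "B"):
    subset = [r for r in data if r["group"] == group]
    print(group, false_positive_rate(subset))
# In this toy data, group B's non-reoffenders are flagged high-risk
# twice as often as group A's — the pattern of overprediction the
# study describes for Hispanic defendants.
```

A tool that is equally accurate overall can still produce very different error rates like these for subgroups, which is why the study examines group-specific statistics rather than a single aggregate accuracy figure.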

Whilst there have been impressive advances in the behavioural sciences, the availability of big data and statistical modelling, justice officials should be aware that greater care is needed: proper validation studies must be conducted before an algorithmic risk tool is used, to confirm that it is fair for its intended population and subpopulations.


More information: The Biased Algorithm: Evidence of Disparate Impact on Hispanics. American Criminal Law Review: www.law.georgetown.edu/america … impact-on-hispanics/
Citation: Evidence reveals risk assessment algorithms show bias against Hispanic population (2019, August 12) retrieved 15 December 2019 from https://phys.org/news/2019-08-evidence-reveals-algorithms-bias-hispanic.html