Evidence reveals risk assessment algorithms show bias against Hispanic population


Automated risk assessment has become increasingly popular in the criminal justice system, but a new study published in the American Criminal Law Review, which assessed the accuracy, validity and predictive ability of a widely used risk assessment tool, finds evidence of algorithmic unfairness against Hispanic defendants.

Risk assessment can be an objective way to reduce rates of imprisonment without jeopardising public safety, and officials are increasingly reliant on algorithmic processing to inform decisions on managing offenders according to their risk profiles. However, there is alarming evidence that risk algorithms can be biased against minority groups.

Dr. Melissa Hamilton, Reader in Law and Criminal Justice at the University of Surrey, used a large dataset of pre-trial defendants who were scored on COMPAS, a widely used algorithmic risk assessment tool, soon after their arrests to evaluate the tool's impact on the Hispanic minority group specifically.

Dr. Hamilton said: "There is a misconception that algorithmic risk assessment tools developed using big data automatically represent a transparent, consistent and logical method for classifying offenders. My research suggests that risk tools can deliver unequal results for groups if they fail to consider their cultural differences. Bias occurs when risk tools are normed largely on one group, for example White samples, as they provide inaccurate predictions for other groups as a result.

"Cumulative evidence showed that COMPAS consistently exhibited unfair and biased algorithmic results for those of Hispanic ethnicity, with statistics presenting differential validity and differential predictive ability. The tool fails to accurately predict actual outcomes, subsequently overpredicting the level of risk of reoffending for Hispanic pre-trial defendants."

Whilst there have been impressive advances in the behavioural sciences, the availability of big data and statistical modelling, justice officials should be aware that greater care is needed to ensure proper validation studies are conducted before an algorithmic risk tool is used, confirming that it is fair for its intended population and subpopulations.
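
As a rough sketch of what one part of such a subgroup validation might involve, a common approach is to compute a discrimination metric such as the AUC separately for each subgroup. The Python example below uses entirely synthetic data and made-up group labels; it is an assumption-laden illustration of differential validity, not a substitute for a proper validation study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical subgroup validation sketch with synthetic data; a notably
# lower AUC for one subgroup is a red flag for differential validity.
rng = np.random.default_rng(7)

def auc_by_group(scores, outcomes, groups):
    """Discrimination (AUC) computed separately for each subgroup."""
    return {g: roc_auc_score(outcomes[groups == g], scores[groups == g])
            for g in np.unique(groups)}

n = 2000
groups = rng.choice(["group_a", "group_b"], n)
latent = rng.normal(0.0, 1.0, n)                 # true underlying risk
noise = np.where(groups == "group_b", 1.5, 0.5)  # tool is noisier for B
scores = latent + rng.normal(0.0, 1.0, n) * noise
outcomes = (latent + rng.normal(0.0, 0.5, n) > 0).astype(int)

print(auc_by_group(scores, outcomes, groups))    # group B's AUC is lower
```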



More information: Melissa Hamilton, The Biased Algorithm: Evidence of Disparate Impact on Hispanics, American Criminal Law Review (2019). www.law.georgetown.edu/america … impact-on-hispanics/

User comments

Aug 12, 2019
Algorithms to predict criminal behavior produce "unfair" results for minority groups if they "fail to consider their cultural differences".
In other words, criminality is an inherent part of some minority cultures and so should be overlooked if they engage in it?
It can be said that there can be a "bias" in risk assessment against groups such as Hispanics if a person developing a policy wants to target Hispanics. But there can also be a "bias" if Hispanics are in fact more likely to engage in crime! There is a "bias" against career offenders, especially repeat offenders in areas like rape and murder. Is that necessarily seen as "unfair"? "Bias" is not necessarily a hateful thing!

Aug 13, 2019
@julianpenrod.

First, a HUGE criticism of sentencing algorithms is that they are, for the most part, black boxes. Defense attorneys have been struggling to find out exactly how sentences are calculated with these algorithms and have been unable in many instances to do so.

Second, the algorithms include multiple factors (weighted according to some programmer?), recidivism rates amongst them. But how are those calculated? Does anyone know? Is it one recidivism rate for all ethnic/racial/cultural/hair color groups? Even though dyed-blonde blacks may have a lower recidivism rate than natural-blonde second-generation Swedes? Did the algorithm include the possession-with-intent-to-distribute charge that was dropped against a 17-year-old who just made the pot buy for all of his buddies at one time? Family support is probably considered. Culturally speaking, is the recidivism rate for Latinos lacking family support the same as for a black or white or Chinese person lacking the same?

Don't be ignorant. Read.

Aug 13, 2019
@julianpenrod, again.

You wrote: "Algorithms to predict criminal behavior produce "unfair" results for minority groups if they "fail to consider their cultural differences".
In other words, criminality is an inherent part of some minority cultures and so should be overlooked if they engage in it?"

The opposite side of that coin, you tool, is that by failing to include such "criminality" by culture, race, or whatever, you are also penalizing others, right? So, if the extreme criminality of one culture is lumped into one factor and applied indiscriminately, someone from a culture - if one could be found to exist - where there is no recidivism is punished for the cultural behaviors of others.

Are you starting to see the problem with your simplism?

Aug 13, 2019
and to continue, someone from a culture/environment where recidivism is common benefits from lumping everyone together and gets a lighter sentence, right?

duh.
