Setting a precedent in the use of artificial intelligence

Professor Dan Hunter says it’s time to look at machine learning to identify bias in criminal sentencing. Credit: Swinburne University of Technology

Criminal sentencing could be fairer with the help of machine learning, according to Professor Dan Hunter. Hunter, Foundation Dean of Swinburne Law School, observed that sentencing generates a vast store of data and is an expensive process for both individuals and the justice system, making it a prime candidate for a technological upgrade.

Artificial intelligence (AI) could also use the enormous volume of data available on decisions to identify bias and give guidance, he said.

In 2017, Hunter co-authored a paper on using AI technology in sentencing for the Criminal Law Journal with Swinburne colleague, Professor Mirko Bagaric, and Dr. Nigel Stobbs from the Queensland University of Technology.

The authors pointed out that sentencing decisions are influenced by more than 200 considerations. While judges and magistrates are reluctant to acknowledge it, decisions may be influenced by factors such as race and socio-economic status.

Inconsistency in sentencing also erodes trust in the system. The authors cite a study of 71,000 offences suggesting that one Victorian court was three times more likely than other courts in the same jurisdiction to send offenders to prison for the same offence.
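
As a purely illustrative sketch of how that kind of disparity might be surfaced from raw records, the snippet below computes per-court imprisonment rates for the same offence and flags courts that depart sharply from the median. The table, column names and threshold are hypothetical and are not drawn from the study cited above.

```python
# Illustrative only: screening hypothetical sentencing records for
# court-level disparities. The data and column names are invented.
import pandas as pd

# Hypothetical records: one row per finalised offence.
records = pd.DataFrame({
    "court":      ["A", "A", "A", "B", "B", "B", "B", "C", "C", "C"],
    "offence":    ["theft"] * 10,
    "imprisoned": [1, 1, 0, 0, 0, 1, 0, 0, 0, 0],
})

# Imprisonment rate per court for the same offence.
rates = (records.groupby(["offence", "court"])["imprisoned"]
                .mean()
                .rename("imprisonment_rate")
                .reset_index())

# Flag courts whose rate is more than twice the median for that offence
# (the factor 2 is an arbitrary illustrative threshold).
median_rate = rates.groupby("offence")["imprisonment_rate"].transform("median")
rates["flagged"] = rates["imprisonment_rate"] > 2 * median_rate

print(rates)
```

On real data, the same grouping could be broken down further, by offence subtype, offender history or year, to help separate legitimate variation from bias.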

"In things like bail decisions and sentencing decisions, here in Australia, particularly, we haven't come to grips with the fallibility of human decision-making," Hunter said.

"AI might suggest, 'This particular offence looks a lot like these five others that other courts have seen, why are you sentencing the offender so differently from those?'"

"Or, it could question why a non-custodial sentence is given for crime that has always previously been punished with incarceration.

"One of the huge benefits of using data-driven machine learning for criminal justice is to start unpacking those biases and making it clear that they exist."

New machine learning algorithms mean that AI can be taught to produce new answers by learning from existing data. In fact, AI programs are already hard at work in the legal system, with platforms including Neota Logic, Kira and RAVN helping to streamline everything from compliance advice to contract review in large due-diligence projects.
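
To make the "comparable cases" idea concrete, here is a minimal, hypothetical sketch of the retrieval step it implies: encode past matters as feature vectors and look up the most similar historical sentences for a new one. The features, figures and use of scikit-learn are assumptions for illustration only, and none of the platforms named above necessarily works this way.

```python
# Toy sketch of the "find comparable cases" idea: retrieve the most
# similar past matters and compare their sentences. All numbers are invented.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical features: [offence seriousness, prior convictions, harm score]
past_cases = np.array([
    [3.0, 0, 2.0],
    [3.0, 1, 2.5],
    [7.0, 4, 8.0],
    [3.5, 0, 2.0],
    [6.5, 3, 7.5],
])
past_sentences_months = np.array([6, 9, 48, 6, 42])

# Index the historical cases and look up the five most similar ones.
index = NearestNeighbors(n_neighbors=5).fit(past_cases)
new_case = np.array([[3.2, 0, 2.2]])
distances, neighbours = index.kneighbors(new_case)

comparable = past_sentences_months[neighbours[0]]
print("Comparable sentences (months):", comparable)
print("Median of comparables:", np.median(comparable))
```

In a decision-support setting, the point of such a lookup is not to dictate a sentence but to prompt the question Hunter describes: why does this matter depart from its closest comparators?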

AI and machine learning will probably start in decision-support roles only, Hunter said, although there's no reason to think this won't eventually move into automated sentencing.

The sanctions imposed for more than 90 per cent of criminal offences currently do not have any judicial involvement. Most criminal matters are finalised by way of infringement notice.

It's likely that there will be some backlash against the idea of humans being sentenced by machines, said Hunter, who has been exploring the use of AI in law for more than 20 years. However, he noted that society has readily accommodated many similar technologies. Speed cameras, for example, are a form of automated sentencing that is now widely accepted.

