Should we use AI in law courts?

A Melbourne law expert says it’s time to look at machine learning to identify bias in criminal sentencing.

Criminal sentencing could be fairer with the help of machine learning, according to Professor Dan Hunter, from Swinburne University’s Law School.

Hunter observed that sentencing generates a vast store of data and is expensive for both individuals and the system, making it a prime candidate for a technological upgrade.

Artificial intelligence (AI) could also use the enormous volume of data available on sentencing decisions to identify bias and give guidance, he said.

Dan Hunter from Swinburne University’s Law School.

Human sentencing leads to inconsistency

In 2017, Hunter co-authored a paper on using AI technology in sentencing for the Criminal Law Journal with Swinburne colleague, Professor Mirko Bagaric, and Dr Nigel Stobbs from the Queensland University of Technology.

The authors pointed out that sentencing decisions are influenced by more than 200 considerations. While judges and magistrates are reluctant to acknowledge it, decisions may be influenced by factors such as skin colour and socio-economic status.

Inconsistency in sentencing also erodes trust in the system. The authors cite a study of 71,000 offences suggesting one Victorian court was three times more likely than other courts in the same jurisdiction to imprison offenders for the same offence.

“In things like bail decisions and sentencing decisions, here in Australia, particularly, we haven’t come to grips with the fallibility of human decision-making,” Hunter said.

A technological check

“AI might suggest, ‘This particular offence looks a lot like these five others that other courts have seen, why are you sentencing the offender so differently from those?’”

“Or, it could question why a non-custodial sentence is given for a crime that has always previously been punished with incarceration.

“One of the huge benefits of using data-driven machine learning for criminal justice is to start unpacking those biases and making it clear that they exist.”

New machine learning algorithms mean that AI can be taught to produce answers for new cases by learning from existing data. In fact, AI programs are already hard at work in the legal system, with platforms including Neota Logic, Kira and RAVN helping to streamline everything from compliance advice to contract review in large due-diligence projects.
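To make the idea concrete, here is a minimal sketch in Python of the kind of consistency check Hunter describes, using scikit-learn’s nearest-neighbour search. The feature columns, figures and flagging threshold are all hypothetical illustrations, not drawn from any system mentioned above.

```python
# Minimal sketch of the "this offence looks like these five others" check.
# All features, data and the threshold are hypothetical illustrations.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical historical cases: (offence severity score, prior
# convictions, offender age), with the sentence imposed, in months.
past_features = np.array([
    [3.0, 1, 34], [3.2, 0, 29], [2.8, 2, 41],
    [3.1, 1, 37], [7.5, 4, 25], [2.9, 0, 33],
])
past_sentences = np.array([6, 4, 9, 6, 36, 5])  # months

model = NearestNeighbors(n_neighbors=5).fit(past_features)

def consistency_check(case, proposed_months, threshold=2.0):
    """Flag the proposal if it sits more than `threshold` standard
    deviations from the sentences in the 5 most similar past cases."""
    _, idx = model.kneighbors([case])
    similar = past_sentences[idx[0]]
    mean, std = similar.mean(), similar.std()
    if std > 0 and abs(proposed_months - mean) > threshold * std:
        return f"Flag: comparable cases averaged {mean:.1f} months"
    return "Within the range of comparable cases"

# A 24-month proposal for a case resembling those sentenced to ~6 months
# would be flagged for the judge to reconsider or justify.
print(consistency_check([3.0, 1, 35], proposed_months=24))
```

In a decision-support role, a flag like this would prompt the sentencing judge to explain or revisit the divergence, not override it.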

AI and machine learning will probably start their criminal justice roles as decision-support systems only, Hunter said, although there’s no reason to think this won’t eventually move into automated sentencing.

More than 90% of criminal offences are currently sanctioned without any judicial involvement; most criminal matters are finalised by way of infringement notice.

It’s likely that there will be some backlash against the idea of humans being sentenced by machines, said Hunter, who has been exploring the use of AI in law for more than 20 years. However, he noted that society has readily accommodated many similar technologies. Speed cameras, for example, are a form of automated sentencing that is now widely accepted.

There have already been concerns overseas about AI-driven sentencing algorithms, with one system removed from service in the United States over concerns it was producing harsher outcomes for African-American offenders than for white offenders. AI experts have pointed out that these algorithms reflect the data fed into them: flaws in the original data, such as biased sentencing by human judges, can flow through into the AI’s decision-making process.
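A minimal sketch, on deliberately synthetic data, of how that flow-through happens: a model trained on outcomes that were skewed against one group reproduces the skew in its own predictions, even when the legitimate case feature is identical. The variables and effect size below are invented for illustration.

```python
# Minimal sketch: bias in training data resurfacing in a model's output.
# All data is synthetic and the group effect is exaggerated on purpose.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)    # 0 or 1: a protected attribute
severity = rng.normal(0, 1, n)   # legitimate case feature

# Biased historical outcomes: at the same severity, group 1 is
# incarcerated more often (odds shifted by +1.0 on the logit scale).
p = 1 / (1 + np.exp(-(severity + 1.0 * group)))
incarcerated = rng.random(n) < p

# A model trained on those outcomes learns the group effect too.
X = np.column_stack([severity, group])
clf = LogisticRegression().fit(X, incarcerated)

# At identical (average) severity, predicted risk differs by group.
for g in (0, 1):
    risk = clf.predict_proba(np.array([[0.0, g]]))[0, 1]
    print(f"Predicted incarceration risk, group {g}: {risk:.2f}")
```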

This article was first published on Australia’s Science Channel, the original news platform of The Royal Institution of Australia.
