Predictive Policing

Robot judges in Estonia? American AI sentencing criminals to prison? Computers predicting crimes before they even happen? One may be forgiven for thinking such concepts come straight from a science fiction novel, or the rabid rantings of an online conspiracy theorist. The truth is they are all part of today’s reality. And it looks like it’s only a matter of time before concepts like predictive policing and artificial intelligence become everyday features of justice systems worldwide.

In 2017, computer scientists at University College London developed an AI judge, said to be capable of weighing up evidence, arguments and weighty dilemmas of right and wrong with 79% accuracy, as measured against cases decided the old-fashioned way. At that time, such AI judges were strictly the stuff of academic research. But in 2018, the Estonian government announced it had been testing and developing a pilot program to have artificially intelligent judges issue actual court rulings. Under the program, small debt disputes are adjudicated by AI, subject to appeal to a real-life human judge.

Nowadays, in many American criminal courts, it has become almost routine practice for AI systems to recommend sentencing and other guidelines, which judges then use to reach their decisions. These “risk assessment tools” examine a defendant’s profile (including race, age, gender, and where they live) to deliver a recidivism score: a numerical estimate of how likely it is that person will re-offend. The judge may use this score, based on those criteria, to determine whether a person should be granted bail before trial and, if convicted, how long their sentence should be.
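To make the mechanics concrete, here is a minimal sketch of how such a scoring tool might work in principle. Everything in it — the feature names, weights and bias — is invented for illustration; real tools like COMPAS are proprietary and considerably more complex.

```python
# Purely illustrative "risk assessment" scorer. All feature names and
# weights are hypothetical, not those of any real tool.
import math

# Weights a vendor might fit from historical re-offending data.
WEIGHTS = {
    "age_under_25": 0.8,
    "prior_offences": 0.5,       # applied per recorded prior offence
    "unemployed": 0.4,
    "high_crime_postcode": 0.6,  # note: a ready proxy for race and poverty
}
BIAS = -2.0

def recidivism_score(profile: dict) -> float:
    """Return a 0-1 'likelihood of re-offending' from profile features."""
    z = BIAS + sum(w * profile.get(name, 0) for name, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))  # logistic squash into a score

defendant = {"age_under_25": 1, "prior_offences": 3, "unemployed": 1}
print(f"risk score: {recidivism_score(defendant):.2f}")  # -> risk score: 0.67
```

Notice how a feature like the hypothetical “high_crime_postcode” smuggles socio-economic, and often racial, proxies into what looks like neutral arithmetic. That is precisely the criticism levelled at these tools.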

And computerised “justice” doesn’t stop there.

In recent times, countries like Australia, Britain and the USA have adopted “predictive policing” into standard police procedure: vast amounts of data are pumped into specialised programs which use algorithms to calculate and predict where, and at what times, police departments should allocate their resources. Some such systems use facial recognition software to help identify “potential suspects” based on gender, age, race, history and economic circumstances. So if a white male aged between 18 and 25, from poor socio-economic circumstances and historically known for criminal behaviour, is in a particular area, he will be flagged as a suspect. Japan is reportedly looking at this type of predictive system as it cranks up for the 2020 Tokyo Olympics, in the hope of targeting criminal suspects before they actually do anything wrong.
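Under the hood, the core of many such systems is little more than hotspot ranking: count historical incidents per map grid cell and send patrols to the highest-scoring cells. A stripped-down sketch, with all data and cell identifiers invented:

```python
# Hotspot-style predictive policing in miniature: rank map grid cells
# by historical incident counts. Data and cell names are invented.
from collections import Counter

# (grid_cell, hour_of_day) for each historical recorded incident.
incidents = [
    ("cell_12", 23), ("cell_12", 22), ("cell_12", 1),
    ("cell_07", 14), ("cell_07", 15), ("cell_03", 2),
]

def patrol_plan(incidents, top_n=2):
    """Return the top_n (cell, incident_count) pairs to prioritise."""
    counts = Counter(cell for cell, _ in incidents)
    return counts.most_common(top_n)

print(patrol_plan(incidents))  # -> [('cell_12', 3), ('cell_07', 2)]
```

The catch is the feedback loop: sending more patrols to “cell_12” produces more recorded incidents there, which raises its score in the next round, whether or not it is genuinely the most crime-prone area.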

Undoubtedly, such computer profiling, which processes data based on history, suspicion and presumption, is absolutely cutting-edge and very clever. But is it just? Or does it impermissibly discriminate against the marginalised and disadvantaged in our community?

Last year, NSW Police identified around 400 children as requiring “pro-active attention” through the Suspect Target Management Program (STMP), a type of predictive policing software. But while just 5.6% of children in New South Wales are Aboriginal, 51.5% of the 400 young people targeted by the STMP were Indigenous Australians. Once on the list, they were likely to be routinely stopped and questioned by police, detained, and even visited at home, perhaps on multiple occasions, and not necessarily for any pressing reason. The result has raised concerns that such systems, which can only spit out results based on the data we feed into them, are actually entrenching prejudice, racism and discrimination in society, and thereby further disadvantaging the already disadvantaged.
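The arithmetic of that disparity is worth spelling out:

```python
# Over-representation implied by the STMP figures above.
share_of_children = 0.056  # Aboriginal children: 5.6% of NSW children
share_of_targets = 0.515   # 51.5% of the ~400 young people targeted

print(f"over-represented by a factor of {share_of_targets / share_of_children:.1f}")
# -> over-represented by a factor of 9.2
```

Indigenous children were flagged at roughly nine times their share of the State’s child population.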

There is no doubt artificial intelligence is an invaluable tool to further human achievement in all fields, including justice. The danger is that we allow it to also advance, enhance and ultimately entrench our biases.

Natasha Dawson

Queensland Criminal Lawyer
