Can an algorithm replace bail hearings?
New Jersey is trying to address the racial discrimination built into its bail decision-making process by using a specially created algorithm.

 

Many thanks to Sarah Kramer, on whose article for Motherboard this post largely relies.

This post was updated on 12 May 2017 to reflect a new artificial intelligence approach to police bail in Durham (see the end of the article).

New Jersey initiative

The US State of New Jersey is trying a new algorithm to fix its broken bail system, a flashpoint for criminal justice advocates who argue that court-assessed bail amounts can discriminate against low-income and heavily policed communities—most often, people of colour.

Guidelines for how judges set bail vary across the US, but generally combine a bail schedule, which sets amounts for specific offences, with the judge’s own assessment of whether the defendant will appear at their hearing or commit a crime before their trial. If you can’t pay, you stay in jail until your trial date, sometimes for up to a month.

An evidence-based approach

On 1 January 2017, New Jersey replaced its bail system with an algorithm designed to mathematically assess the risk of defendants fleeing or committing a crime—particularly a violent one—before their trial date. The algorithm, called the Public Safety Assessment, was designed by the Texas-based Laura and John Arnold Foundation, a nonprofit that funds innovative approaches to criminal justice reform.

New Jersey isn’t the first state to use algorithms to help judges identify high-risk defendants. Different jurisdictions across the country have tried using computer-based techniques to flag those who should continue to be detained until trial, and those who are flight risks.

But the algorithms are not without flaws. Last year, investigative reporting by ProPublica revealed that these programs had in-built racial biases, too. The software assessed risk based on data points gleaned from interviews with defendants, including questions about ZIP (postal) codes, educational attainment, and family history of incarceration—all of which can serve as proxies for race.

This system is different, according to Matt Alsdorf, vice president of the foundation’s Criminal Justice Initiative. The initiative assembled a dataset of more than 100,000 individual cases and looked for factors that re-offenders had in common. His team found that the data points most closely correlated with race weren’t actually terribly useful, and that:

The strongest predictor of pretrial failure largely has to do with someone’s prior conduct.

The algorithm uses conviction records rather than arrest records, since convictions are less likely to tip the scales against individuals in heavily policed neighbourhoods: studies have found that the arrest rate for black people can be up to ten times higher than for non-black people.
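To make this concrete, here is a minimal sketch (in Python) of the kind of points-based risk banding the Foundation describes, built only from prior-conduct factors. The factor names, weights and cut-offs below are invented for illustration; this is not the actual Public Safety Assessment model:

```python
# Illustrative sketch only: a points-based pretrial risk band built solely
# from prior-conduct factors, in the spirit of the Public Safety Assessment.
# The weights and thresholds are invented, NOT the Foundation's published model.

def pretrial_risk_band(prior_convictions: int,
                       prior_violent_convictions: int,
                       prior_failures_to_appear: int,
                       pending_charge: bool) -> str:
    """Return a 'low'/'medium'/'high' band from prior conduct only."""
    score = 0
    score += min(prior_convictions, 3)              # cap so one factor can't dominate
    score += 2 * min(prior_violent_convictions, 2)  # violence weighted more heavily
    score += min(prior_failures_to_appear, 2)
    score += 1 if pending_charge else 0

    if score <= 1:
        return "low"
    elif score <= 4:
        return "medium"
    return "high"


if __name__ == "__main__":
    # One prior conviction and no other flags: low risk.
    print(pretrial_risk_band(1, 0, 0, False))   # -> "low"
    # Repeat violent offending plus a pending charge: high risk.
    print(pretrial_risk_band(4, 2, 1, True))    # -> "high"
```

The point of the design is what the inputs leave out: no postcode, no educational attainment, no family history of incarceration—none of the interview-based proxies for race identified in the ProPublica reporting.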

Conclusion

However, a number of expert commentators, while praising the approach in principle, remain unconvinced. Although the Foundation has released its methodology, it has not released its dataset for independent analysis.

Nevertheless, it is clear that algorithms are becoming increasingly common in a wide range of areas, including predictive policing and even assessing risk for young drivers.

Could they be the answer to removing some of the race and gender biases which are endemic in our criminal justice system?

 

Durham Police to use artificial intelligence to help decide on bail

Police in Durham are preparing to go live with an artificial intelligence (AI) system designed to help officers decide whether or not a suspect should be kept in custody, according to the BBC.

The system classifies suspects as being at low, medium or high risk of offending and has been tested by the force.

It has been trained on five years of offending-history data.

One expert said the tool could be useful, but the risk that it could skew decisions should be carefully assessed.

Data for the Harm Assessment Risk Tool (Hart) was taken from Durham police records between 2008 and 2012.

The system was then tested during 2013, and the results – showing whether suspects did in fact offend or not – were monitored over the following two years.

Forecasts that a suspect was low risk turned out to be accurate 98% of the time, while forecasts that they were high risk were accurate 88% of the time.

This reflects the tool’s built-in predisposition: it is designed to be more likely to classify someone as medium or high risk, in order to err on the side of caution and avoid releasing suspects who may commit a crime.
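To illustrate how a tool can be tuned to err on the side of caution, here is a minimal sketch of cost-sensitive classification: the model’s probability estimates for each risk band are combined with a cost matrix that penalises under-estimating risk far more heavily than over-estimating it. The probabilities and cost values are invented for illustration; Hart itself is reported to be based on random forest forecasting, not this toy rule:

```python
# Illustrative sketch only: cost-sensitive labelling that errs on the side
# of caution, as Hart is described as doing. The cost matrix is invented,
# not Durham Constabulary's actual tuning.

LABELS = ["low", "medium", "high"]

# COST[true_label][predicted_label]: calling a truly high-risk suspect "low"
# is penalised far more than the reverse, pushing borderline cases upwards.
COST = {
    "low":    {"low": 0,  "medium": 1, "high": 4},
    "medium": {"low": 5,  "medium": 0, "high": 1},
    "high":   {"low": 20, "medium": 5, "high": 0},
}

def cautious_label(probs: dict) -> str:
    """Pick the label with the lowest expected misclassification cost.

    `probs` maps each label to the model's estimated probability that it
    is the true one, e.g. {"low": 0.5, "medium": 0.3, "high": 0.2}.
    """
    def expected_cost(predicted: str) -> float:
        return sum(p * COST[true][predicted] for true, p in probs.items())
    return min(LABELS, key=expected_cost)


if __name__ == "__main__":
    # "low" is the single most likely band, but a 20% chance of "high"
    # makes labelling the suspect "low" too costly, so the rule says "medium".
    print(cautious_label({"low": 0.5, "medium": 0.3, "high": 0.2}))
```

This asymmetry is exactly why forecasts of low risk can be so accurate (98%) while forecasts of high risk are somewhat less so (88%): the tool only calls someone low risk when it is very sure.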

During the trial period, the accuracy of Hart was monitored but it did not impact custody sergeants’ decisions, said Sheena Urwin, head of criminal justice at Durham Constabulary.

“I imagine in the next two to three months we’ll probably make it a live tool to support officers’ decision making,” she told the BBC.

 
