Can an algorithm replace bail hearings?

Many thanks to Sarah Kramer, on whose article for Motherboard this post largely relies.

New Jersey initiative

The US State of New Jersey is trying a new algorithm to fix its broken bail system, a flashpoint for criminal justice advocates who argue that court-assessed bail amounts can discriminate against low-income and heavily policed communities—most often, people of colour.

Guidelines for how judges set bail vary across the US, but judges generally combine a bail schedule, which prices out fees for specific offences, with their own assessment of whether the defendant will appear at their hearing or commit a crime before their trial. If you can’t pay up, you stay in jail until your trial date, sometimes for up to a month.
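
As a rough illustration of that mechanic, the sketch below models a bail schedule as a simple lookup followed by a judicial adjustment. The offences, amounts, and adjustment rule are all invented for illustration; real schedules vary by jurisdiction.

```python
# A rough sketch of the traditional mechanic: a bail schedule lookup plus a
# judge's discretionary adjustment. Offences, amounts, and the doubling
# rule are all hypothetical.
BAIL_SCHEDULE = {  # hypothetical schedule: offence -> base bail in dollars
    "shoplifting": 500,
    "burglary": 5_000,
    "aggravated_assault": 25_000,
}

def set_bail(offence: str, judged_flight_risk: bool) -> int:
    base = BAIL_SCHEDULE[offence]
    # A judge can raise bail if they believe the defendant won't appear
    return base * 2 if judged_flight_risk else base

print(set_bail("burglary", judged_flight_risk=True))  # -> 10000
```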

An evidence-based approach

On January 1, 2017, New Jersey replaced its bail system with an algorithm designed to mathematically assess the risk of defendants fleeing or committing a crime—particularly a violent one—before their trial date. The algorithm, called the Public Safety Assessment, was designed by the Texas-based Laura and John Arnold Foundation, a nonprofit that funds innovative approaches to criminal justice reform.

New Jersey isn’t the first state to use algorithms to help judges identify high-risk defendants. Jurisdictions across the country have tried computer-based techniques to flag defendants who should remain detained until trial and those who are flight risks.

But the algorithms are not without flaws. Last year, investigative reporting by ProPublica revealed that these programs had in-built racial biases, too. The software assessed risk based on data points gleaned from interviews with defendants, including questions about ZIP (postal) codes, educational attainment, and family history of incarceration—all of which can serve as proxies for race.

This system is different, according to Matt Alsdorf, vice president of the Foundation’s Criminal Justice Initiative. The initiative assembled a dataset of more than 100,000 individual cases and looked for factors that re-offenders had in common. His team found that the data points most closely correlated with race weren’t actually very useful predictors, and that:

The strongest predictor of pretrial failure largely has to do with someone’s prior conduct.
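
To make that idea concrete, here is a minimal sketch of the kind of feature screening the Foundation describes: for each candidate predictor, measure how well it predicts pretrial failure and how strongly it tracks race, and keep only those that do the former without the latter. The dataset, column names, and thresholds are all hypothetical; the Foundation has not published its actual screening code.

```python
# A minimal sketch of feature screening, assuming a hypothetical dataset of
# past cases with a 0/1 outcome column and 0/1 demographic column.
import pandas as pd

df = pd.read_csv("pretrial_cases.csv")  # hypothetical historical case data

candidates = ["zip_code_risk", "education_years", "family_incarceration",
              "prior_convictions", "prior_failure_to_appear"]

for feature in candidates:
    # Pearson r against a 0/1 variable doubles as a point-biserial correlation
    predictive = df[feature].corr(df["pretrial_failure"])     # outcome: 0/1
    race_linked = df[feature].corr(df["defendant_is_black"])  # proxy check: 0/1
    keep = abs(predictive) > 0.1 and abs(race_linked) < 0.1   # illustrative thresholds
    print(f"{feature}: predicts failure r={predictive:.2f}, "
          f"race correlation r={race_linked:.2f}, keep={keep}")
```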

The algorithm uses conviction records rather than arrest records, since convictions are less likely to tip the scales against individuals in heavily policed neighbourhoods—studies have found that the arrest rate for black people can be up to ten times higher than for non-blacks.
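
By way of illustration, the sketch below shows what a points-based assessment built purely from prior conduct might look like. The factor list loosely echoes the factors the Public Safety Assessment is publicly described as using, but the weights, caps, and the 1–6 banding here are invented for illustration, not the Foundation’s actual formula.

```python
# A minimal sketch of a points-based pretrial score built only from prior
# conduct (conviction records rather than arrests). All weights and the
# banding rule are hypothetical.
from dataclasses import dataclass

@dataclass
class Defendant:
    prior_convictions: int
    prior_violent_convictions: int
    prior_failures_to_appear: int
    pending_charge_at_arrest: bool

def risk_points(d: Defendant) -> int:
    points = 0
    points += min(d.prior_convictions, 3)            # cap so one factor can't dominate
    points += 2 * min(d.prior_violent_convictions, 2)
    points += 2 * min(d.prior_failures_to_appear, 2)
    points += 1 if d.pending_charge_at_arrest else 0
    return points

def risk_band(points: int) -> int:
    # Collapse raw points onto a 1-6 scale for presentation to the judge
    return min(1 + points // 2, 6)

d = Defendant(prior_convictions=1, prior_violent_convictions=0,
              prior_failures_to_appear=2, pending_charge_at_arrest=False)
print(risk_band(risk_points(d)))  # -> 3
```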

Conclusion

However, a number of expert commentators, while praising the approach in principle, remain unconvinced. Although the Foundation has released its methodology, it has not released its dataset for independent analysis.

Nevertheless, it is clear that algorithms are becoming increasingly common in a wide range of areas, including predictive policing and even assessing risk for young drivers.

Could they be the answer to removing some of the race and gender biases which are endemic in our criminal justice system?

 
