Independent research sources continue to publish studies finding that pretrial risk assessments should not be used in the criminal justice system at this time. Why? There are serious concerns about the risk assessments currently in use. According to a new report by the Partnership on AI (PAI), a nonprofit organization, three broad categories of concern must be considered for all risk assessments:

- Concerns about the validity, accuracy, and bias in the tools themselves;
- Issues with the interface between the tools and the humans who interact with them;
- Questions of governance, transparency, and accountability.

The study found "serious shortcomings of risk assessment tools in the U.S. criminal justice system." Despite these "numerous deeply concerning problems and limitations," several jurisdictions, and even one state, have passed legislation mandating their use. More and more jurisdictions and legislators, in defiance of these findings and concerns, are putting their faith in biased, non-transparent, black-box algorithms. "You need to understand as you're deploying these tools that they're extremely approximate, extremely inaccurate," said Peter Eckersley, research director at PAI, a consortium of Silicon Valley heavyweights and civil liberties groups that helped publish the report.

One simple explanation from the report should have legislators and jurisdictions sprinting away from risk assessment use. “Quantitatively, accuracy is usually defined as the fraction of correct answers the model produces among all the answers it gives. So a model that answers correctly in 4 out of 5 cases would have an accuracy of 80%. Interestingly, models which predict rare phenomena (like violent criminality) can be incredibly accurate without being useful for their prediction tasks. For example, if only 1% of individuals will commit a violent crime, a model that predicts that no one will commit a violent crime will have 99% accuracy even though it does not correctly identify any of the cases where someone actually commits a violent crime.”
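The base-rate problem the report describes can be sketched in a few lines of Python. The numbers below are the report's own hypothetical: a 1% rate of violent offending and a "model" that simply predicts no one will offend.

```python
# Sketch of the base-rate problem described above, using the report's
# hypothetical numbers: 1% of individuals will commit a violent crime.
population = 10_000
offenders = population // 100  # the 1% base rate

# A trivial "model" that predicts no one will commit a violent crime
# is right about every non-offender...
correct_predictions = population - offenders
accuracy = correct_predictions / population
print(f"Accuracy: {accuracy:.0%}")  # → 99%

# ...yet it correctly identifies zero of the people who actually offend.
true_positives = 0
recall = true_positives / offenders
print(f"Offenders correctly identified: {recall:.0%}")  # → 0%
```

This is why a headline accuracy figure alone says little about whether a tool is useful for predicting rare events: the trivial model scores 99% accuracy while catching none of the cases it exists to catch.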


Another recent study, from Purdue University in Indiana, had similar findings. It found the risk assessment tool used in Indiana is accurate only about two-thirds of the time. For "high risk" individuals, the accuracy was roughly a coin flip, and the tool could not reliably identify any single case.

New Jersey's most recent report noted that both new criminal activity during pretrial release and failure-to-appear rates have increased since the state began using risk assessments. If risk assessments are accurate, why would pretrial failures increase?

Numerous social justice groups, scholars, and organizations oppose the use of risk assessments for all the reasons above. Many scholarly studies have shown that, in many cases, a coin flip is as accurate as a risk assessment.

With all this information, why would legislators and jurisdictions continue to implement risk assessments when many studies show that their accuracy is inadequate, that they are biased, that they lack transparency, and that they are ineffective at identifying, with confidence, a defendant who will adhere to all pretrial release requirements? That answer is unknown. However, when billionaires like John Arnold, Mark Zuckerberg, Richard Branson, the Koch brothers, and others are spending millions in an attempt to change a highly effective and successful pretrial release system that has been in place for over two hundred years, there must be money involved.

Of course, secret-algorithm advocates, like the billionaires above, will argue that our current pretrial release process is less than perfect. True. Neither is our jury system. Are those same billionaires in favor of algorithms determining guilt or innocence? Is that their next venture? Replace judges, attorneys, and juries with systems engineers who feed data into computers? Does that seem far-fetched? Unlikely? Who would have thought it would happen with pretrial release, and yet it has.

Though the current system is less than perfect, I am certain all those billionaires would put far more faith in their attorney arguing for their release before an experienced judge than in a secret algorithm that is wrong in one-third of all release decisions and that their attorney has no ability to question.

https://www.kqed.org/news/11742529/report-warns-a-i-algorithms-not-quite-ready-for-prime-time-in-criminal-justice

About Ken Berke

Ken Berke is a licensed bail agent in Florida. As Executive Vice President of Roche Surety & Casualty Co. Inc. he frequently travels throughout the United States meeting with bail agents and the public to increase awareness of the professionalism and importance of the bail industry to victims, defendants and the judicial system. He is a frequent contributor to the Roche Surety blog where he dispels false claims against the 8th Amendment and bail through facts, statistics and logic.