Study: Can Criminal Justice Algorithms be Retooled to Serve the Accused?

A prototype risk assessment instrument created for the study by the ACLU and collaborators at Carnegie Mellon and the University of Pennsylvania predicts the likelihood of an especially lengthy sentence for a person accused of a federal crime.

A recent peer-reviewed study by the American Civil Liberties Union (ACLU) and collaborators at Carnegie Mellon and the University of Pennsylvania suggests that the risk assessment instruments commonly used to guide life-altering decisions by judges, prosecutors, and parole boards about the liberty of the people before them could be retooled to predict the risk that the criminal justice system itself poses to the people it accuses, the ACLU reports.

The prototype risk assessment instrument created for the study predicts the risk that a person accused of a federal crime will receive an especially lengthy sentence, based on factors that should be legally irrelevant to sentence length, such as the accused person's race or the party of the president who appointed the judge.
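To make the idea concrete, a risk instrument of this kind can be thought of as a simple statistical model that maps case features to a probability of an especially lengthy sentence. The sketch below is a minimal, hypothetical illustration only: the feature names, weights, and logistic form are assumptions for demonstration, not the study's actual model or coefficients.

```python
import math

# Hypothetical coefficients for illustration; NOT values from the study.
# Each feature is one of the "legally irrelevant" factors the article describes.
WEIGHTS = {
    "defendant_race_black": 0.40,        # binary indicator (assumed feature)
    "judge_appointing_party_rep": 0.25,  # binary indicator (assumed feature)
    "district_avg_sentence_z": 0.60,     # z-scored district severity (assumed)
}
BIAS = -1.2

def lengthy_sentence_risk(features: dict) -> float:
    """Return a probability in (0, 1) that the sentence is 'especially lengthy',
    using a logistic function over a weighted sum of the features."""
    score = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))

risk = lengthy_sentence_risk({
    "defendant_race_black": 1.0,
    "judge_appointing_party_rep": 1.0,
    "district_avg_sentence_z": 0.5,
})
print(round(risk, 3))  # → 0.438
```

In practice such a model would be fit to historical sentencing data rather than hand-set weights; the point of the sketch is only that the same predictive machinery used to score defendants can be pointed at the system instead.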

The predictive accuracy of the instrument matches or exceeds that of many tools deployed across the country.

The study's sponsors say that providing public defenders with a defendant's risk of receiving a severe sentence, or with a measure of how far a proposed sentence deviates from similar cases, could help them make informed decisions when navigating sentencing proceedings and plea bargaining.

The model can also flag unreasonably long sentences, even for people excluded from many criminal justice reforms and clemency actions, such as those sentenced for violent crimes.

However, the researchers also found that any such tool tends to embed the viewpoint of its creators and effectively sets new policy, a finding that should make anyone wary of these tools and their applications.