5 Reasons Why Bail Reform Risk Assessment Tools Are Dangerous to Democracy

Criminal justice risk assessment tools are algorithms designed to recommend the mandatory release of defendants who score as a low risk for flight or further criminal activity. In states that have adopted these tools in the last few years, the algorithms have forced judges to release dangerous people back onto the streets.

Robert Werth of Rice University in Houston has published a new study showing that risk assessment tools, which are powered by artificial intelligence, carry hidden biases and are only as good as the data that humans feed them.

Houston Chronicle staff writer Keri Blakinger interviewed Werth about the study, which was published in the journal Social & Legal Studies, and the interview raises clear concerns about the fairness of risk assessment tools and mandatory catch-and-release laws.

Risk Assessment Tools Are Too Risky!

  1. We don’t know how the algorithms calculate risk because the formula is developed by a private company and is proprietary. Defendants cannot even contest the result.
  2. Crimes like murder have a low recidivism rate, which can lead the risk assessment tool to score a defendant as unlikely to reoffend even though they committed a fatal crime.
  3. Race, gender, and age are factored into the algorithm, which often creates bias. Many studies have already reported data showing racial bias.
  4. In many states, judges must defer to the results of the risk assessment tool and grant a defendant mandatory release without cash bail to help ensure their return to court.
  5. There is no such thing as zero risk. These tools change the question from “Is this person dangerous?” to “How dangerous is this person?”

Werth challenges the argument that risk assessment tools are fairer because they do away with the subjective judgments of biased humans. The data used to build the algorithms come from prior arrest records, which are driven by whatever crimes the police in a given community choose to prioritize. In effect, the arrest data is already biased, so the algorithm reproduces the biases in policing and prosecution. And, even more dangerous to our justice system, it hides that bias from everyone, including the judge and the defendant.
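To make that mechanism concrete, here is a minimal toy sketch in Python of how a score built on arrest records can inherit policing bias. It is not Werth’s analysis and not any vendor’s actual formula; the neighborhoods, rates, and the `risk_score` function are all invented for illustration. Two neighborhoods offend at the same rate, but one is patrolled more heavily, so its residents accumulate more prior arrests and, in turn, higher “risk” scores.

```python
# Toy sketch only -- NOT any real risk tool's formula. All names and numbers
# below are invented to illustrate how biased arrest data biases the score.
import random

random.seed(0)

TRUE_OFFENSE_RATE = 0.10                  # identical in both neighborhoods
PATROL_INTENSITY = {"A": 1.0, "B": 2.0}   # neighborhood B is policed twice as heavily

def simulate_prior_arrests(neighborhood, n_people=10_000):
    """Arrest histories: offending is equal everywhere, but detection is not."""
    histories = []
    for _ in range(n_people):
        offenses = sum(random.random() < TRUE_OFFENSE_RATE for _ in range(5))
        p_detect = min(1.0, 0.3 * PATROL_INTENSITY[neighborhood])
        arrests = sum(random.random() < p_detect for _ in range(offenses))
        histories.append(arrests)
    return histories

def risk_score(prior_arrests):
    """Stand-in 'risk tool': like the real ones, it leans on arrest history."""
    return min(10, 2 * prior_arrests)

for hood in ("A", "B"):
    arrests = simulate_prior_arrests(hood)
    avg = sum(risk_score(a) for a in arrests) / len(arrests)
    print(f"Neighborhood {hood}: average risk score {avg:.2f}")

# Both neighborhoods offend at the same rate, yet B scores as "riskier",
# because the data the score is built on measures policing, not behavior.
```

Running the sketch prints a higher average score for the over-policed neighborhood even though the underlying behavior is identical, which is exactly the hidden bias Werth warns about.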
