
Wisconsin Court: Warning Labels Are Needed for Scores Rating Defendants’ Risk of Future Crime

The court said judges can look at the scores – so long as their limitations are made clear.

The Wisconsin Supreme Court on Wednesday raised concerns about a risk assessment tool that scores criminal defendants on their likelihood of committing future crimes and is increasingly being used during sentencing. The court said judges may consider such scores during sentencing, but it said that warnings must be attached to the scores to flag the tool’s “limitations and cautions.” (Read the opinion.)

The court’s ruling cited a recent ProPublica investigation into COMPAS, the popular software tool used to score defendants in Wisconsin and in other jurisdictions across the country. Our analysis found that the software is frequently wrong, and that it is biased against black defendants who did not commit future crimes – falsely labeling them as future criminals at twice the rate of white defendants. (The software is owned by a for-profit company, Northpointe, which disputes our findings.)

Northpointe’s software is just one of many risk and needs assessment tools currently in use across the country. These tools are used in different stages of the criminal justice system in various jurisdictions. In Wisconsin, Northpointe’s software is used at every decision point in the prison system, from sentencing to parole.


Risk and needs assessment scores were designed to make the criminal justice system fairer, by providing evidence-based methods to aid decisions about defendants. But there are few standards to ensure the underlying tests are accurate and transparent. The exact formula underlying Northpointe’s software is proprietary. That means many defendants are getting rated as potential future criminals without knowing the basis for their scores.

Using risk assessment tools to inform sentencing is perhaps the most controversial aspect of their adoption, especially since the manufacturers of many of the tools themselves say that this is not their intended use.

The case decided in Wisconsin on Wednesday was brought by Eric Loomis, who pleaded guilty to driving a stolen car and evading police in 2013. At sentencing, Loomis’ trial judge in La Crosse County cited his high risk scores as justification for giving Loomis six years in prison for those crimes, plus another two and a half years for violating his parole. Judge Scott Horne said Loomis had been “identified, through the COMPAS assessment, as an individual who is at high risk to the community.”

Loomis and his lawyers challenged the sentence, saying in part that his due process rights had been violated by the judge’s reliance on an opaque algorithm that generated a score he couldn’t directly challenge.

In Wednesday’s opinion, the Wisconsin Supreme Court rejected Loomis’ arguments, writing that the risk scores can be used in conjunction with other considerations, if they are used properly. But the justices also cautioned that due process could be violated in future cases if judges don’t fully understand the limitations of the tool. “Although we ultimately conclude that a COMPAS risk assessment can be used at sentencing, we do so by circumscribing its use,” Justice Ann Walsh Bradley wrote.

The court wrote that ProPublica’s analysis and others “raise concerns regarding how a COMPAS assessment’s risk factors correlate with race.” It also noted that Wisconsin has not tested and calibrated the software specifically for the state’s population.

The court ruled that judges looking at the scores during sentencing must get the following warnings about COMPAS’ accuracy: the fact that it is a proprietary tool whose inner workings may not be transparent; that the tool has not yet been cross-validated for Wisconsin’s population; that studies have raised questions about potential racial disproportionality; and that risk assessment tools should be constantly tested and adjusted for accuracy as populations change. The ruling also repeatedly stressed that a risk score cannot be the “determinative factor” in deciding whether someone gets incarcerated or gets probation instead.

Christopher Slobogin, director of the criminal justice program at Vanderbilt Law School, called the decision “one of the most sophisticated judicial treatments of risk assessment instruments to date” for its detailed analysis of risk assessment tools and its acknowledgment of all of their various limitations.

Still, it’s not clear what effect the ruling will have on sentencing judges’ decision-making.

“The court says that COMPAS may not be determinative in increasing sentence severity or whether an offender is incarcerated,” said Slobogin. “But of course, all things being equal, a high risk score will make it much less likely a person will get the minimum sentence or avoid incarceration.” In other words, the opinion mandates warnings and instructions that might, in reality, be hard for judges to actually follow.

Loomis’ attorney and the Wisconsin Attorney General’s office, which represented the state, both declined to comment on the ruling.
