In many courtrooms across the nation, those caught in the criminal justice system are given a score based on a computer risk assessment of their likelihood of reoffending. These scores are informing decisions about sentences, probation, and bond amounts, according to ProPublica.

The public interest news organization analyzed the risk scores of more than 7,000 people arrested in Broward County, Florida, in 2013 and 2014, using what it says is the same benchmark the software's developers use.

“The score proved remarkably unreliable in forecasting violent crime,” ProPublica stated.

The analysis found the algorithm was wrong in 80 percent of cases when it predicted someone would commit a violent crime: only one in five of those flagged went on to do so. When the full range of offenses, misdemeanors as well as felonies, was taken into account, the algorithm was “somewhat more accurate than a coin flip” at predicting who would reoffend.
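
To make that accuracy claim concrete, here is a minimal sketch, in Python, of how such a check can be run against follow-up records. The DataFrame and its column names (predicted_violent, violent_recidivism) are hypothetical placeholders for illustration, not ProPublica’s actual data or Northpointe’s formula.

```python
import pandas as pd

# Hypothetical follow-up data: predicted_violent = 1 means the score flagged
# the person as high risk for violence; violent_recidivism = 1 means they were
# later charged with a violent crime. Toy values, not real records.
df = pd.DataFrame({
    "predicted_violent":  [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
    "violent_recidivism": [1, 0, 0, 0, 0, 0, 1, 0, 0, 0],
})

# Of those flagged as high risk for violence, what share did NOT go on to
# commit a violent crime? This is the sense in which the score can be
# "wrong" about violent crime even if it looks reasonable overall.
flagged = df[df["predicted_violent"] == 1]
share_wrong = (flagged["violent_recidivism"] == 0).mean()
print(f"Flagged as violent but did not reoffend violently: {share_wrong:.0%}")
```

With these toy values the share works out to 80 percent, the same figure ProPublica reported from defendants’ real records.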

ProPublica also uncovered racial bias: The software was far more likely to falsely flag Black defendants as high risk, and far more likely to mistakenly label White defendants as low risk.
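
The bias ProPublica describes shows up in each group’s error rates rather than in overall accuracy. Below is a hedged sketch of that comparison; the column names (race, high_risk, reoffended) and the toy rows are assumptions for illustration only, not the real dataset.

```python
import pandas as pd

# Hypothetical records: risk label assigned at arrest versus whether the
# person was later charged with a new crime. Illustrative values only.
df = pd.DataFrame({
    "race":       ["Black", "Black", "Black", "Black", "White", "White", "White", "White"],
    "high_risk":  [1, 1, 1, 0, 0, 0, 1, 0],
    "reoffended": [1, 0, 0, 0, 0, 0, 1, 1],
})

for race, group in df.groupby("race"):
    # False positive rate: labeled high risk among those who did not reoffend.
    did_not_reoffend = group[group["reoffended"] == 0]
    fpr = did_not_reoffend["high_risk"].mean()
    # False negative rate: labeled low risk among those who did reoffend.
    did_reoffend = group[group["reoffended"] == 1]
    fnr = (did_reoffend["high_risk"] == 0).mean()
    print(f"{race}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")
```

In ProPublica’s findings the pattern ran the same way: Black defendants saw the higher false positive rate, while White defendants saw the higher false negative rate.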

The Justice Department’s National Institute of Corrections supports the use of these computerized risk assessments. Yet while in office, then-U.S. Attorney General Eric Holder called for further study of them, ProPublica said.

Holder issued this warning, according to the site:

“Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice. They may exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society.”

ProPublica said Northpointe, the company that developed the algorithm used in Broward County and other locations, rejected its analysis. The company does not share how it calculates risk scores, but it gave ProPublica “the basics of its future crime formula.”

Northpointe denies taking race into account. Instead, the company said its algorithm looks at multiple factors, including the defendant’s education level and whether a parent was incarcerated.

This secrecy, says ProPublica, makes it impossible to know what’s causing the racial disparities in the scores.

SOURCE: ProPublica | PHOTO CREDIT: Getty 

SEE ALSO:

Senators Introduce Reworked Criminal Justice Reform Bill, But Does It Go Far Enough?

Three Ways Obama’s Criminal Justice Reforms Will Help Black Families

Analysis: Software Used In Criminal Justice System Biased Against Blacks was originally published on newsone.com