Full interview: ProPublica’s Julia Angwin on biased sentencing algorithms

Ben Johnson May 25, 2016
An algorithm created by the for-profit company Northpointe to predict future crime was only 61 percent accurate, according to a ProPublica analysis. ROBYN BECK/AFP/Getty Images

A statistical analysis from ProPublica out this week details how a sentencing algorithm used in the administration of justice appears to be biased along racial lines. Julia Angwin is a senior reporter for ProPublica who worked on the analysis.

“We looked at risk assessment algorithms used throughout the criminal justice system, and they are often questionnaires about an individual defendant,” said Angwin. “They ask about your previous criminal history, your family, and make an assessment of low, medium, or high of whether you would go on to commit a future crime.” 

ProPublica’s analysis found that an algorithm created by the for-profit company Northpointe was only 61 percent accurate in predicting future crime. The analysis also found that the algorithm was twice as likely to incorrectly flag black defendants as high risk.
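As a rough illustration, and not ProPublica's actual methodology, the Python sketch below shows how the two measures described above might be computed from a set of prediction records: overall accuracy (how often the prediction matched the outcome) and the false positive rate for each group (how often people who did not reoffend were nonetheless labeled high risk). The records and group labels here are invented for the example.

```python
# Minimal sketch of the two measures discussed above. The data is
# fabricated for illustration; it is not ProPublica's dataset or code.

from collections import defaultdict

# Each record: (predicted_high_risk, reoffended, group)
records = [
    (True,  False, "black"),   # false positive
    (True,  True,  "black"),
    (False, False, "white"),
    (True,  False, "white"),   # false positive
    (False, True,  "black"),   # false negative
    (False, False, "white"),
]

# Overall accuracy: fraction of predictions that matched the outcome.
correct = sum(pred == actual for pred, actual, _ in records)
print(f"accuracy: {correct / len(records):.0%}")

# False positive rate per group: among people who did NOT reoffend,
# how often were they labeled high risk?
false_positives = defaultdict(int)
non_reoffenders = defaultdict(int)
for pred, actual, group in records:
    if not actual:
        non_reoffenders[group] += 1
        if pred:
            false_positives[group] += 1

for group, total in non_reoffenders.items():
    print(f"{group}: false positive rate = {false_positives[group] / total:.0%}")
```

A disparity like the one ProPublica reported would show up here as one group's false positive rate being roughly double the other's, even if overall accuracy looked similar across groups.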

Listen to the full interview with Julia Angwin above.
