A statistical analysis published this week by ProPublica details how a sentencing algorithm used in the criminal justice system appears to be biased along racial lines. Julia Angwin, a senior reporter at ProPublica, worked on the analysis.
“We looked at risk assessment algorithms used throughout the criminal justice system, and they are often questionnaires about an individual defendant,” said Angwin. “They ask about your previous criminal history, your family, and make an assessment of low, medium, or high of whether you would go on to commit a future crime.”
ProPublica’s analysis found that an algorithm created by the for-profit company Northpointe was only 61 percent accurate at predicting future crime. The analysis also found that the algorithm was twice as likely to incorrectly assign black defendants a high risk score.
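As a rough illustration of the kind of disparity described here, the sketch below computes a false positive rate per group: the share of people who received a high risk score but did not go on to reoffend. The data, group names, and function are hypothetical examples for illustration only, not ProPublica's dataset or methodology.

```python
# Illustrative sketch with hypothetical data (NOT ProPublica's data or code).
# A "false positive" here is a defendant scored high risk who did not reoffend.

def false_positive_rate(records):
    """records: list of (scored_high_risk, reoffended) boolean pairs."""
    false_pos = sum(1 for high, reoff in records if high and not reoff)
    actual_neg = sum(1 for _, reoff in records if not reoff)
    return false_pos / actual_neg if actual_neg else 0.0

# Hypothetical records: (scored_high_risk, reoffended)
group_a = [(True, False), (True, True), (False, False), (False, False)]
group_b = [(True, False), (True, False), (True, True), (False, False)]

# Group A: 1 false positive out of 3 non-reoffenders.
# Group B: 2 false positives out of 3 non-reoffenders.
print(false_positive_rate(group_a))
print(false_positive_rate(group_b))
```

A gap like the one between these two rates, where one group's non-reoffenders are flagged high risk far more often, is the sort of pattern a disparity analysis looks for.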
Listen to the full interview with Julia Angwin above.