Full interview: ProPublica’s Julia Angwin on biased sentencing algorithms
A statistical analysis published by ProPublica this week details how a sentencing algorithm used in the criminal justice system appears to be biased along racial lines. Julia Angwin is a senior reporter for ProPublica and worked on the analysis.
“We looked at risk assessment algorithms used throughout the criminal justice system, and they are often questionnaires about an individual defendant,” said Angwin. “They ask about your previous criminal history, your family, and make an assessment of low, medium, or high of whether you would go on to commit a future crime.”
ProPublica’s analysis found that an algorithm created by the for-profit company Northpointe was only 61 percent accurate in predicting future crime. The analysis also found that the algorithm was twice as likely to incorrectly assign black defendants a high risk score.
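The disparity described above is, in statistical terms, a gap in false positive rates: among people who did not go on to reoffend, how often each group was nonetheless labeled high risk. A minimal sketch of that comparison, using entirely made-up records (this is not ProPublica's data or methodology):

```python
# Illustrative sketch with invented data: comparing false positive rates
# (people labeled high risk who did NOT reoffend) across two groups.

def false_positive_rate(predictions, outcomes):
    """Share of non-reoffenders (outcome == 0) who were labeled high risk (prediction == 1)."""
    non_reoffender_preds = [p for p, o in zip(predictions, outcomes) if o == 0]
    if not non_reoffender_preds:
        return 0.0
    return sum(non_reoffender_preds) / len(non_reoffender_preds)

# Hypothetical records: (predicted_high_risk, actually_reoffended, group)
records = [
    (1, 0, "A"), (1, 1, "A"), (0, 0, "A"), (1, 0, "A"), (0, 0, "A"),
    (0, 0, "B"), (1, 1, "B"), (0, 0, "B"), (1, 0, "B"), (0, 1, "B"),
]

for group in ("A", "B"):
    preds = [p for p, o, g in records if g == group]
    outs = [o for p, o, g in records if g == group]
    print(f"Group {group} false positive rate: {false_positive_rate(preds, outs):.2f}")
```

An algorithm can have the same overall accuracy for both groups while making different kinds of mistakes for each, which is why this per-group error comparison matters.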
Listen to the full interview with Julia Angwin above.