
Full interview: ProPublica’s Julia Angwin on biased sentencing algorithms

Looking at the racial bias of algorithms.

An algorithm created by the for-profit company Northpointe to predict future crime was only 61 percent accurate, according to a ProPublica analysis.
ROBYN BECK/AFP/Getty Images

A statistical analysis from ProPublica out this week details how a sentencing algorithm used in the administration of justice appears to be biased along racial lines. Julia Angwin is a senior reporter at ProPublica and worked on the investigation.

“We looked at risk assessment algorithms used throughout the criminal justice system, and they are often questionnaires about an individual defendant,” said Angwin. “They ask about your previous criminal history, your family, and make an assessment of low, medium, or high of whether you would go on to commit a future crime.” 

ProPublica’s analysis found that an algorithm created by the for-profit company Northpointe was only 61 percent accurate in predicting future crime. The analysis also found that the algorithm was twice as likely to incorrectly give black defendants a high risk score as it was white defendants.
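The disparity described here is a gap in error rates rather than in overall accuracy: a model can score roughly the same accuracy across groups while handing out incorrect high-risk labels (false positives) far more often to one group. A minimal sketch of that comparison, using pandas and hypothetical column names (`race`, `high_risk`, `reoffended`) that are illustrative assumptions, not fields from ProPublica's published data:

```python
import pandas as pd

# Hypothetical example records; the column names and values are
# assumptions for illustration, not ProPublica's dataset.
df = pd.DataFrame({
    "race":       ["black", "black", "black", "white", "white", "white"],
    "high_risk":  [True,    True,    False,   False,   True,    False],  # labeled "high risk"
    "reoffended": [True,    False,   False,   False,   True,    True],   # observed outcome
})

# Overall accuracy: how often the high-risk label matches the observed outcome.
accuracy = (df["high_risk"] == df["reoffended"]).mean()

# False positive rate per group: among people who did NOT reoffend,
# what share were incorrectly labeled high risk?
did_not_reoffend = df[~df["reoffended"]]
fpr_by_race = did_not_reoffend.groupby("race")["high_risk"].mean()

print(f"overall accuracy: {accuracy:.2f}")
print(fpr_by_race)
```

With real data, the same grouping could be applied to the opposite error (people who did reoffend but were labeled low risk) to compare false negative rates across groups as well.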

