Marketplace

How an algorithm is taught to be prejudiced

Ben Johnson and Aparna Alluri | Feb 3, 2015

Algorithms are everywhere. They are what advertisers use to target users online and what search engines use to cough up all those results in a particular order. Even the data governments collect is fed into algorithms that track, flag or analyze whatever the government wants tracked, flagged or analyzed.

But there’s a growing fear that these algorithms are learning stereotypes, and therefore abetting data discrimination. Some algorithms, for instance, make an assumption about an individual’s ability to pay debt based on race. Basically, a lot of data goes into these “black box algorithms,” as they are known, and they produce results that are often discriminatory.

“I call it a black box because we don’t have access to these sorts of algorithms,” says Frank Pasquale, a University of Maryland professor of law. He explores the subject in his new book, “The Black Box Society: The Secret Algorithms That Control Money and Information.”

The algorithms produce results based solely on the data fed to them, but the trouble is that no one outside knows exactly how they crunch that data. Yes, algorithms are racist, Pasquale says, but they are also "reflecting the preferences of thousands and possibly millions of users."
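The dynamic Pasquale describes, a model that simply learns whatever skew is present in its historical data, can be sketched in a few lines. This is a toy illustration, not from the article: the dataset, the `repay_rate` helper and the scoring rule are all hypothetical.

```python
# A minimal sketch of how a model trained on biased historical records
# reproduces that bias. The data and the scoring rule are invented for
# illustration only.

# Each record: (group, income, repaid_loan). The history is skewed:
# group "B" applicants were denied more often, so fewer successful
# repayments by them were ever observed.
history = [
    ("A", 40, True), ("A", 55, True), ("A", 30, False), ("A", 60, True),
    ("B", 40, False), ("B", 55, True), ("B", 30, False), ("B", 60, False),
]

def repay_rate(group):
    """Observed repayment rate for a group in the historical data."""
    rows = [r for r in history if r[0] == group]
    return sum(r[2] for r in rows) / len(rows)

def score(group, income):
    """A naive 'algorithm' that weights income by the group's
    historical repayment rate, and so learns the skew directly."""
    return repay_rate(group) * income

print(score("A", 50))  # 37.5
print(score("B", 50))  # 12.5, for the same income
```

No one wrote "discriminate by group" into the rule; the disparity comes entirely from what the data made observable, which is exactly why a black box that "just reflects the data" can still produce discriminatory results.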

He sees this as a problem because it’s likely to influence even those who don’t buy into such stereotypes. And they may start thinking like the algorithm. He recommends something akin to “an anti-discrimination type of approach.”

If it’s true that we can never know how these algorithms work, then we must not allow certain results, he says. “We need to move beyond saying we just reflect what people think,” Pasquale says, “and make them [algorithms] more progressive.”
