Marketplace

How an algorithm is taught to be prejudiced

Ben Johnson and Aparna Alluri Feb 3, 2015

Algorithms are everywhere. They are what advertisers use to target users online and what search engines use to cough up all those results in a particular order. Even governments feed the data they collect into algorithms that then track, flag or analyze whatever they are looking for.

But there’s a growing fear that these algorithms are learning stereotypes and, in doing so, abetting data discrimination. Some algorithms, for instance, make assumptions about an individual’s ability to repay debt based on race. Basically, a lot of data goes into these “black box algorithms,” as they are known, and they produce results that are often discriminatory.
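To make that concrete, here is a minimal, invented sketch (not from the article or any real lending system) of how skewed historical data becomes a discriminatory score. A "model" that simply learns past repayment rates by neighborhood will penalize every applicant from a neighborhood with a biased record, no matter their individual creditworthiness:

```python
# Toy illustration: biased training data in, biased scores out.
# All records are invented for the sake of the example.
from collections import defaultdict

# Historical loan records as (zip_code, repaid) pairs. Suppose past
# lending practices gave residents of zip "A" worse terms, inflating
# that neighborhood's default rate in the data.
history = [
    ("A", False), ("A", False), ("A", True), ("A", False),
    ("B", True), ("B", True), ("B", False), ("B", True),
]

def train(records):
    """'Learn' a repayment rate per zip code -- a stand-in for an
    opaque model that has absorbed the bias baked into its data."""
    totals, repaid = defaultdict(int), defaultdict(int)
    for zip_code, ok in records:
        totals[zip_code] += 1
        repaid[zip_code] += ok
    return {z: repaid[z] / totals[z] for z in totals}

model = train(history)

# The model now scores every applicant from zip "A" lower,
# regardless of individual merit.
print(model["A"])  # 0.25
print(model["B"])  # 0.75
```

Nothing in the code mentions race, yet if zip code correlates with race, the score discriminates anyway, and from the outside the model is just "crunching the data."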

“I call it a black box because we don’t have access to these sorts of algorithms,” says Frank Pasquale, a University of Maryland professor of law. He explores the subject in his new book, “The Black Box Society: The Secret Algorithms That Control Money and Information.”

These algorithms produce results based solely on the data fed to them, but the trouble is that no one knows exactly how they crunch that data. Yes, algorithms are racist, Pasquale says, but they are also “reflecting the preferences of thousands and possibly millions of users.”

He sees this as a problem because it’s likely to influence even those who don’t buy into such stereotypes; they may start thinking like the algorithm. He recommends something akin to “an anti-discrimination type of approach.”

If it’s true that we can never know how these algorithms work, then we must not allow certain results, he says. “We need to move beyond saying we just reflect what people think,” Pasquale says, “and make them [algorithms] more progressive.”
