Algorithms are often biased. What if tech firms were held responsible?
Oct 18, 2021


Safiya Noble proposes solutions like awareness campaigns and digital amnesty legislation to combat the harms perpetuated by algorithmic bias.

We talk a lot on this show about the unintended consequences of using technology, and who is harmed by that technology.

Another person who thinks about this a lot is Safiya Noble. She’s an associate professor of gender studies and African American studies at UCLA. She’s also the author of the book “Algorithms of Oppression: How Search Engines Reinforce Racism” and a 2021 John D. and Catherine T. MacArthur Foundation fellow. The following is an edited transcript of our conversation.

Safiya Noble: Well, I started my work really looking closely at very banal, everyday kinds of technologies like search engines, because I noticed that everyone was starting to use search engines as a replacement for other kinds of information resources like libraries, or teachers, or professors, or parents. And so I thought, well, let me look closely and see what is Google doing when it comes to how it represents ideas about vulnerable people. And that’s where I found that for many years when you did Google searches on Black girls, Latina girls, Asian girls, you found pornography, hypersexualized content, really misrepresentative ideas about who girls of color were. And of course, over the years, that’s really become a place where disinformation about everything from political candidates to anti-vax, anti-science propaganda circulates.

Marielle Segarra: You talk about restoration and repair. What would that mean for tech companies?

Safiya Noble (courtesy John D. and Catherine T. MacArthur Foundation).

Noble: I think that they have extracted so much from our public and they’ve threatened democracies around the world that they owe the public restoration. There could be kind of cleanup funds the way Exxon or other pipeline companies have to pay for cleanup. I think we should be thinking about those kinds of models here in the tech sector, too.

Segarra: Do you have a vision of how that might work and who might receive the money?

Noble: Public awareness campaigns and education campaigns that the public needs, but also legal fees for people who are trying to go through the courts, just trying to get terrible information about themselves off the internet. It could certainly help us fund things like new laws. One of the most important places is around voter enfranchisement, because the internet has really been used so powerfully and weaponized to disenfranchise poor people and Black and Latino voters. Digital amnesty is the kind of legislation that I’d like to see where everything you’ve ever done on the internet up to, I don’t know, age 30 — maybe the time when your brain is fully developed — can be wiped off the internet. It doesn’t have to follow you, much like we think about juvenile court records being sealed. Those are just a handful of ideas about where funds could go to help restore communities and people and democracy.

Segarra: And will the MacArthur award accelerate any of the projects you’re working on?

Noble: Well, the MacArthur award is just an unexpected embarrassment of riches that I am so grateful for. And what I think it’s done is it’s really helped me, it will help my family. And I hope that it will help me outside of the university stand up a nonprofit called The Equity Engine that will really help other women of color, who like me, I think, see some harmful things and want to get involved and are already having impact, but don’t have a lot of support. I remember the early years of my career where I was not supported by big foundations, and I didn’t get a lot of attention around my work, and it felt like kind of pushing a boulder up a mountain. A lot of the things that I talked about now are very mainstream. People don’t even know that I was part of the early group of people who were talking about it, but I hope that Equity Engine will support other women of color and Black women to do our work. I hope that it will attract resources from people who are interested in supporting women of color in our work.

Related Links: More insight from Marielle Segarra

Noble used to work in advertising. She came on the show in 2018 and talked to Molly Wood about that. Here’s a link to that episode and a video where she stresses that we should see the companies that make search engines, like Google, for what they are: ad companies. And that we should scrutinize them, and not just fall into the habit of thinking, well, this is a trusted public good.


Noble has been doing this work for a long time. She went on an American Civil Liberties Union podcast to talk about it and said at least now a lot more people understand algorithmic bias. Ten years ago, professors with Ph.D.s would tell her computer code can’t discriminate. It’s just math, and math can’t be racist or sexist. But one example after another has shown the opposite.

In her book, she says that at one point, if you searched for the phrase “professional hairstyles for work” on Google, you’d get almost only pictures of white women. If you searched for unprofessional hairstyles, you’d get pictures of Black women with natural hair.

And we’ve been talking mainly about search engines, but biased algorithms are everywhere.

A couple of years ago, Amazon scrapped a recruiting tool that was screening out female job candidates, basically downvoting any resumes that included the word “women’s.” Another example: Researchers found that an algorithm used by many health care systems to decide when a patient should get more complex medical attention was biased against Black patients. And a federal study found that facial recognition software is much more likely to misidentify people of color and women, which can result in false arrests or lengthy interrogations of the wrong person.
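To see how something like that first example can happen, here’s a deliberately contrived sketch, not Amazon’s actual system: if you train a naive word-based scorer on historical hiring decisions that skewed male, the model can learn a negative weight for a word like “women’s” all on its own. Every name and data point below is hypothetical.

```python
from collections import Counter

# Hypothetical training data: (resume text, was this person hired historically?)
# The historical decisions are skewed, and that skew is what the model learns.
history = [
    ("captain of chess club", True),
    ("software engineering intern", True),
    ("captain of women's chess club", False),
    ("women's coding society lead", False),
]

def word_weights(examples):
    """Naive per-word score: the hire rate among resumes containing
    the word, centered at zero (0.5 is the neutral baseline)."""
    seen, hired = Counter(), Counter()
    for text, was_hired in examples:
        for word in set(text.split()):
            seen[word] += 1
            hired[word] += int(was_hired)
    return {w: hired[w] / seen[w] - 0.5 for w in seen}

weights = word_weights(history)
print(weights["women's"])  # -0.5: the word itself gets penalized
print(weights["captain"])  # 0.0: appears in both hired and rejected resumes
```

Nothing in that code mentions gender, and the math is perfectly “neutral.” The bias comes in through the training data, which is exactly the point critics like Noble have been making.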

What’s going to force change? Sometimes it’s public backlash. And in the future it could be actions by the federal government. Earlier this year, the Federal Trade Commission said there are laws it can use to hold companies accountable for racist or biased algorithms, like the Fair Credit Reporting Act, if an algorithm is used to deny someone a job, housing or credit.

The FTC also said that jargon about algorithms can make artificial intelligence seem “magical.” But really, it’s not. And there’s plenty that companies can do to watch out for bias, starting with a solid and representative dataset and testing their algorithms to see if they discriminate.
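What does testing an algorithm for discrimination actually look like? Here’s a minimal sketch of one common approach, comparing a model’s approval rates across groups using the “four-fifths rule” of thumb from U.S. employment law. The data and the 0.8 threshold application here are hypothetical illustrations, not any agency’s official test.

```python
from collections import defaultdict

def selection_rates(decisions):
    """Positive-outcome rate per group.

    decisions: list of (group_label, approved: bool) pairs.
    """
    totals = defaultdict(int)
    approvals = defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += int(approved)
    return {g: approvals[g] / totals[g] for g in totals}

def disparate_impact_ratio(decisions):
    """Ratio of the lowest group selection rate to the highest.

    The "four-fifths rule" flags ratios below 0.8 as potential
    disparate impact worth investigating.
    """
    rates = selection_rates(decisions)
    return min(rates.values()) / max(rates.values())

# Hypothetical audit log: (group, did the model approve the application?)
audit = (
    [("A", True)] * 80 + [("A", False)] * 20
    + [("B", True)] * 50 + [("B", False)] * 50
)

ratio = disparate_impact_ratio(audit)
print(f"Selection rates: {selection_rates(audit)}")   # A: 0.80, B: 0.50
print(f"Disparate impact ratio: {ratio:.2f}")         # 0.62
if ratio < 0.8:
    print("Flag: outcomes differ across groups; review the model.")
```

A check like this won’t tell a company why its model discriminates, only that the outcomes differ enough to warrant a closer look, which is the kind of routine auditing regulators are pushing for.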

You know what else plays a role? The people who design the algorithms. Studies have shown that if you have a more diverse group of engineers, that can lead to a less biased product.

