Marketplace Tech Blogs

Why facial recognition software has trouble recognizing people of color

Molly Wood, Stephanie Hughes, and Shaheen Ainpour Feb 13, 2018
Passersby walk under a surveillance camera that is part of a facial recognition technology test at Berlin's Suedkreuz train station in 2017. Steffi Loos/Getty Images

Facial recognition software has made huge advances in accuracy, but it still has a long way to go, particularly when it comes to recognizing women and people of color. Commercially available software can classify a person’s gender from a photograph. But according to a new study from Joy Buolamwini of the MIT Media Lab, that software is correct 99 percent of the time when it’s looking at a white male and as little as 65 percent accurate when looking at a darker-skinned female. Marketplace Tech host Molly Wood spoke with Buolamwini about her research and the human biases that creep into machine learning. The following is an edited transcript of their conversation.

Joy Buolamwini: If you look at existing benchmarks, you’ll notice that even these benchmarks have major skews. Why this matters: when breakthroughs in facial recognition were being reported in 2014, the way researchers showed progress was by saying they had achieved 97.35 percent accuracy on this data set. But that data set is 77 percent male and 83 percent white. Yes, there is bias that can be in the training data, but there’s also bias that can be in the benchmark data sets, which are used to validate whether or not something is ready.
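To make the arithmetic concrete, here is a minimal sketch, with invented subgroup shares and accuracies rather than figures from the study or from any real benchmark, of how a headline score computed over a skewed data set can mask poor performance on an underrepresented group:

```python
# Illustrative only: the shares and per-group accuracies below are
# invented, not figures from the study or any real benchmark.
subgroups = {
    # name: (share of benchmark, accuracy on that subgroup)
    "lighter-skinned male":   (0.60, 0.99),
    "lighter-skinned female": (0.23, 0.95),
    "darker-skinned male":    (0.12, 0.90),
    "darker-skinned female":  (0.05, 0.65),
}

# The headline number is a share-weighted average, so the majority
# group dominates it.
aggregate = sum(share * acc for share, acc in subgroups.values())
print(f"aggregate accuracy: {aggregate:.3f}")  # 0.953

for name, (share, acc) in subgroups.items():
    print(f"{name:24s} share={share:.2f}  accuracy={acc:.2f}")
```

Disaggregating the same numbers shows a 0.65-accuracy subgroup hiding under a 0.95 headline figure, which is exactly the kind of skew Buolamwini describes.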

Molly Wood: Does it cast doubt on artificial intelligence overall? I mean, a lot of this is being touted as sort of the next “holy grail.” Do you think the results of your study say, “Hey, we need to take a second look at almost all of this technology that’s being peddled as our saving grace?”

Buolamwini: Absolutely, because what we have right now is blind faith in AI that doesn’t acknowledge how easy it is for bias to creep into these systems. And at the end of the day, data reflects our history, and our history has been very biased to date. So we have to be checking, and this is especially important because now we’re using artificial intelligence in high-stakes decision making: deciding if somebody is hired, whether somebody is granted a loan; it’s even influencing college admissions choices.

Wood: So what are you most concerned about?

Buolamwini: I’m most concerned about the promise of precision health. Precisely who will precision health care benefit? You’re using data, oftentimes clinical data. It wasn’t until 1993 that women were even mandated to be part of government-sponsored clinical trials. 1993.

Wood: Wow.

Buolamwini: Right? Even then, the information wasn’t always disaggregated by sex, even though there are biological differences that are really important to address when we’re looking at the efficacy of drugs or various treatments.

Wood: Do you envision that there could be some sort of a technical standard that says, “You can’t release an AI product until you have met these benchmarks”? Until you can ensure, in some sort of open-source way (this is a pipe dream, I know), that you’ve taken intersectionality into account. I mean, could this become like USB? Like a standard?

Buolamwini: Yes, actually this is not a pipe dream. I am currently leading such a standard with the [Institute of Electrical and Electronics Engineers]. We’re increasingly adopting this kind of technology in ways that impact people’s lives. And so for it to be adopted, certain standards have to be met. If a police department is going to, let’s say, put facial recognition on body cameras, does the community know? Have they had the option to say yes? When should this technology not be used? What are the necessary steps for consent? So I don’t think it’s a pipe dream, which is why I’m working to put these standards in place.
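As a rough sketch of how such a standard might be enforced in code (the function, thresholds, group names, and scores here are hypothetical, not drawn from the IEEE effort), a release gate could refuse to pass a model unless every demographic subgroup clears a minimum accuracy and the gap between the best- and worst-served groups stays bounded:

```python
# Hypothetical release gate, purely illustrative: the thresholds,
# group names, and scores are invented, not part of any standard.
def passes_gate(subgroup_accuracy, min_accuracy=0.95, max_gap=0.03):
    """Pass only if every subgroup clears the accuracy floor and the
    best-to-worst gap stays within the allowed bound."""
    worst = min(subgroup_accuracy.values())
    best = max(subgroup_accuracy.values())
    return worst >= min_accuracy and (best - worst) <= max_gap

scores = {
    "lighter-skinned male": 0.99,
    "lighter-skinned female": 0.97,
    "darker-skinned male": 0.96,
    "darker-skinned female": 0.88,
}
print(passes_gate(scores))  # False: one subgroup falls below the floor
```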
