Marketplace®

Dec 28, 2018

Are computers racist? No, but people still are. (Replay)

Facial recognition software has made huge advancements in accuracy, but it still has a long way to go, particularly when it comes to recognizing people of color. Commercially available software can identify a person's gender from a photograph. According to researcher Joy Buolamwini of the MIT Media Lab, that software is correct 99 percent of the time when it's looking at a white male, but less than half as accurate when looking at a darker-skinned female. Marketplace Tech host Molly Wood spoke with Buolamwini about her research and the human biases that creep into machine learning. (This interview originally aired Feb. 13.)
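For readers curious how a disparity like that is measured, here is a minimal sketch of a subgroup accuracy audit. The records, group labels, and numbers below are illustrative assumptions, not Buolamwini's actual benchmark data; the point is simply that accuracy is tallied separately for each demographic group and the gaps are compared.

```python
from collections import defaultdict

# Illustrative audit records: (subgroup, true gender, classifier's prediction).
# These values are hypothetical, not figures from Buolamwini's study.
results = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "female", "male"),
    ("darker-skinned female", "female", "female"),
]

# Count total and correct predictions per demographic subgroup.
correct = defaultdict(int)
total = defaultdict(int)
for group, truth, predicted in results:
    total[group] += 1
    if predicted == truth:
        correct[group] += 1

# Report per-subgroup accuracy; a large gap between groups signals bias.
for group in total:
    print(f"{group}: {correct[group] / total[group]:.0%} accurate")
```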

Passersby walk under a surveillance camera that is part of a facial recognition technology test at Berlin's Suedkreuz train station in 2017.
Steffi Loos/Getty Images

