Tech companies scrap facial recognition products
Jun 12, 2020

After largely ignoring demands from civil rights groups, IBM, Amazon and Microsoft have placed moratoriums on facial recognition sales to law enforcement as protests against police brutality continue.

The House of Representatives introduced a police reform bill this week that would set limits on how police can use facial recognition technology, including a ban on real-time facial recognition in body cameras.

Meanwhile, companies including IBM, Amazon and Microsoft said they either won’t sell certain types of facial recognition tech to police or will stop selling it until federal regulations are in place.

Let’s dig into this in Quality Assurance, where we take a deeper look at a big tech story. I spoke with Sidney Fussell, senior staff writer covering surveillance for Wired. He said the bill is trying to slow things down. The following is an edited transcript of our conversation.

Sidney Fussell: The word “automated” is used a lot in the language of the bill. And that’s something that people need to be really concerned about. Like, [facial recognition technology] automates policing. So it’s much faster, and it removes that friction of consent, of stopping the person, of probable cause, of reasons to suspect a crime.

Molly Wood: This seems like a good place to point out, though, that facial recognition, particularly with Black and brown faces, also has real accuracy concerns. So it’s not just the use and the invasiveness, it’s also the potential for false positives?

Fussell: Yes, absolutely. And there’s a lot to be said about the lack of accuracy in facial recognition on dark-skinned faces. But at the same time, having this conversation around accuracy sort of presumes that we want to use it. So we need to realize that these are different schools of thought. One conversation is: [facial recognition technology] doesn’t work accurately, or it works disproportionately badly on this skin tone versus that skin tone, and we need to fix that. Another conversation is: we should not use this because it’s invasive, it’s unfair, it doesn’t allow for consent. And often I’ve seen these two things conflated.

Wood: When we talk about incentives on the other side, how much money is at stake here? How much money could this tech potentially make for the companies, the smaller ones and the big ones?

Fussell: We don’t actually know. There’s a huge lack of transparency about how much money is changing hands. And not to derail the conversation, but we’re having this conversation about defunding the police, and a big part of that right now is about how much money we spend on policing and police technologies, and whether that money could be used elsewhere. A lot of that relates to surveillance technologies. Why are we spending money on this and not on other things? And that’s why I think the conversation needs to be had around not just, is the technology accurate or not accurate? But maybe also, do we want this [technology] or do we not want it, represented through dollars? If we are really going to have this conversation about the importance of using these technologies, we should tie it to actual metrics — be it reduction in crime, be it different rates of people going to prison. We need to come up with actual goals, not just some hazy sort of [idea], like it keeps us safe. We need to be much more specific than that. And I think one way to do that is by looking at how much money we’re spending.

Related links: More insight from Molly Wood

Some privacy experts say the House bill still needs to be a little clearer about how police can use facial recognition tech. The current language mostly restricts facial recognition from being embedded into body cams and used in real time, as we mentioned. But the American Civil Liberties Union and others want to make sure that footage isn’t run through facial recognition after the fact or used to surveil or arrest protesters.

In case you are wondering about Clearview AI at this juncture — that’s the startup that scraped billions of pictures of people from public websites to create a huge database and build a facial recognition product on top of it. A product it marketed to foreign governments, including repressive regimes like Saudi Arabia, and sold to law enforcement agencies around the country, including Customs and Border Protection, which it pitched as a way to track people who might be infected with the novel coronavirus. The company that’s been sued over and over, gotten cease-and-desist letters from Twitter, Google, YouTube and Microsoft, and had its entire customer list stolen in a giant data breach back in February. You know — that company?

That company says it plans to continue selling its product to police and its CEO says don’t worry though — its tech has no racial bias.

Oh, speaking of that company, the European Union’s Data Protection Board said Clearview AI better not plan on doing business in the EU because it would almost certainly be illegal.

In now-classic Clearview AI fashion, the company’s CEO essentially responded … nah.


The team

Molly Wood, Host
Michael Lipkin, Senior Producer
Stephanie Hughes, Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer