Facial recognition part of Israel’s arsenal in Gaza war
Apr 8, 2024

Israel is using the technology, despite its high error rate, for monitoring Palestinians and identifying potential enemies, according to Sheera Frenkel of The New York Times.

It’s been six months of war in the Gaza Strip since Hamas attacked Israel on Oct. 7. The destruction and death have been profound, and nearly every aspect of life in the roughly 140-square-mile territory has been upended.

The New York Times recently reported that the Israeli military is using facial recognition artificial intelligence to monitor Palestinians in Gaza. The government hasn’t publicly acknowledged it, but reporter Sheera Frenkel spoke to Israeli intelligence officers, military officials and soldiers who confirmed that the technology is being used for mass surveillance.

Marketplace’s Meghan McCarty Carino spoke with Frenkel about the technology’s role in the conflict, starting with the story of Palestinian poet Mosab Abu Toha, who reportedly was arrested and beaten by Israeli forces.

The following is an edited transcript of their conversation.

Sheera Frenkel: Mosab’s experience seemed, unfortunately, quite standard for how Palestinians are suddenly finding themselves under the lens of this facial recognition program. He was trying to flee with his family. His home is in the northernmost part of the Gaza Strip, and he was trying to leave Gaza through the southernmost part, Rafah, where, if you have permission, you can escape to Egypt. He was forced to pass through what was essentially a checkpoint created by Israel. The military had closed off other roads and said this particular road was safe, and it had positioned its vehicles in such a way that people had to walk almost single file. As they walked past, soldiers asked them to turn and face the tanks. Mosab thought they were just looking for specific people. He thought maybe they had what Palestinians call a collaborator, another Palestinian who would point and say, that person’s part of Hamas, that’s someone you might want to question. But actually, what they were doing was running facial recognition on every single Palestinian walking single file down that road, trying to figure out whether they were in a database, a list of people the Israeli army was searching for.

Meghan McCarty Carino: And he was eventually told that his detainment was a mistake. In practice, what kinds of problems come up with using this technology in that way?

Frenkel: Well, in Mosab’s specific case, the mistake was actually in the intelligence that had been gathered suggesting he was part of Hamas. According to the many Israeli intelligence officials I spoke to, there was nothing confirming his involvement in Hamas beyond his being named in one single interrogation. But there are a number of mistakes that can happen along the way, as we know from facial recognition software that’s been deployed here in the United States and in a number of other countries. It has a high error rate, and depending on the program you’re using, that error rate can be anywhere between 1% and 10%. It also seems to do worse with faces from more ethnically diverse backgrounds. Here in America, we’ve seen that Black Americans are often misidentified. In other countries, it’s often minority groups that the system struggles to identify. So it’s often not great at doing the very thing these governments are using it for.
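
To put that 1%-10% error range in perspective, here is a rough back-of-the-envelope sketch in Python. The crowd size is a made-up figure for illustration, not a number from the reporting, and real-world false-match rates vary with the system, the matching threshold and image quality.

```python
# Back-of-the-envelope illustration only. The crowd size below is a
# hypothetical figure, not reported data; it just shows what a 1%-10%
# error rate can mean when a system screens people at scale.
people_screened = 30_000  # assumed number of people passing a checkpoint

for error_rate in (0.01, 0.10):  # the 1%-10% range cited above
    misidentified = people_screened * error_rate
    print(f"At a {error_rate:.0%} error rate, roughly {misidentified:,.0f} "
          f"of {people_screened:,} people could be misidentified.")
```

Even at the low end of that range, the arithmetic implies hundreds of mistaken identifications per large crowd, which is why experts worry about using these systems as a basis for detention.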

McCarty Carino: What do you know about the specific facial recognition system that is being used and what kinds of images it’s relying on?

Frenkel: In Israel, they’re using a couple of different programs, and one comes from an Israeli company called Corsight. It has partnered with the Israeli army, and as we understand it, it has created a customized way for Israel to upload a database of images and quickly use AI to scan that database. If you go online, you can read Corsight’s executives boasting about how their technology is so good it can reveal faces even if part of the face is obscured or a person is wearing a mask. We also know the military is using Google Photos; it found that Google Photos’ ability to identify faces was extremely accurate. For instance, on my phone, I have the Google Photos app. I might put all my photos there, identify one photo as my daughter and then tell Google Photos, please find other photos of my daughter. It’s able to do that; it can look through all my images and find her, even if only part of her face is in the actual photo. It can do that with any database it’s given, and we know this is essentially how Israel is doing it. They’re putting up a database of people, saying this person is named such and such, please find me other photos of that person.
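
The general technique Frenkel describes, labeling one reference photo and then searching for that face across a collection, can be sketched with the open-source face_recognition library in Python. To be clear, this is a generic illustration, not the Corsight or Google Photos systems from the story; the file paths are hypothetical, and the 0.6 distance cutoff is simply the library’s default.

```python
# Generic one-reference face search, using the open-source face_recognition
# library (a dlib wrapper). Illustrative only; this is not the software
# described in the story. Paths are hypothetical.
import os

import face_recognition

REFERENCE_PHOTO = "reference/person_a.jpg"  # one labeled photo of the person
SEARCH_FOLDER = "photos/"                   # folder of images to search

# Encode the labeled face as a 128-dimensional vector (assumes the photo
# contains at least one detectable face).
reference_image = face_recognition.load_image_file(REFERENCE_PHOTO)
reference_encoding = face_recognition.face_encodings(reference_image)[0]

# Compare every face found in every image against the reference encoding.
for filename in sorted(os.listdir(SEARCH_FOLDER)):
    image = face_recognition.load_image_file(os.path.join(SEARCH_FOLDER, filename))
    for encoding in face_recognition.face_encodings(image):
        # Smaller distance means more similar; 0.6 is the library default.
        distance = face_recognition.face_distance([reference_encoding], encoding)[0]
        if distance < 0.6:
            print(f"Possible match in {filename} (distance {distance:.2f})")
```

Even in this toy version, the match is probabilistic: the distance threshold trades false negatives against false positives, which is exactly the error-rate problem discussed above.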

McCarty Carino: I mean, does that kind of use case have even more potential for bias?

Frenkel: I’m not sure. It could. I would say that it relies very heavily on the Israelis being certain of their intelligence, right? It relies on their being certain that a specific person is who they think they are, and on having good intelligence that the person is part of Hamas or another militant group in the Gaza Strip.

McCarty Carino: This was kind of an incredible detail: one system was not found to be reliable enough, so the military is also cross-referencing it with a consumer photo app that is obviously not intended to be used this way.

Frenkel: Right. I thought that was interesting as well. And ultimately, it’s not as though Google built this program for Israel; the military is using the off-the-shelf consumer version that any of us would use. I asked repeatedly, if Corsight is not always the most accurate, why are you still using it? And the answer I got back was, oh, well, it’s good enough. But given how many Palestinians are being detained in Gaza, you’d think you’d want more than “good enough”; you’d want the greatest accuracy possible, systems that could check one another. We know that is not what’s happening there. What’s happening is that they are using this, as it was described to me, as broadly as possible.

McCarty Carino: Israel had already been using a facial recognition system in the West Bank and East Jerusalem, right?

Frenkel: That’s correct. Israel began deploying facial recognition first in West Bank cities and then in East Jerusalem, and it’s used in a much more regular, permanent way there. At the many checkpoints that dot the West Bank between Palestinian cities and the rest of the territory, there are fixed cameras. And Palestinians know: when I pass through this checkpoint, this camera is going to scan my face. It’s going to check if I’m wanted, if I’m on a list of people Israel is looking for. That system has been in place for a few years now.

McCarty Carino: What is unique about how it’s being used in Gaza?

Frenkel: It’s not done with consent. The Palestinians walking in these areas are doing so because they’ve been told this is a humanitarian corridor, a safe road, and in fact it’s been designated as the only safe road for reaching one part of the Gaza Strip from the other. There is no process in which anyone tells them, as you walk down this road, we are scanning your faces, we are collecting facial recognition data on you. What’s also notable is how quickly it’s being done. The experts I spoke to said that the fact that Israel can scan people’s faces and, within 10 minutes or less, have their IDs, their full names, quite a thorough profile on them, is highly unusual.

McCarty Carino: Israel is not the first country to use facial recognition to surveil a population like this. Can you tell me more about some of the other examples you noted?

Frenkel: Well, my colleague Paul Mozur wrote in 2019 about China’s use of facial recognition. In that case, the government was using it to seek out the Uyghurs, a minority Muslim population in China, and the police were essentially using it to profile and search for them. As we know, China has been trying to suppress that population for some time. More recently, Russia and Ukraine have been using it. We’ve heard the Ukrainian government talk about using facial recognition to identify Russian soldiers killed in Ukraine. That’s actually part of its PR campaign, to let the Russian public know: your soldiers are dying here; we have the names, we know who they are. Even as the Russian government tries to hide how many of its soldiers have died, Ukraine has been identifying them on its own.

McCarty Carino: What are the implications of this technology sort of increasingly being used in contexts like this?

Frenkel: I think that as more governments are willing to implement facial recognition technology, it becomes ever more ubiquitous. My colleague Kashmir Hill wrote a really wonderful book about the expanding use of facial recognition technology called “Your Face Belongs to Us.” It looks at how much this industry has grown and how AI makes this technology so much faster and so much more expansive than it’s ever been.

More on this

You can read more about Israel’s previous use of the technology in the West Bank and East Jerusalem as well as a report from Amnesty International on the problems with facial recognition surveillance. The human rights group has been tracking instances of what it sees as troubling applications of these tools in places like Israel, India and the United States.

Of course, one of the big problems with these AI systems is their error rate, as Sheera Frenkel pointed out. Police departments in the U.S. that use these systems have been linked to a growing number of alleged wrongful arrests.

On the show last year, we spoke to an attorney representing a Detroit man who, the attorney said, was misidentified as a suspect by facial recognition AI. At the time, he told us there were at least five known cases of police misidentification involving this technology, and that all of the misidentified men were Black.

One of the big facial recognition companies is Clearview AI. Last year, “Marketplace” host Kai Ryssdal talked to The New York Times’ Kashmir Hill about her book on Clearview, which Frenkel referenced, called “Your Face Belongs to Us.”


The team

Daisy Palacios, Senior Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer
Rosie Hughes, Assistant Producer