How constant surveillance puts protesters at risk
Sep 18, 2020

After-the-fact arrests can have a chilling effect on free speech and lead to cases of mistaken identity.

As Black Lives Matter protests continue around the country, police are using facial recognition and all kinds of other technology to arrest protesters and organizers. While, in some cases, the people arrested did commit crimes, after-the-fact arrests can have a chilling effect on free speech and lead to cases of mistaken identity.

They also show us just how much surveillance is part of our lives. Simone Browne is a professor at The University of Texas at Austin. She’s also the author of the book “Dark Matters: On the Surveillance of Blackness.”

She told me about how police identified and arrested a protester in Philadelphia. The following is an edited transcript of our conversation.

Simone Browne: There was a tattoo on her arm, and she was wearing a T-shirt that was quite unique. It said, “Keep the immigrants, deport the racists.” And so the police used the images of her. They went to find out, where did she get the shirt made? And they found a comment she had made on Etsy. They looked at her Instagram, they looked at her LinkedIn profile, and they were able to match this image to identify her, and she was eventually charged. All this is to say that there are still these trails of data that we leave about ourselves that are being used to form a case.

Molly Wood: To what extent, to your knowledge, is some of this technology being used to find and arrest protesters and even protest organizers?

Browne: I think the chilling effect that organizers, but also the ACLU, became quite aware of and worked to challenge was around 2015, 2016 or so. Geofeedia, a social media analysis company, was working hand in hand with various policing agencies to monitor keywords: Black Lives Matter, protests, jihad. All of these things were then tagged and flagged. [In some cases] you’d have a policing agency visit a potential protester. And of course, if you go on to Geofeedia’s website now, there’s nothing really, just contact information. But we know that if Geofeedia’s gone, another company will pop up and fill in that gap.

Wood: When you layer on all of this technology that you’ve described, it sounds like it could be relatively accurate. And I could see police departments falling into the idea that, although there have been concerns that facial recognition isn’t always accurate, once you add in social media, this should work great. What is the pushback to that?

Browne: On the idea of something working great: if just one person is wrongly identified, say, for example, by facial recognition technology, then it’s not working at all. These technologies rely on this idea that they are perfect, correct, but they really aren’t. And so people are asking for a pause, because these technologies are not outside of the system in which we live, where Black people are criminalized.

(Photo: Simone Browne, courtesy UT Austin)

Wood: How do you feel the long-term implications of this surveillance might play out? Will people be less willing to take the risk of exercising their right to protest?

Browne: I don’t think so. We’re in the middle of a pandemic, and yet people are still risking a lot to go out and protest and demand something better. One example that I think is important is DNA collection. A lot of people are armchair genealogists, or want to find family or some type of connection, and they use a company like 23andMe or Ancestry.com, or GEDmatch. And that same company, GEDmatch, was purchased just last year by a company that has close ties to a policing agency. And this company is not just about finding long-lost relatives; they say they’re primarily for forensic analysis. Whether it’s Etsy or whether it’s Ancestry.com, we have to really think about what happens to that data.

Wood: It feels like it’s a thing that privacy researchers have warned about for a long time, that there is essentially a big web of surveillance and we’re leaving tracks all the time. And that it isn’t always obvious what the harm might be, until something like this happens.

Browne: Exactly. One of the places that I look to see what’s the future, perhaps, or the future that’s already here, is airport security. There has been a lot of push in terms of AI-enabled technologies to assess risk, to assess threat. And one of the things that a few companies are starting to develop now is emotion recognition. A traveler might present themselves at an airport, speak to an avatar — at one company, “avatar” actually stands for automated virtual agent for truth assessments — and this avatar will then ask them a series of questions. And then measurements are taken of the changes in their voice, of heat or sweat or any type of what might be termed a micro-expression of guilt, like your heart rate increasing, those types of things. And then [it would] assign a certain threat category to see if that person might be a threat to airport security. I don’t necessarily know if these types of technologies are being used to monitor protest activists and other moments of rebellion, but it is something to look out for.

Wood: (Pauses) That pause was, yeah, that’s terrifying. Is there ear recognition? Is that a thing?

Browne: Yes. There’s recognition of everything. And it’s almost like throwing something at the wall and seeing what sticks. The ear is a relatively stable part of the body. And that has been known since Alphonse Bertillon, who is said to be the father of forensic science. [He] was using that in the 1800s as a stable way of recognizing or identifying the human body, to catalog it. And so there are researchers that are working on every part and piece of the body that you could imagine as a way to try and shore up this idea, which is just an idea, that the human body is stable, that the human can be categorized and identified. And we know that bodies don’t work that way. But the science does [believe that].

Wood: Your book is called “Dark Matters: On the Surveillance of Blackness.” We’ve been talking about this in the context of protest. Why is this surveillance of particular concern to Black Americans?

Browne: I say surveillance is the fact of anti-Blackness, not only in the U.S., but globally. And so, it’s been a concern here in the U.S. for centuries. We’re thinking about slave patrols, plantation control, all of these technologies that were put in place to deem Black people as outside of the right to have rights. But it’s also why I think it’s important to study the history of surveillance, within transatlantic slavery, within plantation slavery, because it also offers us moments of resistance and moments of rebellion and escape to something different, something that looks like freedom.

Related links: More insight from Molly Wood

Portland, the site of months of protests, last week passed the strictest law against facial recognition tech in the country. The ban says city agencies cannot use the technology in public places, and neither can private companies or governments.

A story about it in Popular Mechanics notes that the ban means police can’t use any form of facial recognition software on surveillance video they collect, but that the ban would also apply at, say, the Portland airport, where Delta is using facial recognition technology to check passengers in for flights. It would additionally apply to a chain of convenience stores in Portland that, apparently, was using video surveillance and facial recognition to determine whether someone could enter a store, remotely locking and unlocking doors based on people’s faces.

The Portland ban would not apply to personal use, stories were quick to point out, so you can still use your face to unlock your iPhone. Assuming you still feel comfortable with that, I guess.

Now, on to another topic. We are working on a special show at “Marketplace Tech” about adapting to and surviving climate change and extreme weather. And we want to hear from you: What, if anything, have you done to adapt to climate change in your life? Did you install air conditioning? Maybe get backup power for your house? Go gray water? Did you get solar or put in new windows or buy an air purifier? Did you move?

Or do you feel stuck because you don’t have the money to do any of those things?

Please email us: mptech@marketplace.org. Send us voice memos if you can or just tell us your stories.

And if you’re in the smoke stream, a fire zone or a hurricane path, or hopefully none of those, I hope you have a safe and healthy weekend. 


The team

Molly Wood, Host
Michael Lipkin, Senior Producer
Stephanie Hughes, Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer