What Facebook’s $550M facial recognition settlement might mean for users
About a decade ago, Facebook started automatically tagging people whose faces its algorithms recognized in our uploaded photos. It almost seemed like magic. This week, Facebook agreed to pay $550 million over claims that the tool violated privacy rights. The settlement was in Illinois, which has strict laws protecting biometric data.
The tech giant revealed the settlement agreement at the same time as its quarterly financial results this week, showing the company’s revenue was up 25% to $21 billion at the end of last year.
For today’s “Quality Assurance,” I spoke with Natasha Singer, a tech reporter for the New York Times, and she explained what this settlement means. The following is an edited transcript of our conversation.
Natasha Singer: Illinois has the toughest biometric privacy law in the United States — only three states have laws like it. Biometric privacy pertains to identifiers like your fingerprints or the facial scans used for facial recognition, and in Illinois, the law says companies can’t collect that information without getting written permission from someone. They also can’t collect the data without telling people how long they’re going to keep it. Facebook users in Illinois sued Facebook, saying it had collected their facial information without obtaining their affirmative opt-in permission and without telling them how long it would be kept. Facebook says these claims are not true, but said it settled the lawsuit because it was going to drag on and consume too many resources, and settling was better for everybody, including the stockholders.
Jack Stewart: Do you think the size of this Facebook settlement might serve as a warning to other companies considering this technology? It feels like maybe when Facebook introduced things like facial recognition, they were just looking at fun ways of using it, and maybe some of the other implications are now coming to light.
Singer: If you think about a face print, it’s the facial equivalent of your fingerprints. You might want to use your fingerprint in fun ways, but we know viscerally that a fingerprint has all kinds of implications, and it’s unique. Even the notion of Facebook turning on facial recognition by default for fun features is problematic — first of all in Illinois, but also to a lot of people who are concerned that facial recognition can be used to identify people without their knowledge, at a distance. It implicates your ability to be anonymous, both online and in public.
Stewart: Do you think there’s a chance we’ll see companies defaulting to not collecting any data like this at the risk of it somehow being regulated in the future?
Singer: Certainly, I think companies that have similar technology and that have Illinois users must be thinking about it. Think about doorbell cams, which are spreading everywhere. Your doorbell camera is going to take images of people, and it can use facial recognition to recognize, for example, your babysitter so that they can come in. In Illinois, you need consent for stuff like that. What does that mean for all kinds of devices that collect ambient biometric data? I don’t think we have the answer to that. I think that companies, because of the size of the Facebook settlement, are now going to have to think about what their “Illinois plan” is. I can’t imagine they want to have one set of software for Illinois and another for the rest of the country. First of all, from a business perspective, that’s just inefficient. But second of all, you don’t want people in other parts of the country saying, “Well, you turned that off for Illinois. Why haven’t you turned it off for us?” We’re seeing, for example, with the California [Consumer Privacy Act], that a number of companies — Microsoft, Apple — have said, “We’re going to honor the rights under the California privacy law for everybody in the United States.”
Stewart: It feels like when we talk about these privacy stories, we talk about the value of your privacy or the value of your data, and it’s always very hard to put a number on that. Do you think things like this Illinois settlement are starting to move in that direction and putting a dollar value on these things?
Singer: I think the notion that there’s a trade — that you’re trading your data for services, giving up some of your privacy — is the Silicon Valley framing that you’re getting value for data. But in a real trade, you know what you’re giving up and what you’re getting. When you give up your data, you have no idea how it’s going to be used. It can be used in perpetuity, for all kinds of purposes you never agreed to. I would argue that it’s not a fair trade. I would also argue that data is not property. In Europe, security and privacy are fundamental human rights. If we think of other fundamental rights, like the right of free speech, you wouldn’t say, “I’m trading my free speech for Facebook services.” This notion that we’re trading our right to privacy for services, I think, is just a problematic framing.
Related links: More insight from Jack Stewart
If you haven’t read it, set aside a few minutes to fully digest the New York Times story about Clearview AI, a little-known company that created an app for law enforcement to use facial recognition. As the article says, “without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year, according to the company.” And while Big Tech companies are at least facing some scrutiny for their facial recognition efforts, this company seems to have flown under the radar until now.
“So what,” you might think. “I’ve done nothing wrong. What does it matter if my face is recognized in public?” Privacy campaigners would argue that we’re not aware yet of what we’re giving up. The New York Times took the feeds from three public webcams in New York and photos from employer websites. It matched 2,750 faces and found a professor on the way to lunch with a job candidate.
As the Times says, “our experiment shows that a person equipped with just a few cameras and facial recognition technology can learn about people’s daily habits: when they arrive at the office each day, who they get coffee with, whether they left work early.” The professor’s reaction was “Oh, my, God. That’s unbelievable.”