3 years after Europe’s GDPR, what’s changed in tech privacy?
Jun 2, 2021

Fines on Big Tech companies are not the teeth of the landmark data-protection law.

It’s been three years since the European Union’s General Data Protection Regulation, or GDPR, took effect. At its core, the law was meant to give consumers more control over how companies collect, share and use their personal data. It was the first major privacy law with real teeth in the form of potentially large fines for companies that didn’t comply.

But that didn’t really happen until recently. I spoke with Jessica Lee, who advises companies on privacy as a partner with the law firm Loeb & Loeb. She said consumer advocates tracking enforcement have been somewhat disappointed. The following is an edited transcript of our conversation.

Jessica Lee (Photo courtesy Loeb & Loeb)

Jessica Lee: We’re starting to see fines escalate now, but it’s three years later. And I don’t really think the fines are where the teeth of the GDPR come from.

Amy Scott: Where is it, then?

Lee: If you look around now, I would say that Google and Apple are probably two of the largest privacy enforcers, because they’re the biggest players, they’re the largest fish. They’re getting the most pressure from a number of fronts, privacy being one of them. And the changes to some of their platform terms and policies, I think, will push the needle forward on privacy a lot faster, actually, than some of the regulations have. But that starts with a law like the GDPR and ends with Apple requiring opt-in to tracking or Google sunsetting third-party cookies.

Scott: One thing people have probably noticed is that when they visit websites, they often see a banner telling them that the site is tracking them and giving them some choices about what kinds of cookies they will allow. Are people actually clicking through?

Lee: [Laughs] I really don’t think so. The cookie banner is my least favorite aspect of privacy regulation. So first, no one knows what a cookie is. You might have some general concept, but the average consumer doesn’t know. It requires a deeper understanding, I think, of how the internet works than you can really grasp in the 30 seconds you’re probably willing to spend on whether or not you click OK or accept the cookie banner. And then, even beyond that, in most cases, you have to click into something and then make some choices. Sometimes it’s selecting which partners you can share data with, for example. And that level of granularity, again, I just don’t think is very consumer-friendly. I think it leads to decision fatigue. And having to make that on a website-by-website basis without having a true understanding of the decision you’re making, I think is not the best outcome for anyone, really.

Scott: What would you advise a person who just wants to read the story or buy the thing or whatever? Would you say [to] select just the necessary cookies? Is that kind of an OK default?

Lee: I would say you can select just the necessary cookies. I mean, you can also think about what the site is and do you have a relationship? And certain sites need to sell advertising in order to make their content free, so if you want to be able to continue to access the site. And in some cases, we’re seeing paywalls start to go up. So if you don’t allow cookies, there are sometimes consequences to that. And I don’t mean consequences in a negative way, necessarily. But if they’re not able to monetize through advertising, you might be asked to pay for access to a website. And so I think in that case, that might shift the way you make a decision. So necessary is always, I think, acceptable because those are cookies that make the website function. And beyond that, think about what relationship you have to that site, and do you trust the company behind it?

Scott: How would you say the global conversation about privacy, and attitudes toward privacy, have shifted?

Lee: Well, the EU looks at privacy as a fundamental human right, kind of how we look at the First Amendment. And I think that globally, we’re starting to have more conversations around privacy as a human right inherent to the individual as opposed to a consumer-protection issue, which is how we typically looked at it in the U.S. previously.

Scott: That’s really interesting. What are the implications of that?

Lee: Well, I think we’re starting to see laws roll out. And beyond just what goes on the books, it’s really our culture, our sentiment toward privacy: that these are rights inherent to individuals. So it’s not just some check-the-box thing that you have to do, where you put it in the privacy policy and hope no one sees it, but rather that consumers should have more control over their data. And so whether it’s something like the data dividend that was proposed by [former presidential candidate] Andrew Yang and some others in California, or it’s just enhancing additional privacy rights, there’s a greater awareness of privacy and, I think, a focus on individuals owning their data and having control of their data in a way that we weren’t talking about before.

Scott: So to get to the bigger takeaway from these past three years, what lessons do you think have been learned that might inform future regulations?

Lee: I think one place where we’re going to see some attention being paid is how you enforce these laws. Once you put the laws on the books, what does enforcement look like? And what should the role of enforcement be in terms of driving changes in behavior? Are fines really the best avenue? Is it injunctive relief? Is it disgorgement? We’ve seen the [Federal Trade Commission] try that. It might be, in some cases, that having data you’ve collected taken away is a bigger burden than actually paying the fine, particularly for larger companies who build these fines into their budgets. And I think there’s also going to be attention paid to what’s really best for the consumer, and how you can help facilitate consumers’ ability to exercise the rights we’ve given them. Most privacy laws, GDPR included, have this full suite of rights, but it’s great to say you have a right — how do you actually help educate the consumer on what those rights mean, how to exercise them and when to exercise them? So those are two places where, at least as I envision it, regulators will spend time trying to tighten up or button up or amend or enhance our existing laws.

Related links: More insight from Amy Scott

There have been some interesting developments recently in how and where companies are allowed to send the data they collect. Last month, Ireland’s high court upheld a preliminary decision that could force Facebook to stop sending information about its users in Europe to its servers in the United States. The Wall Street Journal reported on the decision, saying it could have implications for other large tech companies that send data across the Atlantic, like cloud services and email providers, with potentially billions of dollars at stake. Many tech companies have their European headquarters in Ireland, so that country’s Data Protection Commission has a lot of sway.

Meanwhile, our cars are collecting and storing more and more data as we drive them. And China is cracking down on how automakers store that information. Reuters reports that some Chinese government workers were recently advised not to park their Teslas inside government compounds because of concerns about what the vehicles’ cameras might record. A draft rule from the Cyberspace Administration of China would require automakers to get customers’ permission before collecting driving data and to store that data locally. Last week, Tesla agreed to store all of its data from its Chinese cars in Chinese data centers. China is the biggest car market in the world.

The team

Molly Wood Host
Michael Lipkin Senior Producer
Stephanie Hughes Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer