Law enforcement officials in the United Kingdom, Australia and the United States, including U.S. Attorney General William Barr, wrote an open letter to Facebook last week, asking it to hold off on plans to expand end-to-end encryption in Facebook Messenger. That kicked off a heated debate about privacy and public safety.
Organizations like the ACLU, Human Rights Watch and the Freedom of the Press Foundation responded with a letter of their own, urging Facebook to move ahead, arguing encryption boosts public safety and democratic values.
I spoke with Matthew Green, an associate professor of computer science who teaches cryptography at the Johns Hopkins Information Security Institute. He said this tug-of-war between privacy and public safety dates back decades but is newly urgent. The following is an edited transcript of our conversation.
Matthew Green: Up until just a few years ago, we didn’t even think about this problem, because almost nobody was encrypting anything. Very recently, services like WhatsApp and Apple’s iMessage have started using encryption by default. Suddenly, law enforcement has realized that this is potentially a threat to their ability to wiretap and access data using the tools they’ve become accustomed to. There’s a request, at least from the law enforcement community, to do something about this — to weaken or limit encryption to some extent.
Tracey Samuelson: Is there a way that information could be both encrypted and private while still being accessible to law enforcement for public safety purposes when needed?
Green: That’s kind of the holy grail. It’s like asking for a cake that tastes delicious and is wonderful but won’t make you gain weight. Everybody wants to build that thing. Nobody is against it. It’s just that every time we’ve tried it — and people have been trying since the ’90s — it hasn’t worked. It’s had some fatal flaw. The result is that everybody is very nervous about this. The government won’t propose anything. They want industry to do it. And industry doesn’t really know how to do it, so we’re at an impasse as a result.
Samuelson: Where do you think this is going, or how do you think that this might play out, this question of government access versus encryption?
Green: This is a debate that’s been going on since the ’90s. It’s been a repeated question that comes up. The government says, “We want to have access to this encryption.” So far, the general decision the public has made is the answer is no. We have law enforcement; we have traditional ways to do law enforcement. We’ll take our chances. What’s different here is that now there are a lot more people using encryption than we’ve ever had before. This whole thing is coming to a head. So what’s the future? I don’t know. We could end up in a world where this kind of encryption is illegal, and it really depends on what happens in Washington, D.C.
Related links: More insight from Tracey Samuelson
How do we know what we do know about the U.S. government’s ability to surveil the online world? Well, a lot of it comes from documents Edward Snowden leaked more than six years ago. Snowden is out with a new memoir, and Matt Green recently wrote a blog post about the legacy of the Snowden leaks.
Also, the Washington Post has an interesting piece examining whether child exploitation is a valid argument in opposing strong encryption.