Law enforcement officials generally aren’t fans of what’s called end-to-end encryption — messages that can only be read by the sender and the recipient. They call it “going dark” and argue that encrypted communications make it harder to investigate or uncover crimes.
The Trump administration has held meetings about a proposed ban on end-to-end encryption, and last week Attorney General William Barr said companies should have to give “lawful access” to encrypted communications — basically a back door. But security experts, and even other government officials, said that’s not a good idea.
Host Molly Wood spoke with Moxie Marlinspike, founder and CEO of the private chat app Signal Messenger, about what a ban on encryption — or giving law enforcement a back door to messages — might mean. The following is an edited transcript of their conversation.
Moxie Marlinspike: What it would mean is that people’s personal data and communications would be insecure — vulnerable to hackers or to data leaks and large-scale compromises. I think even within the United States government that’s well understood, which is why it’s confusing that the Trump administration is murmuring about this. The only thing I can think of is that Trump and the Trump administration have a certain amount of insecurity about Silicon Valley and tech company CEOs.
Molly Wood: Do you think that’s really it? To play devil’s advocate, it seems like there are at least some legitimate law enforcement arguments that it’s a lot easier to catch bad guys if you can see what they’re saying.
Marlinspike: The thing that’s confusing about that is that technologies like Signal did not invent cryptography. We’re just making it accessible. Cryptography — that genie is out of the bottle. People engaged in high-risk criminal activity are always going to be able to avail themselves of cryptography if they want to. Signal is just making it available to people like you and me.
Wood: You sound like you don’t think that there is a serious threat, at least from the United States government, in terms of attempting to force loopholes or weaken end-to-end encryption.
Marlinspike: I am not a policy expert, but at this point it seems clear that public sentiment is in favor of more privacy and security. It’s also very clear that national security and the United States government itself benefit from these technologies. So it would make sense to take a step back at this point.
Wood: So if you had to characterize the split in the argument: On the one hand, you may have this law enforcement argument that says, “These apps provide a place for criminals to hide.” On the other hand, you might have some of the same law enforcement people saying, “It also protects our secrets.” Just to put a fine point on it.
Marlinspike: I think it’s possible that there are elements within law enforcement that are just less focused on protecting national security or protecting our own secrets or protecting industrial secrets. They’re not looking at the bigger picture.
Wood: Are the policies in Australia a problem for you? They’ve taken steps to ban or weaken end-to-end encryption. What does that mean for how Signal functions there?
Marlinspike: It doesn’t mean anything. I think that legislation was also very confusing to people at the time. So far, it seems to have just weakened Australia. There are startups and tech companies who are now reconsidering having offices there. Fewer people are probably being hired as a result. It doesn’t seem to have resulted in any clear benefit from a law enforcement perspective.
Wood: How much of this conversation, ultimately, is going to come down to trust and who people want to associate with? It’s one thing for Facebook-owned WhatsApp to say that it’s end-to-end encrypted or even Apple to tell me that iMessage is. But if I’m a consumer who doesn’t know that much about how this technology works, does it come down to who I decide I’m going to believe?
Marlinspike: The fundamental idea behind cryptography and these types of technologies like end-to-end encryption is that you don’t have to trust anyone. The idea is to try and remove trust in people or an organization. If something is end-to-end encrypted, you don’t have to trust the people who are hosting that data in order to do the right thing with it, because the way the technology is designed, there’s literally nothing they could do. It’s just opaque data that tells them nothing. What’s complicated is that it’s difficult for people to conceptualize that even if experts in the field tell them that that’s the case. And there is also a popular media perception of technology and cryptography that anything can be cracked or whatever. It’s like, “You just get a smart person in the room, and they type on a keyboard, and the information is available.”
Wood: I mean, they must type fast, to be fair. Otherwise it won’t work.
Marlinspike: There are a lot of questions around public perception and trying to communicate exactly what’s going on to users. But on the other hand, our objective as an organization is just to make private communication accessible and ubiquitous. From our perspective, our largest goal is to have as many people using this technology as possible, even if they don’t know that it’s there. At the end of the day, that’s what’s going to protect people’s data.
Related links: more insight from Molly Wood
Politico reported in June that any legislation banning encryption was unlikely to pass in Congress. But the issue isn’t going dark anytime soon. The story said the FBI and the Department of Justice were in favor of making companies create back doors in their encryption, while the Commerce and State departments worried that such moves could compromise the government’s own security, since government agencies also rely on encrypted communications.
They also feared it could hurt diplomatic and economic relations. As another story points out, a back door in encryption would basically let the U.S. government do what it’s accusing Huawei and the Chinese government of doing — letting the government spy on the traffic routed through a private company.
In fact, efforts to ban encryption go back as far as the late 1990s. One bill that passed through a House committee but didn’t go to a full vote would have made it a federal crime to create or distribute encryption products that didn’t include a back door.
The going dark conversation went on through the George W. Bush administration and into the Obama administration, with the FBI drafting some back-door legislation again in 2012. The conversation peaked again back in 2016 after the mass shooting in San Bernardino, California, when police wanted Apple to help them open up the shooter’s iPhone by creating new software to break its encryption.
There’s no doubt that encryption can make it harder for law enforcement to do its job, but civil rights advocates said it’s not supposed to be easy — there are limits on law enforcement’s ability to spy on people for a reason. Technologists said the core problem is that once you create a key to a lock — in this case, the encryption is the lock — the government or police might promise they’re the only ones with the key. But because of hacking, spying or even blackmail, that’s just not the case. Once a key exists, the lock is basically no good.
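The lock-and-key argument can be made concrete with a toy sketch (a simple XOR one-time pad — purely illustrative, and not how Signal’s actual protocol works): a ciphertext cannot distinguish between the intended recipient and anyone else holding a copy of the key, so a stolen or leaked “back door” key opens the lock just as well as the legitimate one.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR cipher: the key must be as long as the message (one-time pad).
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # the "key to the lock"

ciphertext = xor_cipher(message, key)

# The intended recipient decrypts with the key...
assert xor_cipher(ciphertext, key) == message

# ...but so does anyone else who obtains a copy of it. A key exfiltrated
# through hacking, spying or blackmail works identically to the original.
stolen_key = key
assert xor_cipher(ciphertext, stolen_key) == message
```

The math offers no way to make a key that works only “for the good guys,” which is exactly the technologists’ objection: the security of the system reduces to the secrecy of the key, wherever copies of it live.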