Are pro-Trump extremists’ messages more dangerous if they’re encrypted?
Jan 20, 2021

They aren't as easily shared, but it can be harder for law enforcement to see what extremists might be planning.

The app Parler is offline, and Facebook and Twitter are tamping down extremist speech on their platforms. More people are migrating to apps like Signal, which encrypts messages between parties by default, and Telegram, which can encrypt them in opt-in “secret chats.” That blunts the power of extremist messages, but it also makes it harder for law enforcement to see what extremists might be up to, reigniting a yearslong debate about encryption itself.

I spoke with Alexandra Givens, the president and CEO of the Center for Democracy & Technology. I asked her about the trade-offs. The following is an edited version of our conversation.

Alexandra Givens (Photo courtesy of Georgetown University)

Alexandra Givens: In a way, the relocation of these conversations to smaller, private group settings is actually exactly what the major platforms were hoping for. It removes the conversation from the public square, where new people can be recruited and radicalized. Of course, on the other hand, moving to smaller, private settings makes it much harder for disinformation researchers and law enforcement to understand what’s going on and to respond to it. And that’s a really challenging part of the problem here.

Molly Wood: And this has been a complaint of law enforcement for a long time, right, that these platforms reduce visibility and, of course, the counterpoint is yes, but they increase privacy?

Givens: Yeah, so it’s a really active debate. When we look at this, we really do see it as a dangerous trade-off. Journalists, human rights activists around the world, members of Congress, doctors engaging in telemedicine, they all rely on encryption to protect their communications, and it’s a tool that’s used around the world in really important ways. And so one of the trade-offs, or the balancing, that we have to think about is the impact potentially on law enforcement being able to go after the worst of the worst of these activities, versus the huge benefits of that technology to so many other forms of expression and communication around the world.

Wood: One could argue that we have just lived through, and are still living through, a five-year rolling experiment in disinformation and amplification. And I wonder, do we have a sense now of what happens when, for example, these conversations have to go small and go private? Is there an impact when lies aren’t amplified?

Givens: When conversations are happening in these smaller, private messaging apps, it does help in terms of reducing the reach for this type of content. Those platforms are more private, they’re quieter, and the content doesn’t circulate in the general traffic in a way that lets new people be recruited — so that’s one point. The other is that there are ways, even with encrypted messaging, that law enforcement and others, including the platforms themselves, can help respond to the worst abuses. So first of all, groups and users can still be reported. If information is discovered, whether through an individual reporting, or journalists or law enforcement or researchers who go undercover in these groups, platforms can respond and take them down for violating their terms of service.

Wood: Right, [so] you still have to do police work.

Givens: Yeah, exactly. So the police work has to continue and can continue. The second is that some platforms — and WhatsApp has done this recently — can restrict things like message forwarding within these private apps to help reduce virality. WhatsApp announced this last fall as one of their efforts to respond to misinformation. They said that any message that had been forwarded five or more times now has a new limit, so that [it] could only be forwarded to a single chat at a time as opposed to large groups. So there are interventions the platforms can make, while still respecting the privacy and security of the messages, to mitigate the worst potential abuses.
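To make the forwarding limit concrete, here is a minimal sketch in Python of the rule Givens describes: a message that has already been forwarded five or more times can only go to one chat at a time. The class and function names, and the cap of five chats for ordinary messages, are illustrative assumptions, not WhatsApp’s actual implementation.

```python
# Hypothetical sketch of a WhatsApp-style forwarding limit.
# The "five or more forwards" threshold comes from the interview;
# everything else is an invented illustration.

FREQUENTLY_FORWARDED_THRESHOLD = 5

class Message:
    def __init__(self, text, forward_count=0):
        self.text = text
        self.forward_count = forward_count  # times this message has been forwarded

def max_recipients(message):
    """A frequently forwarded message may go to only one chat at a time."""
    if message.forward_count >= FREQUENTLY_FORWARDED_THRESHOLD:
        return 1
    return 5  # assumed cap for ordinary messages

def forward(message, chats):
    """Forward to at most the allowed number of chats; returns the chats reached."""
    allowed = chats[:max_recipients(message)]
    for chat in allowed:
        # Each copy carries an incremented forward count.
        chat.append(Message(message.text, message.forward_count + 1))
    return allowed
```

The point of the design is that virality is throttled without the service ever reading the message content — the forward count travels with the message, so the limit works even when the text itself is encrypted.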

Wood: Are you aware of any ongoing efforts or existing legislation to try to require companies to provide access to encrypted messaging?

Givens: There is an increasing amount of pressure. For example, a bill introduced last year, and likely to be reintroduced in the new Congress, called the EARN IT Act, tries to target child sexual abuse material online — obviously, just a horrendous problem. One of the challenges, though, is that the legislation, in its original formulation, tried to get at this by holding platforms potentially liable if they should have known about child sexual abuse material on their platforms and didn’t, because they offer encrypted messaging. That’s really problematic because it’s going to disincentivize communications platforms from offering that type of secure and private messaging. So one of the things we’ve focused on is saying, “Yes, child sexual abuse material is a horrible problem. Let’s focus on resources for law enforcement,” as opposed to having, again, these really tough trade-offs that end up jeopardizing free speech and communication in such important ways.

Related links: More insight from Molly Wood

Some security analysts say it’s going to be hard for the FBI to ask for encryption backdoors when they missed an assault on the Capitol that was planned right out in the open.

I should point out here that another reason for this mass migration to Signal and Telegram is the big drama around WhatsApp, which does offer end-to-end encrypted messaging but is, of course, owned by Facebook. WhatsApp recently announced a change to its terms of service that would stop letting users opt out of sharing data from WhatsApp with parent company Facebook. That doesn’t include your chats, which are encrypted. But as we now know from Apple’s new privacy nutrition labels, there is a ton of metadata about you that can be shared, like what device you use, your location, financial transactions and maybe contacts. And Facebook basically said people would have to agree to the new terms or have their accounts canceled by Feb. 8. Facebook has since delayed this policy change because everyone freaked out, including Elon Musk, who told everyone to go to Signal. Dude, everyone did. Signal wasn’t working very well for a little while, so thanks a lot, Elon. Some of us had already been there.

But anyway, the whole saga, in case you missed it, has led to Facebook battling what was essentially a big misinformation campaign about WhatsApp, because a lot of people thought it was going to share their private chats with Facebook. And anyway, we all agree the irony is crushing, but here’s another crushing irony: People have swarmed to Signal and Telegram, believing they both offer the same level of encryption when Telegram isn’t actually end-to-end encrypted by default, and you can only turn it on for personal chats, not groups. But I guess that’s good news for the law enforcement folks, who are worried about not being able to see what potential domestic terrorists are up to, as long as they don’t figure out that Telegram isn’t everything they thought it was.


The team

Molly Wood Host
Michael Lipkin Senior Producer
Stephanie Hughes Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer