When online forums become terrorist networks, how do we deal with them?
Aug 6, 2019

Cybersecurity expert Mark Rasch says law enforcement is still working on how to treat domestic terrorism online.

The online forum 8chan was mostly offline yesterday. It was booted from several big web platforms after the weekend’s mass shooting in El Paso, Texas. The Walmart rampage was at least the third mass shooting this year to be announced in advance on 8chan, which has become a haven for far-right extremists. 

The site’s own founder has said it should be shut down because it contributes to radicalizing killers. The question for law enforcement is: Should forums like this be treated as terrorist networks? Host Molly Wood spoke with Mark Rasch, a cybersecurity expert who monitors extremist threats online for law enforcement and private companies. He said law enforcement is still working out how to treat domestic terrorism online. The following is an edited transcript of their conversation.

Mark Rasch: One of the things we’ve done in the international terrorism arena is make it a crime to provide material support to terrorist organizations. There have been a number of cases, some successful, some less so, involving going after websites that encourage jihadi activity as providing material support for terrorism. One of the things we can do is look at how much support from platforms or websites is too much and crosses into providing material support for terrorism. We rarely do that in the area of domestic terrorism, and I think that’s something worth looking into.

Molly Wood: Do you see that conversation evolving in other ways, this idea that law enforcement should think about these chat networks as terrorist networks even when domestic terrorism is involved?

Rasch: One of the problems with domestic terrorism in these internet forums is that, on the one hand, you want to keep them up and running so you can collect intelligence: you can see what these people are saying, see what thoughts are radicalizing them, and monitor who is on the sites to get an idea of what they are doing and thinking. On the other hand, you also want to shut them down to break up this echo chamber where they reinforce and radicalize each other. You have to strike the appropriate balance between the two. You also have to decide at what point the speech stops being First Amendment-protected activity and starts being incitement to violence. And that’s a very difficult line for law enforcement and for courts to take a position on.

Wood: If there were all the money and all the time in the world, would it be possible to identify these threats before they happen?

Rasch: It is possible to identify some of the threats. You can identify extremists, and you can identify people who are exchanging conspiracy theories. But to determine when somebody is likely to become violent, and when they are likely to become aggressive, is really very difficult. That’s almost a metaphysical question, and it’s a psychological question. There are computer algorithms and artificial intelligence programs that are supposed to help, but I would be very loath to start using those right now without knowing more about how they work.

Related links: more insight from Molly Wood

That FBI memo that Yahoo News published said it was the very first memo to look at this question, which, I want to reiterate, feels a little late. The document said the speed and reach of social media, and the deep political divisions in the country, were likely to lead to more conspiracy theory-driven violence. In a statement to Yahoo News, the FBI said it can’t monitor websites or social media without probable cause or start an investigation based solely on First Amendment-protected speech. The memo didn’t say anything about white nationalism or white supremacy, but the Yahoo News story notes that during a Senate Judiciary Committee hearing last week, Democrats criticized the FBI for not putting enough attention on white supremacist violence as a cause of domestic terrorism.

Foreign Policy has a story from May about what Mark Rasch was talking about — how Western governments have what the site calls a double standard when it comes to dealing with global, and mostly jihadi, terrorism, compared to right-wing extremism, white supremacy and white nationalism.

One example of the tension is in a Vice article from last April about how Twitter employees are asking why the platform can ban ISIS accounts but not white supremacist material.

In fact, yesterday the hashtag #untwitter8chan was trending, apparently because while the site was down, its users were turning to Twitter, where 8chan has a verified account, to get updates.

Here’s a series we did after the shootings in Christchurch, New Zealand, on how a troll becomes a terrorist and how online radicalization happens.

