Marketplace Tech Blogs

More extremists are getting radicalized online. Whose responsibility is that?

Molly Wood Mar 19, 2019

Bellingcat, a research and investigative journalism site, is collecting and archiving social posts that show videos and images of rioters who attacked the U.S. Capitol last week. Kirill Kudryavtsev/AFP/Getty Images

As part of a series on extremism on the internet, Marketplace’s Molly Wood explores how people get radicalized online. Read about how internet “echo chambers” can lead to faster radicalization and why it really is not that hard to block the content.


The man accused of killing at least 50 Muslim worshippers in Christchurch, New Zealand, last week seems to be, in many ways, the ultimate example of online radicalization. Researchers and social media experts have warned for years that there is a playbook for turning trolls into terrorists. Host Molly Wood talked with Becca Lewis, a research affiliate at the nonprofit institute Data & Society, who studies extremism online. She said part of the process of drawing people in is to present extremist content as just a joke. The following is an edited transcript of their conversation.

Becca Lewis: Far-right communities online have, for several years now, been using humor specifically as a recruitment tool for young, disillusioned people and particularly young, disillusioned men. In fact, the style guide of a really popular neo-Nazi website was leaked a couple of years ago, and its founder explicitly wrote that humor is a recruitment tool because it gives far-right extremists plausible deniability about their extremist beliefs.

Molly Wood: It must make it harder for platforms to scan for this content. It’s one thing if it’s very obvious terrorist content, but this is vague.

Lewis: Far-right extremists have become really savvy at knowing exactly how to stay within the lines. They’re really good at masking things, using dog whistles, phrasing things in ways that obscure the violence behind their ideology.

Wood: Can you give us an example of what those sorts of dodges might look like?

Lewis: Yeah, there are a couple that were really popular during the lead-up to the 2016 U.S. presidential election. One of them was the OK hand symbol. They were trying to troll journalists into saying that it was a white nationalist hand symbol. The idea was that it wasn’t actually a white nationalist hand symbol, but it allowed them to have it both ways: they had plausible deniability but were still drawing people into the fold through it.

Wood: There really is a playbook for identifying people in these communities and picking them out and radicalizing them over time?

Lewis: That’s absolutely right. A lot of these groups will end up targeting young, disillusioned men who feel like they have been left behind by the system. These groups can end up providing them a community of people who feel the way they do. But the problem is that they can then start to feed them explanations for who’s to blame for their situation. Then they may be more likely to start placing the blame on other groups, like immigrants from predominantly Islamic countries. Providing community can really quickly shift into more radicalized discourse.

Wood: What do we do? How do businesses and society combat this?

Lewis: I don’t pretend to have all of the answers, but I think the first step is for the platforms to treat this as the serious issue that it is. There was a time when ISIS videos and ISIS content and propaganda were proliferating on all of these platforms. They actually have been quite successful at tamping down on that content and making it far less accessible and far less of a problem. Platforms really need to treat white supremacist content as a priority and devote resources to handling and grappling with it.

Wood: You use the example of ISIS content, which seems to have been pretty effectively combated without damaging the business model of these companies. Should we buy the argument that the business model will inevitably lead to this type of content no matter what?

Lewis: I think that you start to get really quickly into political terrain here, but these companies are American companies. I think that Islamophobia is, unfortunately, really widespread in our country right now. I do think that this content has more of an economic impact than ISIS content and ISIS videos, which were, necessarily, a bit more fringe than the content you see right now making its way around YouTube, which could potentially start people down these rabbit holes. I would argue that these companies have come into this with blanket economic incentives and haven’t considered the unintended consequences in other spaces. The attention economy can be really great. When you’re driving content based on clicks and views, it can lead people to a lot of engaging, interesting content. But I also don’t think it’s particularly conducive to messy, longer-term democratic engagement. It’s really designed to get people to have an emotional reaction to a story and click on it. I don’t know how we grapple with that because, for a lot of these platforms, that’s the business model they’re based on. I think that it’s time to have some really tough conversations about the unintended consequences of that business model for our democracy and for populations that are experiencing hate crimes, and to start thinking about how to temper some of the negative consequences.


Related links: more insight from Molly Wood

Marketplace Tech will take an in-depth look at the business questions around radicalizing online content for the rest of the week. We know that social media didn’t create terrorism or white supremacy or extremist ideologies. But we also know that these companies can profit off this type of content. We know that their algorithms tend to prioritize increasingly extreme videos and posts, whether it’s dumb stunts on YouTube, conspiracy theories or divisive propaganda that can end up reaching billions of people.

Our guest, Becca Lewis, wrote an op-ed last fall about how YouTube, in particular, tends to quickly lead watchers to extremist content with its recommended videos. She wrote that this has actually become part of the influencer economy. People with big followings find increasingly engaged audiences by broadcasting more extreme content or collaborating with more extreme creators, until their viewers end up in what Data & Society calls an “alternative influence network.” But the videos generate views, which generate ad dollars for all involved. Researchers say YouTube has been very slow to act.

The Washington Post had a story yesterday, from YouTube’s perspective, about how its team was trying to find and remove tens of thousands of videos of the shooting in Christchurch. At one point, YouTube said, a copy of the video was being uploaded every second. Facebook said it removed or blocked 1.5 million videos of the violence as well.
