Social media has been radicalizing people for years
Jan 18, 2021

After the violent attack on the U.S. Capitol, some are still surprised that online speech has real-life consequences.

Back in March 2019, a gunman killed 51 people at two mosques in Christchurch, New Zealand, and livestreamed the attack on Facebook. After that event, I spent a week looking at how social media radicalizes people to violence and how a troll becomes a terrorist. Now, nearly two years later and after a violent attack on the U.S. Capitol, there still seems to be some surprise that online speech leads to offline consequences. So I wanted to revisit some of what I heard that week.

Becca Lewis studies online extremism at Stanford University. She said part of the process of drawing people in is to frame extremist content as just a joke.

“Far-right communities online have, for several years now, been using humor specifically as a recruitment tool. In fact, a really popular neo-Nazi website, a couple of years ago, its style guide got leaked. The founder of that website explicitly wrote that humor is a recruitment tool because it gives the far-right extremists plausible deniability about their extremist beliefs,” Lewis said.

For example, one of the people arrested after the attack on the U.S. Capitol argued in a hearing last week that he was joking when he posted online that he wanted to shoot House Speaker Nancy Pelosi. Such “jokes” can be just a first step.

Back in 2005, Fathali Moghaddam, a psychology professor at Georgetown University, published a paper called “The Staircase to Terrorism.” It explored why, out of the millions of people who are angry, only a few end up committing violent acts. He said the internet can make climbing that staircase happen a lot faster.

“The radicalization takes place in relation to other groups. It’s what I call mutual radicalization. Gradually, they get to a stage where one or two of them are ready for actual violent behavior,” Moghaddam said.

That is why, for example, the crowd at the U.S. Capitol included both QAnon true believers and people giving speeches promoting misinformation about the coronavirus. Since the attack, we’ve seen social media companies move to block content, accounts and apps, after spending years arguing that such enforcement was too hard because the platforms were so big.

Dipayan Ghosh used to work on global privacy and public policy issues at Facebook. Now, he’s a researcher at the Harvard Kennedy School. He said it’s obviously doable.

“When they do decide that they want to throw their money behind some commercial effort, let’s say, for example, the development of [artificial intelligence] to better target ads, they’re throwing dollars behind that. I would argue that they should be doing this in exactly the same capacity, at exactly the same level,” Ghosh said.

In the last couple of weeks, for the first time, we did see something closer to that level of effort: accounts taken down, hashtags blocked and groups shut down, although not as aggressively as some would prefer. And now, I would argue, we’re about to see whether those changes are temporary or whether real change is in the works.


The team

Molly Wood, Host
Michael Lipkin, Senior Producer
Stephanie Hughes, Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer