As part of a series on extremism on the internet, Marketplace’s Molly Wood explores how people get radicalized online. Read about what responsibility social media companies have (if any) to curb this type of activity, and why blocking the content really isn’t that hard.
This week Marketplace Tech is looking at how an online troll becomes a terrorist, how people are recruited and radicalized online through social media and how companies can deal with it. Host Molly Wood talked with Fathali Moghaddam, a professor of psychology at Georgetown University. In 2005, he published a paper called “The Staircase to Terrorism.” It was an exploration of how, out of millions of disgruntled people in the world, a very few rise up this metaphorical staircase and commit violent acts in the real world. He says radicalization isn’t new, but the internet can make it faster and easier. The following is an edited transcript of their conversation.
Molly Wood: What are the things that move people, ultimately, to violence?
Fathali Moghaddam: It’s a slow process sometimes, but it can also be rapid. The key issues are that individuals feel that they are being mistreated, that there is injustice in the world. Particularly nowadays, through the internet, an echo chamber has developed and isolation takes place, so this group radicalizes. The radicalization takes place in relation to other groups. It’s what I call mutual radicalization. Gradually they get to a stage where one or two of them are ready for actual violent behavior.
Wood: It sounds like what you’re saying is that, not only is the internet particularly ripe for this type of behavior, but that, in fact, as online communities become radicalized, they radicalize each other. They up the ante.
Moghaddam: Absolutely. This is becoming the norm. We see this in politics. We see it in extremist ideology. There is a relationship between these radicalizing movements and the internet. This is taking place in the wider context of globalization.
Wood: You’ve said that the latter stages of this radicalization staircase are somewhat rare, but do you see this process becoming faster or more common?
Moghaddam: Yes, it’s definitely becoming more common in the sense that, with the mutual radicalization taking place, the probability of more individuals moving up to that final level of the staircase increases. This is very much related to the type of migration taking place around the world. The kind of leadership coming through now is quite different from what was taking place 10 years ago.
Wood: How can individual companies and platforms combat this radicalization process on their networks?
Moghaddam: The large media corporations have a duty to take on this challenge. They have the tools available. They have the technical means to do this. They have to get more serious about controlling hate speech. They have to get more serious about controlling echo chambers and influencing echo chambers.
Related links: more insight from Molly Wood
Our guest Fathali Moghaddam also has a new book coming out called “Threat to Democracy: The Appeal of Authoritarianism in an Age of Uncertainty.”
CNN has an interesting piece this week about online radicalization and what big tech platforms can do about it. They spoke with a terrorism expert who says simply taking down extremist content just shifts the content and the audiences to smaller platforms. One example is Gab, which is apparently a favorite of neo-Nazi groups, anti-Semites, racists and conspiracy theorists. The expert said algorithms shouldn’t amplify extreme content to begin with. But if platforms know it’s there, they could redirect people to forums with different viewpoints or even therapy resources.
In the United Kingdom there’s a startup called Moonshot that is trying to intercept extremist content with what they call “offramping” campaigns. These campaigns include sending messages that direct people to more positive content and even intervening online to connect potential extremists with offline help.