Facebook and several other platforms have banned President Donald Trump indefinitely. Twitter banned Lin Wood, Trump’s conspiracy theory-spouting lawyer, but new conspiracies are spreading — like the claim that antifa was actually behind Wednesday’s deadly events at the U.S. Capitol. All of it is fueling the question of how to deal with hate speech and online radicalization.
It’s a topic for “Quality Assurance,” where I take a look at a big tech story. I spoke with Heidi Beirich, co-founder of the Global Project Against Hate and Extremism. She said historically in the U.S., hate speech has been treated like any other speech. The following is an edited transcript of our conversation.
Heidi Beirich: Well, there’s no question that in large parts of Europe hate speech is actually banned because there has been an understanding, especially, for example, in Germany, that hate speech will literally lead to incitement of violence. And in the case of Germany, genocide. In the United States, we’ve never had that kind of an understanding of how speech and violence are directly connected, especially when that speech is coming from political leaders. I just think that it’s a misconception about this problem that we have in the United States that we’re going to have to start confronting.
Molly Wood: We now have the events of this week behind us. We have a new Democratic president. Democrats now control both houses in Congress. Do you think we could see actual legislation here around hate speech and online speech?
Beirich: I think the changes are absolutely coming. I know that there are several people on the House side who are looking at holding hearings about disinformation, misinformation and hate speech. There’s a movement afoot to look at removing the protections of Section 230 [of the Communications Decency Act] to some extent, perhaps modifying that, so that certain kinds of speech can’t be hosted on the platforms. And frankly, President-elect Joe Biden has said, since the first campaign ad he put out, that we have to do something about white supremacy in this country. And if he decides to take that seriously, and perhaps appoint someone to coordinate that work, a big part of that is going to be about de-platforming, ridding the platforms of white supremacy so that people can’t get sucked into that orbit.
Wood: What is the potential downside? Because certainly the other side of moderating speech is censorship. And one of the concerns about changing [Section] 230 is that platforms will essentially overcensor so that they don’t run afoul of any new laws.
Beirich: I think that that’s a critically important issue. We do not want to reduce the ability of people to speak out in legitimate ways as a way to deal with hate speech. And it is very likely going to be the case that these companies will over-de-platform with these changes. So I think we have to move very, very slowly and deliberately if changes are going to be made. The problem is that the companies have shown themselves to be completely incapable of self-regulating on these issues. They have conflicting standards, content-moderation systems that don’t work, a lack of transparency. So somehow, we have to muddle through these issues without ending up with the kinds of scenes that we saw in Washington, D.C., and the levels of domestic terrorism that we’ve had in this country over the last few years from people who are radicalized online. But being careful and cautious is clearly important.
Wood: And how important is the business aspect of this competition, the fact that these companies are pretty much the only game in town?
Beirich: Well, I think that’s a big part of it. I mean, one of the arguments behind the current antitrust cases against Google and Facebook, and so on, is that if the companies didn’t dominate their markets, if they were broken up, shrunken somehow, that there would be other competitors that would arise who may have much better policies when it comes to this type of extremism. And we should let those competitors flourish. But the way it works right now is these social media companies, largely, have monopolies in their areas, not just in the United States, but around the world. So there’s no pressure on them to be better at watching what gets up on their platforms.
Wood: Do you think it’s allowed platforms to sort of get away with this slow refusal to self-moderate, that we also don’t take this stuff seriously as a society? I mean, the events of January 6 were planned online, there was merchandise, there was all kinds of discussion about it — it couldn’t have been less of a secret. And yet, it still seemed to come as a surprise. What does it take for people to believe that the stuff that people are saying online is the stuff that they’re also saying offline?
Beirich: I don’t think there’s been a serious enough reckoning of how many millions of people have been radicalized into ideas that are inherently violent, like calls for civil war, for example, for insurrection, for assassinating political figures. And I think some of this goes back to our ideas in America about the First Amendment, that good speech can crowd out bad speech and sticks and stones may break my bones, but words will never hurt me. It’s just false. And we have to learn better for it, because you’re right. Experts on extremism have been warning about the dangers of all of this for a long, long time. So we shouldn’t have been surprised. And yet, I think many Americans were.
Wood: How much do you think racism and sexism play into it? I mean, there’s this question of whether we should have addressed online speech in 2014 with Gamergate, but it was just dismissed as “women being harassed, which always happens.”
Beirich: Well, I think that’s a big part of it. Women being harassed, people of color being harassed? “Oh, well. That’s just part of the society.” The people who hold the levers of power weren’t the ones on the receiving end of that kind of behavior. And so they didn’t take it very seriously. I mean, almost all of the social media companies are run by white men with libertarian backgrounds and very little knowledge of history or social movements or civil rights. And they brought that mentality into those companies. And so the discussion about the issues we’re having here were not at the top of the list of things they should have been thinking about when they were creating these platforms.
Related links: More insight from Molly Wood
Let’s be honest, we know that there is a massive disconnect at this point between people who spend a lot of time online and people who don’t. There are two realities happening here, and it is getting weirder and weirder to try to straddle them. I mean, you have people, a lot of them, who believe in QAnon conspiracy theories, or that the U.S. election was rigged, or that the coronavirus is a government plot. They’re even now being told, and unquestioningly agreeing, that the people who stormed the U.S. Capitol were not actually pro-Trump extremists who attended a rally with the president, in which he essentially told them to do almost exactly what they did, right down the street from the Capitol. Instead, the new theory goes: They were leftist activists and antifa who staged the Capitol invasion to hurt the president.
And on the other hand, there are apparently members of the military and law enforcement who are claiming they did not know that there were Facebook groups and hashtags encouraging the mob at the U.S. Capitol, or now have seemingly no idea how to find the people who mobbed the Capitol and put it all on their Instagram accounts. Guys, the hashtag was “storm the capitol.”
And the reason you’re seeing so much frustration from people who cover this universe of misinformation and disinformation is that the people in the reality that doesn’t seem to have the internet just don’t believe us any more than the people who think the election was stolen believe that it wasn’t. They keep watching all this stuff happen: the destruction of shared reality, the actual violence and societal damage that is resulting. Four people died on Wednesday. And when you try to tell them it’s because millions of people believe in QAnon and think a storm is coming that will overthrow a corrupt government (a belief system now shared, by the way, by a whole bunch of Republicans, including some in Congress who still voted to overturn the results of the U.S. election on Wednesday and early Thursday morning), they laugh or say, “You’re overreacting” or “It’s just trolls or politics” or “You need to delete your Facebook and you’ll feel better.” I mean, you might feel better, that’s true. But the giant rip in the reality fabric? That’s not going anywhere.