Facebook announced this week it was taking down any pages or groups tied to the far-right conspiracy theory QAnon. Previously, the company had only removed content that encouraged violence. The outright ban is a big deal because, generally, Facebook has resisted policing any content on its platforms.
So, will it work to slow the spread of baseless theories about a Satanic cult of pedophiles running the world? I spoke with Travis View, the host of “The QAnon Anonymous” podcast. The following is an edited transcript of our conversation.
Travis View: Facebook has kind of an inconsistent track record when it comes to enforcing their policies, but this appears to be pretty devastating to the QAnon community on Facebook. It’ll probably limit their ability to organize and recruit new people into the movement.
Amy Scott: Are there loopholes that they might try to exploit?
View: Well, yes, as a matter of fact. They may, for example, try to rebrand as simply anti-human trafficking groups, or they’ll try to use misspellings, where they spell Q [as] C-U-E, instead of Q. Facebook says that they’re aware of how QAnon messaging changes very rapidly. They’re going to try to react to that. But we’ll see how well they do in the coming months.
Scott: I think a lot of people are really stunned by the spread of these really bananas theories that this group is spreading. How do you think tech has enabled it, and what can they, or we, or regulators do to try to stop it now?
View: One thing that social media companies could do would be [to] stop deciding that the metric of success is purely quantitative in terms of interactions and engagement and likes, because this simply pushes people who are radicalized into these rabbit holes in which they get even more radicalized. So they need to take more seriously the qualitative impacts that their algorithms and their platforms are having on people.
Scott: And do you see this as a big step in that direction? Do you see Facebook actually trying to root out even the ways that QAnon groups may try to get around its new ban?
View: I can’t say I have a whole lot of confidence in Facebook’s ability to enforce its policies consistently, based on its track record. But when you look at the number of pages that were recently banned as a consequence of this new policy, it seems to be quite substantial. So I’m hoping for the best.
Scott: Do you think that platforms sort of legitimize them in a way?
View: The tech platforms legitimized QAnon in some ways, by allowing it to go mainstream. For example, Marjorie Taylor Greene, the QAnon-promoting congressional candidate in Georgia who is almost certainly going to win her election and be a member of Congress in the next year, promoted QAnon through Facebook live streams. That was just part of the way she promoted her views, and now she is going to Congress. So yes, I do think that these platforms and their lax enforcement have legitimized and helped mainstream the movement.
Scott: Facebook has made it really easy to promote any cause or small business. Do you think it’s too easy? Do we need a few more checks and balances?
View: That’s the power of social media. [It] can spread any kind of message, whether it’s political or whether it’s related to your particular business’s brand. And that can be very helpful if you’re just starting up a new business, but it can also be very dangerous if you’re promoting extremism.
Related links: More insight from Amy Scott
Etsy also announced this week it would remove all QAnon-related merchandise being sold on the platform. We reached out to Amazon to find out if it’s planning to do the same. The response was “no comment.”
Travis also mentioned that Facebook has been slow to pull down videos promoting QAnon. In fact, moderating live video has been a challenge for Facebook, and other social media platforms, for years now. The AI isn’t good at spotting content that violates the rules and, according to a piece in TechCrunch, Facebook’s army of human reviewers has been reduced during the pandemic.
And it’s not just Facebook that’s trying to purge QAnon. Washington Post Opinion Editor Drew Goins noted on Twitter that the biking service Peloton has wiped hashtags related to the Q movement from its platform, where users can join groups and track their exercise together — because evidently nowhere on the internet is free from conspiracy theories.