Facebook shouldn’t be surprised its groups were overrun with conspiracies
Feb 3, 2021

The Wall Street Journal reported that the company has had internal research for months about the toxicity within some groups.

Facebook last month announced it would stop recommending political and civic Groups to its users. The company said users want less politics in their feeds, and it has said it didn’t realize how much its groups were spreading medical misinformation, being used to radicalize people into QAnon and serving as a home base for planners of the Capitol insurrection on Jan. 6.

But this week, The Wall Street Journal reported that the company had internal research for months, if not years, showing that some private groups were toxic, including some full of calls for violence, yet it kept recommending them to Facebook users. I spoke with Renée DiResta, research manager at the Stanford Internet Observatory, who said Facebook’s push toward Groups created a cycle of radicalization. The following is an edited transcript of our conversation.

Renée DiResta, research manager at the Stanford Internet Observatory. (Photo courtesy of Stanford University)

Renée DiResta: Beginning in 2016, you were starting to see these very conspiratorial communities taking shape. And they were being recommended to people who had an interest in other conspiratorial communities. It was like a correlation matrix saying, “Oh, you’re interested in this wild theory? Well, here, try this one on.” And because engagement drives those recommendations, highly engaged communities, with their wildly sensational content and high volume of groups and posts, became a much more significant part of the experience for people who went and participated in them.
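To make DiResta’s “correlation matrix” image concrete, here is a minimal, hypothetical sketch of co-occurrence-based recommendation, the generic technique her description evokes. The group names and data are invented, and Facebook’s actual engine is far more complex; this only shows how overlapping memberships alone can steer a conspiracy-curious user toward more conspiracies.

```python
# Hypothetical sketch: co-occurrence ("correlation matrix") group
# recommendation. All names and data are invented for illustration;
# this is not Facebook's actual system.
from collections import Counter

# user -> set of groups they have joined
memberships = {
    "user_a": {"chemtrails", "flat_earth"},
    "user_b": {"chemtrails", "qanon"},
    "user_c": {"flat_earth", "qanon"},
    "user_d": {"gardening", "chemtrails"},
}

def recommend_groups(user, memberships, top_n=3):
    """Score unseen groups by how often they co-occur with the user's groups."""
    mine = memberships[user]
    scores = Counter()
    for other, groups in memberships.items():
        if other == user:
            continue
        overlap = len(mine & groups)   # shared-membership signal
        for g in groups - mine:        # only groups the user hasn't joined
            scores[g] += overlap
    return [g for g, _ in scores.most_common(top_n)]

print(recommend_groups("user_a", memberships))
# -> ['qanon', 'gardening']: the conspiracy group ranks first because
#    two overlapping users already belong to it.
```

In this toy example, a user in two conspiracy groups gets a third conspiracy group as the top suggestion purely because of membership overlap, which is the feedback loop DiResta describes.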

Molly Wood: Is it credible to you when Facebook says, “We couldn’t have anticipated how conspiratorial these groups were going to get”?

DiResta: No. No, not at all. And that’s because a Wall Street Journal article that came out last year reported that the platform’s own internal research had shown, back in 2016, that 64% of people who joined what the company called more extreme groups were doing it because of prompts from the recommendation engine. For researchers such as myself, who see it from the outside, the sense of the problem is anecdotal. Right now, if you were to go to Instagram and follow Robert F. Kennedy Jr.’s account, you’d see a whole range of suggested accounts that are mostly coronavirus-denial accounts. That’s something I can see at a small scale, but I really don’t know whether it’s a systemic problem or just an anecdotal one.

Wood: How hopeful are you about this move to stop recommending political and civic Groups? How big a deal do you think it could be to untangle the recommendations from the existence of the groups themselves?

DiResta: I’m not sure that a blanket ban on topics is the way to go about doing this. And Facebook has encountered some challenges with the definition of “political” on other product fronts, like ads, for example, where in order to run advertisements related to political issues, you had to go and get yourself verified. Now, I think that’s a reasonable amount of friction, but the question then became: What is a political issue? I think there are plenty of political groups that stay within the realm of healthy behaviors — they’re not being used for organizing violence. The decision to remove them from recommendations just means that people will have to proactively search for them. And I think it’ll be interesting to see what impact that has on their growth. I don’t think it’s a silver bullet, though.

Wood: How about the announcement that Facebook will require moderators to spend more time reviewing member posts? Is moderation a better solution?

DiResta: There’s some really interesting evidence from Reddit that suggests the answer is yes. Reddit works to empower moderation at the local level, giving moderators at the subreddit level tools to decide what norms, values and standards are appropriate for their communities. So I think it’s a matter of improving moderation tools and also putting the onus on the people who create these groups, so they don’t just let them go haywire and then say, “Oh, I just didn’t know.” If you’re choosing to form a community, that choice is perhaps something we’re going to be expected to take responsibility for going forward.

Related links: More insight from Molly Wood

Here again are those Journal stories about what Facebook knew and when it knew it (hint: everything and all along). And it’s obviously still happening. This week, an anti-vaccine protest of about 50 people in Los Angeles forced the city to temporarily close its mass-vaccination site at Dodger Stadium. The Washington Post reports that the protest was organized on Facebook on a page that promotes refusing to wear masks and cleverly links to a debunked, viral, now-banned misinformation video about the pandemic by ever so slightly misspelling its name. A Los Angeles Fire Department official said he also got an ad on Instagram about the planned march and protest in the days leading up to it. Facebook told the Post it would review the page to see if it violated any of its policies. The company did, however, ban the page for Myanmar’s military television network after the military overthrew that country’s government in a coup.

Now, you know I have thoughts on Jeff Bezos stepping down as Amazon’s CEO. Given that Andy Jassy, the incoming CEO, built the company’s cloud services business from nothing into the leader in the space, one that pulls in a $40 billion chunk of change every year, I think you can assume that the future of Amazon is less one-day delivery and a whole lot more cloud.

Here is a column I wrote for Wired almost a year ago on why we need to start thinking about cloud neutrality, and what we do when so much of the internet exists at the pleasure of business arrangements with a few companies, of which Amazon is the biggest. That’s especially notable in light of Amazon Web Services kicking Parler off its servers. And if we can assume Amazon’s ambition is to be the infrastructure for everything, it is likely to become an even bigger deal over time.


The team

Molly Wood, host
Michael Lipkin, senior producer
Stephanie Hughes, producer
Daniel Shin, producer
Jesús Alvarado, associate producer