Can 15,000 moderate the content of 2 billion?
Jun 10, 2020

A team of 15,000 workers polices content on Facebook and Instagram. The vast majority are contractors.

Pressure is growing on social media platforms to intervene more against misinformation, hate speech and other content. A new report says a big barrier, especially at Facebook, is that content moderators are outside contractors, and there aren’t nearly enough of them. The report says Facebook needs to double the number of moderators, to about 30,000, and bring them in-house.

I spoke with Paul Barrett, deputy director of the Stern Center for Business and Human Rights at New York University, which published that research this week. I asked him how this system came to be at Facebook. The following is an edited transcript of our conversation.

Paul Barrett (Photo courtesy of NYU)

Paul Barrett: The company realized it needed to do something because there was just way too much material flowing across the platform. They did the thing that a lot of companies choose to do, which is they farmed it out, and did that to save money. But also, I think, because of a psychological factor: they just didn’t see content moderation as being [part] of Silicon Valley. It’s not the engineering work or marketing work or the invention of cool, new products. It’s this very nitty-gritty, very onerous work. And I think it felt more comfortable to push it out at a distance, away from headquarters, so to speak, so it would be out of sight and out of mind.

Molly Wood: In another version of a Silicon Valley mindset, we have often heard Facebook and other companies say that algorithms can start to take on more and more of this work, and you looked at that as well, right?

Barrett: Yes, absolutely.

Wood: And it doesn’t work?

Barrett: It’s not that it doesn’t work, it’s just not a complete solution. The vast majority of the material that ultimately gets removed is initially flagged [and] identified by AI-driven automated systems. That’s terrific, and I don’t think anybody really objects to that, but in almost all of those cases you still need human judgment to assess nuance and context and so forth. I don’t think we’re going to be at the stage anytime soon where machines are able to bring to bear that kind of human assessment.

Wood: There is also this question of volunteer moderators, which is a system that is in place in private Facebook groups, for example, but also on Reddit or Wikipedia. Do you have any sense about whether that is a model that is more effective or less effective, or more likely than paying another 15,000 moderators?

Barrett: I think that there’s a possibility of doing that when you have relatively narrow communities of users who explicitly share certain assumptions and values, and so forth. I think it’s quite unrealistic when you look at a vast platform — like Facebook’s main platform that has more than 2 billion users across the world — doing that on a volunteer basis. [It] just strikes me as quite unrealistic.

NYU’s Paul Barrett speculates that Facebook farmed out the moderation function in part because of the “psychological factor that they just didn’t see content moderation as being [part] of Silicon Valley.” (Olivier Douliery/AFP via Getty Images)

Related links: More insight from Molly Wood

On Monday, a group of current and former moderators wrote a public letter objecting to Facebook’s handling of President Donald Trump’s post calling for violence against protesters. They wrote about how much of the content they see is racist hate speech and police brutality, but said that Facebook’s nondisclosure agreements, and the fact that they are third-party contractors, mean they can’t join the 400 employees who have virtually walked off the job. “We can’t walk out, but we cannot stay silent,” they wrote. All of that doesn’t even get into the thorny issues of volunteer moderation, which has also come into the spotlight this week as companies have made public statements about supporting Black Lives Matter and then were criticized for letting hate speech and racism proliferate on their networks.

Reddit CEO Steve Huffman voiced support for the Black community, and co-founder and board member Alexis Ohanian resigned and requested that he be replaced with a Black board member. But Reddit users said that was hypocritical at best, and former CEO Ellen Pao said the site “nurtures and monetizes white supremacy all day long.”

Volunteer moderators in Facebook groups have been accused of silencing Black Lives Matter conversations. And a long piece in The Verge notes that the thing about moderation is it really matters who is doing the moderating. When I talked to the CEO of the Wikimedia Foundation back in February, she said that a big part of the problem of moderating a place like Wikipedia is the gatekeepers. If they’re 80% white and male, omission is the best problem you can hope for. 

And then of course there’s Nextdoor, which is a hotbed for racial profiling on its best day and a hellscape of fear-mongering and racist speech in the last two weeks that’s only barely contained by volunteer moderators who themselves may share the hostile opinions they’re in charge of restraining. People of color on the platform told The Verge they’ve had posts taken down altogether for talking about protests or Black Lives Matter or issues of race. In a statement to The Verge, Nextdoor said, “We want your neighborhood on Nextdoor to reflect your actual neighborhood, and therefore being community moderated is important.”


The team

Molly Wood, Host
Michael Lipkin, Senior Producer
Stephanie Hughes, Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer