YouTube CEO says its content moderation focuses on what people say, not who they are
Dec 2, 2021

"It's a dangerous line to start characterizing individuals and start saying, 'This person is not allowed,'" Susan Wojcicki said in the interview.

Susan Wojcicki, CEO of YouTube. (Mike Windle/Getty Images)

When it comes to moderating what’s posted online, do you focus on bad people? Or just bad content?

One of the biggest debates in society right now is about online speech and how much power tech companies should have in determining what content comes down, what content stays up and who gets to use the platforms at all.

Some complain that Facebook, Twitter and YouTube are heavy-handed or biased, while others argue the platforms need to be way more aggressive.

I recently spoke with Susan Wojcicki, the CEO of YouTube — which is owned by Google — and asked about her process. The following is an edited transcript of our conversation.

Susan Wojcicki: When we make a decision, we want to make sure that decision is very thoughtful, that we’ve consulted with all different parties, and that once we roll it out, we roll it out in a consistent way that’s implemented for all of our creators in the same way. And I understand there are people on both sides of the aisle who disagree with us, but the fact that you see both sides arguing with us means that we are really striking a balance. And when we make policies, sometimes we’ll consult with literally dozens of different groups to try to understand the different perspectives and how we can do that in a very thoughtful way.

Kimberly Adams: Several researchers and some members of Congress have described the strategy you’ve laid out here as a bit of whack-a-mole — blocking content that violates your standards, for the most part, rather than blocking organizations or people known for posting harmful information, which is something Facebook does. Why do you take this particular approach?

Wojcicki: So I would say that we are incredibly consistent. I mean, whack-a-mole does not characterize our approach at all. When we make a policy, again, we consult with many different people. We also have the three-strike system. Basically, if somebody gets three different strikes within a certain period of time, that account will be terminated. And of course, there are some types of content where it doesn’t take three strikes. If it’s violent extremism or child safety, that content is immediately removed. And, again, we don’t necessarily want to focus on the person; we want to focus on what they say.
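To make the mechanics of that policy concrete, here is a minimal, hypothetical sketch of how a three-strike rule like the one Wojcicki describes could be modeled. YouTube has not published its actual implementation; the 90-day window, category names and all identifiers below are illustrative assumptions drawn only from her description.

```python
# Hypothetical sketch, not YouTube's actual system: strikes expire after a
# rolling window, three active strikes terminate an account, and some
# categories (violent extremism, child safety) skip strikes entirely.
from datetime import datetime, timedelta

STRIKE_WINDOW = timedelta(days=90)  # assumed; Wojcicki says only "a certain period of time"
IMMEDIATE_REMOVAL = {"violent_extremism", "child_safety"}

class Channel:
    def __init__(self):
        self.strikes = []       # timestamps of active strikes
        self.terminated = False

    def record_violation(self, category, now):
        """Apply one violation and return the enforcement outcome."""
        if category in IMMEDIATE_REMOVAL:
            self.terminated = True
            return "terminated: immediate-removal category"
        # Expire strikes that fall outside the rolling window.
        self.strikes = [t for t in self.strikes if now - t < STRIKE_WINDOW]
        self.strikes.append(now)
        if len(self.strikes) >= 3:
            self.terminated = True
            return "terminated: three strikes"
        return f"strike {len(self.strikes)} of 3"

channel = Channel()
print(channel.record_violation("hate_speech", datetime(2021, 12, 1)))  # strike 1 of 3
```

The key design point in her answer is that the same rule applies to every account: identity never enters the decision, only the violation category and the count of recent strikes.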

Adams: And you think that’s a better strategy — to look at the individual content as opposed to the person or the group? Say, if it’s a white supremacist group that’s posting about flowers, would that still be fine? Because some of the researchers we’ve talked to about this say that what that is doing is providing an off-ramp for people to find more extremist content off of your platform.

Wojcicki: Well, first of all, we do see that groups — I mean, you talked about white supremacy. White supremacist groups do not post about flowers. They usually post content that becomes violative, and then that content is removed under our hate policy. And if a channel hits a certain threshold of those violations, then that channel will be removed, because we have a policy against hate. I think it’s a dangerous line to start characterizing individuals and start saying, “This person is not allowed.” Everyone is held to the same standards, whether you’re a politician or an individual. These are the lines, the lines are posted on the internet, and if you cross them, there will be consequences. We do certainly want to make sure that our policies are not being gamed. People would like to tiptoe right up to the policy, right? And we have gotten smarter and smarter and worked with a lot of experts to understand: What does that symbol mean? Is there a dog whistle here? Is there a subtext? Is there an image that we know is a symbol for something that would be violative? Is it a song that we know actually represents something that would be violative of our policy? So that’s the way our policies really have evolved: to get smarter and make sure that we strike the right balance.

Adams: With so much content being uploaded to YouTube every day, it’s inevitable that some of this moderation work that you’re describing has to be done by machines, automation and [artificial intelligence]. But there’s a ton of research out there that shows there is bias built into a lot of the AI that is being used across many platforms. How do you account for that?

Wojcicki: So, we do use AI for really everything we do. We use it for our recommendation system, we use it for the ads, and we use it to find content that is violative. But we also use people. In the end, the people are in charge. The people train the machines, and sometimes there are biases that people have, and those will get translated into the machines. We may become aware of that issue, and then, as a result, we’ll need to retrain how our machines work. And so this area of machine-learning fairness, or AI fairness, is a really important area where Google has done a tremendous amount of work. We are also working incredibly hard to make sure that the ways we implement our algorithms are fair and equal for everyone. So we make sure that we are working with researchers and third parties to identify any different issues that come up and address them. If we see some type of issue, we will right away look at it, retrain our systems and figure out how to address it.
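The loop Wojcicki describes (detect a bias, then retrain) starts with measurement. A hypothetical sketch of that measurement step might compare a moderation classifier’s false-positive rate across groups of creators. The function names, data format and tolerance below are illustrative assumptions, not Google’s fairness tooling.

```python
# Hypothetical sketch, not Google's tooling: audit a moderation classifier's
# false-positive rate (benign videos wrongly flagged) per creator group, the
# kind of check that can surface bias before deciding to retrain a model.
from collections import defaultdict

def false_positive_rates(decisions):
    """decisions: iterable of (group, predicted_violative, actually_violative)."""
    flagged = defaultdict(int)  # benign items wrongly flagged, per group
    benign = defaultdict(int)   # total benign items, per group
    for group, predicted, actual in decisions:
        if not actual:
            benign[group] += 1
            if predicted:
                flagged[group] += 1
    return {g: flagged[g] / benign[g] for g in benign}

def needs_retraining(rates, tolerance=0.02):
    """Flag groups whose error rate exceeds the best-performing group's by more than tolerance."""
    baseline = min(rates.values())
    return {g: r for g, r in rates.items() if r - baseline > tolerance}

rates = false_positive_rates([
    ("group_a", True, False), ("group_a", False, False),
    ("group_b", False, False), ("group_b", False, False),
])
print(needs_retraining(rates))  # {'group_a': 0.5}
```

In this toy audit, content from one group is wrongly flagged half the time while another group’s never is; a gap like that is what would trigger the human review and retraining she describes.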

Adams: Content moderation, obviously a challenge, you’re working on it, you have a strategy. What is the next big, kind of scary thing that you’re worried about on YouTube?

Wojcicki: Right now, I mean, content moderation has been a huge area for us, and I think it’s something we’re always going to have to stay on top of. I think we’re in a good place; in the last couple of years, we’ve put so much in place, from people and technology to process and experts, to really make sure that we’re on top of it. But I’m always going to tell you that’s going to be my top responsibility, and we can never take our eye off the ball. We’re always going to be focused on that. But in terms of other things to worry about: regulation, certainly. We’re very focused on that, because I look across the globe and there are literally hundreds of bills in discussion right now that all have a variety of different ways of impacting YouTube. And we want to make sure that our creators are able to continue to publish and do the great work that they do. I’d also say, you know, there’s a lot of competition. Right now, everyone is talking about video, and we see a lot of growth. It’s not just U.S. companies; there are global companies that we’re competing with.

Adams: And y’all are getting into podcasting.

Wojcicki: We’re excited about podcasting, for sure. But that’s a place where there are also many different companies. We do think it’s a good opportunity for people who produce podcasts to generate revenue and get more distribution. We crossed 50 million subscribers for our YouTube Music and Premium service, so we know that users are paying for the service, and the more podcasts we can offer there, the more valuable we think that service will be for our users. So I have many things that could keep me up at night. But I’m also excited about innovation, and that’s really why I ultimately came to Google and to YouTube: the ability to continue to create and use technology to improve our lives. And that’s what I’m hopeful I can do more of in the coming years.

The Google and YouTube logos are featured at the entrance to Google offices in Los Angeles. (Courtesy Robyn Beck/AFP via Getty Images)

Related links: More insight from Kimberly Adams

The whole interview with Susan Wojcicki is available on our website, if you’d like to listen.

Also, we have a link to those YouTube Community Guidelines she mentioned.

Plus Facebook’s standards for Dangerous Individuals and Organizations that I referenced in the conversation.

And I mentioned that some members of Congress referred to YouTube’s content moderation strategy as whack-a-mole. That was from a letter a couple of dozen members of the House sent to Wojcicki and Sundar Pichai, CEO of Google and Alphabet, back in January after the Capitol insurrection.

In that letter, representatives called for YouTube to permanently change its recommendation systems, including “disabling auto-play by default and ceasing all recommendations of conspiratorial material on users’ homepages and alongside videos.”

And if YouTube can’t figure out how to do that, they said, it should halt recommendations altogether until it can.


The future of this podcast starts with you.

Every day, the “Marketplace Tech” team demystifies the digital economy with stories that explore more than just Big Tech. We’re committed to covering topics that matter to you and the world around us, diving deep into how technology intersects with climate change, inequity, and disinformation.

As part of a nonprofit newsroom, we’re counting on listeners like you to keep this public service paywall-free and available to all.

Support “Marketplace Tech” in any amount today and become a partner in our mission.

The team

Molly Wood Host
Michael Lipkin Senior Producer
Stephanie Hughes Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer