YouTube announced Thursday that it has turned off the comments feature on tens of millions of videos that feature young children. The move came after a number of advertisers pulled out of YouTube because of pedophiles flooding the comments sections on videos featuring young children. On that front, this is a very welcome move. But it’s also complicated. Creators on YouTube rely on comments to build their communities, and it can make us nervous when platforms that rely on user-generated content show how much control they really have. Host Molly Wood talked with Kelly McBride, senior vice president with The Poynter Institute for Media Studies. McBride says tools used to scan for certain types of content don’t always work. The following is an edited transcript of their conversation.
Kelly McBride: I was looking through a bunch of videos that I’m familiar with because I have children, and they occasionally post things to YouTube. And I could see that not everything was caught by YouTube’s mechanism right now.
Molly Wood: YouTube has promised screening before. Google [which owns YouTube] has, too. Alphabet [Google’s parent] has, as a company, said, “We’ll figure out a way to screen fake news out of the Google News product.” But then it inevitably pops up. Is it going to be that much more infuriating for people if and when these tools don’t work?
McBride: What we can never figure out, when the tools don’t work, is why. Is it because there are so many guardrails on them — so the tools don’t undermine the core functionality of the platform, which is to keep you on the platform and keep serving you content so they can keep serving you ads? Or do the tools not work as well as we want them to because they’re just hard to engineer and we haven’t advanced that quickly? Unless you have an insider who’s actually working on these tools telling you which one it is, we never know from the outside. We generally assume the worst, though.
Wood: Right now we might all be happy that anti-vaxxers and pedophiles are being stopped, but it will inevitably lead to a conversation about how much power these platforms have over speech.
McBride: Right, and political content. Facebook got really nervous about that, but that’s ultimately what it gets to — do you have the ability to screen out certain political messages and boost other political messages? And if you have that ability, are you actually doing that?
Wood: Does all of this argue for private spaces? It feels like one of the things that we’re finding is that the idea of lives as content that is publicly available and able to be monetized, and therefore able to be commented on, might just not be a good idea.
McBride: Yeah, the reason that public spaces work is because they can scale and then you can convert them to advertising. But the reason that they don’t work, for much of the public, is because you’re just too vulnerable out there. Too much can happen. It does argue for private spaces, for subscriber-only content. We’re already seeing that happen in local news, where subscriber revenue is becoming more important than advertising revenue, for a completely different reason. But it could be that users get more value out of what they pay for.
Wood: Well then, what about the kids who want to be YouTubers, who think that this is a path to fame? What should we be telling parents or doing as parents?
McBride: The ethics of it — what we let children do, maybe the whole idea of letting children become little businesspeople at all — is warped. It’s sort of like when you look at child entertainers and you think, “I wonder whether that kid is being manipulated by the people around him.”
Wood: Right, and in this case, it’s easy to draw the connection to the fact that YouTube makes money when we exploit our children or ourselves. But in the case of kids, what they are exposed to is so much worse. At least if they’re on television, these comments are not showing up on their shows.
McBride: Right, multiple-way communication is bad for children generally.
Related links: more insight from Molly Wood
The fact that YouTube is disabling comments on some of its videos is, in fact, a big deal and a pretty strong move. It speaks to just how much pushback the platform has gotten over the past couple of years over the experiences that kids have on YouTube. Those range from insanely inappropriate videos being planted on YouTube Kids to accusations that its recommendation algorithm, where it suggests videos for you to keep watching, is designed to addict kids. The algorithm also tends to lead them down a rabbit hole to terrible content really fast. Plus, it keeps rewarding and promoting the world’s worst people, like Logan and Jake Paul.
At the end of the day, there’s not much incentive for YouTube to do anything drastic to the way the product works. It’s supported by advertising, and that advertising depends on views. All these problems are created by the push for views. But considering the moves YouTube has made to clean things up over the past few months, it seems like the angry advertisers might be starting to make a dent in the incentives.