YouTube will disable recommendations for some users. Will that decrease harmful content?
Aug 21, 2023

The platform’s change of policy represents the first “off switch” for potentially harmful homepage recommendations, says Katie Paul of the Tech Transparency Project. She’d like to see controls applied to up-next videos and across social media.

Warning: This episode contains references to guns and gun violence.

YouTube’s recommendation algorithm has always been key to keeping users on the site. Watch a cute cat video, and the platform spews countless more of the same. But the same goes for harmful content, which YouTube’s algorithm sometimes serves up not just to adults, but also to kids.

Well, this month, Google-owned YouTube said it’ll stop displaying recommended videos to some users who have turned their watch histories off.

Marketplace’s Lily Jamali discussed this with Katie Paul, director of the Tech Transparency Project, an organization that says it “seeks to hold large technology companies accountable.” Paul, whose work focuses on the well-being of young users online, said controls on the recommendation algorithm on the site’s homepage are vital. The following is an edited transcript of their conversation.

Katie Paul: Homepage content is something we’ve examined at the Tech Transparency Project, and we found some really concerning examples of content pushed onto the homepage through the recommendation algorithms that doesn’t always correspond with watch history. That’s what makes this a really important change: it gives users the option to stop those potentially harmful recommendations from appearing right on the homepage when they open the app.


Lily Jamali: And in May, your organization published a study highlighting how young gamers were being fed videos on YouTube about guns and school shootings. How did the platform’s algorithm play a role in that?

Paul: The study TTP did in May looked specifically at how algorithmic content is pushed to children. All of the videos these accounts were trained on were video game videos. But all of the accounts were pushed some level of gun content and content glorifying school shootings, which is really concerning given that none of the accounts’ watch history contained anything related to real-life weapons, real-life crime or extremism of any kind.

Jamali: And what was your response when you learned this was happening?

Paul: Given that these recommendations were pushed to accounts for minors, including accounts for 9-year-olds set up through a parent account, what we found was that regardless of any features a parent may have turned on, there was no off switch for the algorithms. So even the most well-meaning parent, who created an account for their child and turned on all of the safety features, still couldn’t do anything to stop the potentially harmful algorithmically recommended content. That was one of the things we highlighted in our report: there is no off switch for the algorithms. This change from YouTube now appears to be the first option to provide what is essentially an off switch, at least for recommended content on the homepage, which is where our research gathered all of its findings.

Jamali: Can you help us understand how important kids are to the bottom line for YouTube and its competitors?

Paul: One thing that’s important to keep in mind is that kids are a major profit driver for big tech companies. We’ve seen both YouTube and Facebook face historic fines from the [Federal Trade Commission] over their illegal collection of data on minors. Having such a large swath of young people on a platform that can really affect their emotional state, and shape the thinking of young, impressionable teens, is one of the reasons there’s such a need for more safety measures in this technology, particularly in the absence of congressional action, which seems to stall time and again, even when it deals with children.

Jamali: What was YouTube’s response to your study?

Paul: Well, YouTube’s initial response to our study was essentially to attack the methodology, saying there wasn’t enough detail about whether parental controls were turned on. That’s why we specifically highlighted that there are no parental controls for algorithmically recommended content. It’s a gap that YouTube now seems to be at least partly trying to close with this new change.

Jamali: So do you think this change could be a big step towards stopping that problematic content from being recommended to kids?

Paul: This is certainly an important step in at least reducing the risk of harmful content being recommended to minors. But it’s not a catch-all solution, because there are still other forms of algorithmically recommended content on YouTube’s platform, particularly the up-next videos. For instance, one type of content we found pushed to the teen accounts was school shooting scenes from movies. If you watch one of those videos and look at the up-next panel, it will generate even more recommendations for isolated school shooting scenes from movies. So there is still a risk of increasingly harmful content in those up-next recommendations, which do not yet have an off switch. But when it comes to what is pushed on the homepage, which is where everybody lands when they go to youtube.com or open the app, this is certainly a big change in how that content is pushed. There’s still a long way to go, though, when it comes to recommendation algorithms on YouTube and across social media more broadly.

Jamali: What more would you say needs to get done to really address this issue as much as one can, given the realities of how the internet works?

Paul: Well, for starters, I think that — especially in the context of minors — the algorithmic recommendations should be turned off by default. But there’s also a bigger issue of lack of transparency when it comes to recommendation algorithms. We really don’t see any way to look inside that black box, and unfortunately without more transparency from tech companies, we have no way of truly understanding how harmful those recommendation engines can be.

More on this

We reached out to YouTube for comment on all this. The company pointed us to a YouTube blog post from this month that said it’s launching “this new experience” to make it clearer which features rely on “watch history” to provide video recommendations, and to streamline the interface for users who prefer to search rather than browse recommendations.

As Katie Paul pointed out, researchers in the Tech Transparency Project study created four YouTube accounts. They represented themselves as boys ages 9 to 14, all with watch histories of videos about Roblox, Legos and first-person shooter games like Call of Duty. You can read the whole study on how her group used those accounts to test YouTube’s recommendation algorithm. It also includes graphic screenshots of the kind of content that YouTube recommended to these test accounts.


The team

Daisy Palacios Senior Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer