Are YouTube’s excuses for terrible content finally wearing thin?
Jun 7, 2019

And where should the platform draw the line?

Like most big tech platforms, Google-owned YouTube has been struggling (or maybe not struggling enough) with how to deal with awful, hateful or violent videos on its network. This week YouTube waffled on how to handle videos from creator Steven Crowder, who repeatedly mocked a journalist over his race and sexual orientation. First YouTube said the videos didn’t violate its policies. Then it demonetized Crowder’s channel and, on the same day, announced new policies for dealing with hateful videos.

Host Molly Wood talked with Julia Wong, a senior technology reporter for the Guardian, who said we should think of YouTube less like Facebook or Twitter and more like a company that pays entertainers and benefits from their work. The following is an edited transcript of their conversation.

Julia Wong: They pay their creators. They have a revenue-sharing model where, I believe, it’s 55/45 between the creator and YouTube. They get blue chip advertisers to pay to run their ads against these videos. There is an argument to be made that that puts YouTube more into the position of being a pseudo-employer of its own stars.

Molly Wood: Would it help YouTube’s case if they were better able to articulate what their policies are and show some consistency in applying those policies?

Wong: They will argue that hundreds of hours of video are uploaded every minute, and that it’s impossible to make human decisions, editorial calls, on every single one of those. I think that’s probably true at the overall YouTube scale. But when it comes to the smaller subset of YouTube videos that are eligible for monetization, they have a much weaker case. Nobody would say to NBC or Netflix or any entertainment company that it’s just too hard for them to watch all of the shows they put out. That would be absurd. When YouTube is paying for content, even if it’s paying for user-generated content, I think that they need to take more responsibility for what it is.

Wood: I want to go back to this consistency thing. I don’t want to go too far down the Crowder rabbit hole, but in that case we went and read the standards that seemed to apply, and it looked very straightforward: this was clearly homophobic harassment according to YouTube’s published standards. And then they kind of said it wasn’t. And then they kind of said it was, but maybe only because he also sold T-shirts. Setting aside what they should do, do you really feel like they have a coherent strategy?

Wong: Absolutely not. Anytime you apply the degree of scrutiny that they have gotten in this individual case, it almost always breaks down. You have a mix of incompetence and negligence.

Wood: Do you think that in this specific case, at this specific moment, YouTube’s unwillingness to really step in on the Steven Crowder incident and ban an anti-gay YouTuber has anything to do with not wanting to be accused of censoring conservative speech, especially as the company faces a possible antitrust investigation?

Wong: I’m sure that that is playing into it. In the past several years, we’ve seen the organized right be very successful in playing the refs. It’s a strange and surprising thing, and certainly a strange commentary on where the Republican Party is today, that you would have Sen. Ted Cruz strongly defending Steven Crowder’s right to be on YouTube, eliding the difference between a YouTuber’s right to espouse conservative views on, say, small government or low taxes and somebody’s right to just be bigoted and homophobic and racist.

Related links: more insight from Molly Wood

In a blog post this week, YouTube announced that it will start more aggressively removing certain types of videos from its network. The company said it will take down inherently discriminatory videos, videos that deny well-documented events like the Holocaust or the Sandy Hook shooting, and Nazi propaganda. But it said it will try to make exceptions for videos that are useful to researchers or that discuss policy or legislation, which already sounds pretty complicated.

YouTube will also try to rein in its often-criticized recommendation algorithm, which can lead viewers from even a relatively benign video almost directly to sexualized videos of children. I am not being flippant here. The New York Times had a story Monday on how common this progression is and how even cute kid videos uploaded by their parents get grabbed by the YouTube algorithm and presented as a sort of catalog for creepers. So YouTube said that if someone is watching a video that’s sort of borderline, it will try to counter it with a more authoritative or better-quality recommendation.

Meanwhile, even as YouTube makes these changes, there’s the political backlash: the narrative that YouTube, along with Facebook and Twitter, is censoring the voices of conservative commentators. There was a Washington Examiner op-ed along these lines from a former Trump administration official.

Julia Wong mentioned Sen. Ted Cruz, who said in April that part of the reason he thinks the government should regulate big tech companies is anti-conservative bias. Last month the White House launched a “political bias” tool that lets people report whether they think their social media accounts have been unfairly banned because of their viewpoints.

At one point in our interview, Wong told me she thinks the tech platforms are in some ways damned if they do and damned if they don’t regulate content. It’s tough because it’s a slippery slope. The shootings and the pedophilia are the easy calls. After that it starts to get really messy really fast, and then you’re talking about huge, powerful companies with enormous influence over literally what we see and read and think.

However, I read this BuzzFeed story last month about the 14-year-old girl with a million followers who tells gay people and Muslims to kill themselves and promotes so-called red pill beliefs that women are inferior. The story said YouTube gave her channel a “strike,” which meant she couldn’t upload for a whole entire week. After reading that, I did feel like maybe I would appreciate a bias against that content.
