Election misinformation is still spreading
Nov 13, 2020

Social media platforms don't exist in a vacuum.

Lies and unfounded allegations about the U.S. election are not going anywhere. And though we talk about Facebook and Twitter a lot, critics say YouTube hasn’t been doing nearly enough to prevent the sharing of videos that make false or misleading claims.

It’s hard to overestimate just how big YouTube is. Seventy-one percent of Americans say they use the website, according to Pew Research, and over a quarter of U.S. adults get news from it.

All this is a topic for “Quality Assurance,” where we take a deeper look at a big tech story. I spoke with Rebecca Heilweil, a reporter for Vox’s “Open Sourced” project. The following is an edited transcript of our conversation.

Rebecca Heilweil: YouTube’s stance with some of the videos that just proclaimed Trump won is that the election already happened [and] it’s not technically interfering in the democratic process anymore. But, of course, other people would disagree with that characterization and say it’s promoting a lot of doubt in the [election’s] integrity. A lot of what average people would probably call election misinformation is certainly not being removed from YouTube. There’s a lot of frustration that this platform has played a really large role in just sowing doubt and creating confusion.

Amy Scott: Another concern is that even though YouTube may be making it harder for people to find that content, that doesn’t stop people from sharing direct links on other platforms, and that’s one way this content sort of slips through the cracks.

Heilweil: That’s exactly right. And in one sense, YouTube is saying, “We are taking borderline videos and taking them away from our search results,” but none of these platforms exist in a vacuum. And [platforms] really aren’t coordinated in the most optimized way to make sure that [content is] not jumping from one place to another.

Scott: One of YouTube’s other strategies has been to demonetize videos, so not allowing ads on misinformation. How effective has that been?

Heilweil: A lot of the videos that I saw were not from channels that are on YouTube to make a lot of money. They’re just random videos that sort of allege voter fraud in some capacity, and they amass views. So that strategy might work for a channel that really needs YouTube to survive, but for someone who’s just trying to add to the confusion, it’s not clear that that move would really be particularly helpful.

Scott: It seems like YouTube has escaped a lot of the pressure that Facebook and Twitter have been under — and they have responded by slapping labels on content or removing it. Why do you think YouTube has gotten kind of a pass?

Heilweil: Facebook was the tech story in the aftermath of the 2016 election. And so much of what we’ve discussed regarding tech and politics has centered around Facebook. It’s given researchers and journalists data tools that make it easier to get a peek into what’s getting attention on its platform, and I don’t think YouTube has really been pressured to do the same, which means that you can’t look at it and call attention to things that are going viral in the same way.

Scott: And there is maybe another test coming for these social media platforms with the possibility of a coronavirus vaccine and misinformation related to that. Do you expect anything different when it comes to preventing vaccine misinformation?

Heilweil: I think something that’s really interesting is YouTube said it will ban COVID-19 vaccine misinformation. It actually said you can’t post something that contradicts the World Health Organization or local health authorities about the nature of a COVID-19 vaccine, when and if one does arrive. Obviously, there is a lot of attention on that now because of the announcement from Pfizer. But beyond that rule, there are other signs that YouTube is going to take a similar approach to the one it took with the election and with other topics that get a lot of attention like this — pushing authoritative content, trying to rely on officials to divert people to what they’re saying, and pushing fact checks and things like that.

Related links: More insight from Amy Scott

YouTube says the most popular videos about the election in its search results are from “authoritative news organizations,” though it won’t say who qualifies as authoritative.

For its part, Twitter said Thursday it labeled 300,000 election-related tweets as misleading. That’s about 0.2% of total tweets about the election. The move was part of a broader effort to slow the spread of misinformation. The company said that making users take an extra step before retweeting resulted in more “friction” and less sharing overall. The change will stick around — for now. Twitter said, though, it will bring back recommendations for tweets users might be interested in by people they don’t follow, saying that taking away that service did not significantly cut down on misinformation.

Back to YouTube. You can check out Rebecca Heilweil’s reporting on the Google subsidiary and the “slew of content stirring doubt in the electoral process.” From November 3 to 5, she writes, videos on political channels with keywords related to “election fraud” were viewed nearly 100 million times. In two days.

The Pew study I mentioned shows the powerful role YouTube plays as a news site for many Americans. Viewers tend to be younger, more racially diverse and more male than U.S. adults overall. About three-quarters said they expect the news they view on the site to be largely accurate. That’s a lot more faith than people tend to have in news they see on social media generally.

Finally, fans of YouTube might be disappointed in other news this week that the service’s end-of-year retrospective known as “Rewind” will not happen this year. For the past decade, YouTube has released a video celebrating the big moments and trends of the year gone by and paying tribute to its creators. “But 2020 has been different,” the company said in a statement, “and it just didn’t feel right.” Maybe it’s for the best. The 2018 “Rewind” was the most disliked video in YouTube history.


The team

Molly Wood, Host
Michael Lipkin, Senior Producer
Stephanie Hughes, Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer