Jan 22, 2021

President Biden called for an end to disinformation. Will the internet hear him?

Misinformation is built into social media's business model, which rewards content that's extreme and engaging.

We as a country are trying to figure out what is true. Or more accurately, whether we can agree on what is true. In his inaugural speech this week, President Joe Biden called for a return to truth and an end to the deliberate spread of misinformation. That happens on social media platforms; in fact, it’s built into their business models, and misinformation influencers abound.

But that’s not the only vector. I discuss this in “Quality Assurance,” where I take a second look at a big tech story. I spoke with Kevin Roose, who covers social media for The New York Times. The following is an edited transcript of our conversation.

Kevin Roose (Photo courtesy of Dominik Gigler)

Kevin Roose: I think you can deplatform all the conspiracy theorists and extremists you want. But if the basic architecture of the platforms and the algorithms is still designed to reward extreme and engaging content over content that is true, I think you’re going to end up replicating a lot of those problems with other accounts down the road.

Molly Wood: What’s our role as members of the media? Because you could argue that we also have been pushed to create more engaging content to get more clicks and ads, especially as we’ve had to compete with social media, and as trust in us has declined.

Roose: For sure. I think that the media is certainly part of this equation. I mean, I think there’s an impulse to cover the outrageous and engaging thing. And I think that in the next few years, a lot of us will have to examine how we react to that impulse, whether we allow it to guide our coverage decisions or whether we decide to focus people’s attention elsewhere.

Wood: Some of us heard President Biden’s inauguration speech and went, “Yes, we do have to return to truth,” and others maybe dismissed it as untrue. Do you think it’s making a difference? Is it making a difference to start to have a national conversation about the fact that we live in these parallel worlds?

Roose: I think we have to have that conversation. I mean, I’ve talked to a lot of experts on disinformation and conspiracy theories, who say [that] often people who come out of these ideologies, they don’t just return to viewing the world the way they used to. They just arrive at some other mistruths, some other conspiracy theory. And hopefully, that’s one that’s less dangerous and less extreme, and that might be the most we can hope for. I talked to one woman once who left QAnon. She realized it was all a lie, and I asked her, “Do you still believe that there’s a cabal of global elites controlling the world?” And she said, “Oh, absolutely. I just don’t think this Q person has anything to do with it.” I think it’s a little much to ask people who are this steeped in a conspiracy-theory mindset to return to consensus reality. But we can move some of these people off their more extreme positions and make them less likely to commit offline violence and harm.

Wood: Just in the past few weeks, we got essentially a real-time experiment in deplatforming, which is something disinformation researchers have called for but we haven’t really seen at scale. And I think I saw one study that said disinformation on Twitter declined something like 70% after President Trump was basically banned. Is it going to be harder to make the argument that that’s not the remedy?

Roose: I think it will become clear in the coming weeks how effective those bans were. I mean, President Trump is a special case in that he has tens of millions of followers who will seek him out on whatever platform he’s on. But I do think it’s been quieter. I mean, frankly, there’s been less prominent disinformation, there have been fewer influencers to benefit from spreading it. It’s just been eerily quiet on the internet since President Trump was banned from these big platforms. So I think that’ll continue to play out. I think there will probably be a number of the president’s followers who migrate to some other platform. It probably won’t be Parler but might be something else along those lines. And I think that will accelerate what they call the splintering of the internet into political factions.

Related links: More insight from Molly Wood

First, I feel like we are overdue for a definition, one that was explained to me a few months ago in a Zoom conference and that I failed to pass on: misinformation versus disinformation. Misinformation is stuff that’s just wrong, but can be disproven if some fact-checker comes along or someone in the comments posts the real thing. Like someone saying masks don’t protect you from the coronavirus, or that injecting bleach does. It’s disprovable.

Disinformation, however, is wrong information spread on purpose that can’t be disproven. So a big, huge conspiracy theory about pedophile satanists running the world is disinformation, but part of its insidiousness is that it can’t be disproven, because it’s just a lie. The lie about the election being stolen, for example, has elements of both.

Anyway, one of the ways researchers have asked platforms to deal with misinformation and disinformation is to stop amplifying it and to deplatform, or take down, the influencers and accounts that spread it. You heard me mention this with Kevin Roose. Specifically, a research firm called Zignal Labs found that after Twitter banned President Trump, misinformation about election fraud plummeted 73% across social media, going from 2.5 million mentions to just 688,000. Researchers will have a field day with this, and obviously, Trump is a unique case because of the sheer reach and size of his megaphone. But the researchers found that banning tens of thousands of other accounts spreading conspiracy theories led to a massive drop in conversations related to the Capitol insurrection as well.

Speaking of sources for mis- and disinformation, a federal judge in Seattle ruled Thursday that Amazon does not have to host the social media app Parler, which Amazon booted from its AWS hosting services after the app became a home for pro-Trump extremists who planned and celebrated the Capitol assault.

The future of this podcast starts with you.

Every day, Molly Wood and the “Tech” team demystify the digital economy with stories that explore more than just “Big Tech.” We’re committed to covering topics that matter to you and the world around us, diving deep into how technology intersects with climate change, inequity, and disinformation.

As part of a nonprofit newsroom, we’re counting on listeners like you to keep this public service paywall-free and available to all.

Support “Marketplace Tech” in any amount today and become a partner in our mission.

The team

Molly Wood Host
Michael Lipkin Senior Producer
Stephanie Hughes Producer
Jesus Alvarado Assistant Producer