The coronavirus pandemic has gone hand in hand with an infodemic of misinformation about everything from homemade cures to whether masks work (spoiler alert: They do). Now, if possible, the misinformation stakes have gotten even higher as the COVID-19 vaccine begins to roll out. Doses began being administered in the United Kingdom on Tuesday, and disinformation researchers say there's a renewed wave of activity spreading lies about vaccine safety and the origin of the virus.
I spoke with Joan Donovan, the research director of the Shorenstein Center on Media, Politics and Public Policy at Harvard Kennedy School. The following is an edited transcript of our conversation.
Joan Donovan: Over the past couple of weeks, I've had several doctors, as well as hospital librarians, asking us for help, because people are showing up with junk science at their back, asking about vaccine safety. And doctors are wondering, "Why is it these questions? Why now at this scale?" We also need to do quite a bit of cleanup online, especially around very key phrases related to vaccine ingredients and vaccine harms, because we have over a decade now of vaccine misinformation littered about the internet, much of which has been waiting for this moment to burst into the public consciousness.
Molly Wood: And what do you think about some of the tools that the platforms have employed? I mean, the primary one seems to just be this labeling and linking to some vetted resource. You’ve been a little bit critical of that.
Donovan: In some instances, as I click through those labels, they're not immediately relevant, or they're not targeted enough in terms of pointing people toward accurate information. They've almost started to become, at least in the right-wing media ecosystem, a bit of a joke or a badge of honor to some folks. And I think something a label triggers in all of us is the assumption that things that are not labeled may be more true than they are.
Wood: In some ways, this feels like a bit of a Groundhog Day conversation: The platforms are never quite doing enough, and they're too big, and there is no regulation. And yet I think you and I would still agree that we also don't necessarily want these platforms fully in charge of deciding what we see, given their scale, right?
Donovan: What I’m interested in really understanding is how are they going to describe and define what counts as misinformation on their platform? And then, instead of retroactively really trying to Band-Aid the situation, what is their design solution that doesn’t allow moneyed interests and people with broad political reach to turn those systems against society?
Wood: So you’re saying there’s a big delta between what they’re doing now and even them becoming publishers in the sense where they’re liable for what goes up?
Donovan: Yeah, it's Groundhog Day again, but it feels like a fool's errand to care deeply about any politician pulling the rug out by taking Section 230 away, for instance, when it comes to trying to get platforms to stabilize how they serve information. That's what the struggle is here. When [President Donald] Trump says that we've got to get rid of [Section] 230, it's because the information ecosystem is so unstable that he can't wield it to his advantage. We have to be cognizant of who's calling for this removal and under what conditions. But we're going to have a continuous struggle over large, centralized communication systems in our world. No matter what is done now, the question, when it comes to centralizing our communications, is always going to be: Are these platform companies being responsible to the broadest public good? And at this stage, we can demonstrate no.
Wood: From your perspective as a professor, in a weird way it seems like social media platforms at this size are something we also have to build an immunity to. They're all relatively new, and neither our brains nor our society has figured out how to handle this new type of virus. Should curriculum be a part of that? Should we be designing curriculum for schools, to teach children how to recognize truth?
Donovan: I think our educators do a fairly good job of that. What’s hard is when you go online and you see a company like Google who’s tasked themselves with organizing the world’s information, but then they don’t show you how they organize it. And so it becomes a really complicated question for younger folks to say, “Well, I have access to more information than any human being in the history of the world. And yet I have trouble finding true things.” Or accurate things — let’s maybe scale it down one philosophical notch.
I do think, though, that we need a curriculum for understanding it, just as we would teach people the practice of citations and why that matters. We need to help people understand what it is they're seeing when they search for information on any of these platforms. It's a hard problem, though. I mean, back in my day — not that I'm that old — Wikipedia was considered this big enemy of the university. How could they possibly have so much information on Wikipedia and it all be true? And so teachers would say, "You can't cite Wikipedia." But it's really where we all began our exploration, even if we didn't necessarily show it. At this point, we also have to wonder about the flow of timely, local, relevant and accurate information. So one of the things we have to redesign, as we think about the openness and scale of these platforms, is the degree to which we also open up knowledge and make sure that the world's informational resources and scientific publications are available for people to explore and understand. By and large, journalists are the ones who provide that window into science — popular science, anyhow — and it's hard, because as platforms take control, we see journalists losing resources and audience.
Related links: More insight from Molly Wood
Facebook said last week it will remove posts that spread debunked vaccine misinformation, but as people like Donovan note, anti-vaccine groups are well entrenched on the platform.
And more great reporting on this beat from Brandy Zadrozny over at NBC News on how the anti-vaccination movement already had plenty of steam on social media and has only picked up more in 2020. One report found that anti-vaxxer groups gained about 8 million new followers since the end of 2019, with Instagram and Facebook the biggest hotbeds. Research published in the journal Nature suggested disinformation about vaccines may have reached and persuaded some 100 million people on Facebook alone. On the very, very faint silver lining side, though, one researcher told NBC News that it actually can be effective when Facebook bans big anti-vaccine accounts or groups, because even though they might have a fit and tell their followers to go find them on Parler or Gab, or wherever, most of the time people don't, because they like it on Facebook.
I also recommend a long read that BuzzFeed published Sunday on how much damage disinformation has done to us as a country in this year alone. It's grim. But if you're sitting in your house right now thinking you simply cannot understand how one side could believe what it does, when it's so obviously wrong that it makes no sense at all, realize that it only makes sense if the two sides are living in completely parallel universes with completely different realities. And then know that that's been happening in this country for a lot longer than just 2020.