There’s an information war going on, too
Mar 8, 2022
Episode 615


And Ukraine is winning. On today's show we'll talk misinformation, disinformation and how you can make sure you don't fall for either.

There are no winners in the war in Ukraine. Civilian casualties are mounting as Ukrainians resist Russian invaders. Meanwhile, sanctions from the West have crippled Russia’s economy, as Americans and European allies worry about rising oil prices and cyberattacks.

But on social media, things are playing out differently.

“The information war is pretty clearly being won by Ukraine right now,” said Laura Edelson, a misinformation researcher at New York University and co-director of the Cybersecurity for Democracy project. “A very old part of war is telling your side’s story to win public support and to erode the enemy’s morale. But everything is sort of amplified when it comes into the online space.”

Ukrainians — from regular folks up to and including President Volodymyr Zelenskyy — have been effective communicators, Edelson said. Meanwhile, Russian disinformation has been effectively countered by the global powers aiding Ukraine.

We should define some terms here. Misinformation is just information that is not factual; you might find and spread it inadvertently. Disinformation is misinformation that’s purposely spread to do damage. Russia wields disinformation campaigns, of course, but we’re also seeing social media users misrepresenting videos and photos, like the debunked “Ghost of Kyiv.” All of that is shaping our understanding of this conflict.

Today Edelson makes us smart on the current state of the information war, where platforms have fallen short in protecting users and what regular folks can do to protect themselves instead.

Later on, we’ll talk about eggcorns, rising gas prices and the anti-lynching bill Congress somehow just passed. Plus, a listener sends in a hilarious, festive answer to the Make Me Smart Question.


Give today to support Make Me Smart.

Make Me Smart March 8, 2022 

Note: Marketplace podcasts are meant to be heard, with emphasis, tone and audio elements a transcript can’t capture. Transcripts are generated using a combination of automated software and human transcribers, and may contain errors. Please check the corresponding audio before quoting it.

Kai Ryssdal: Hey, everybody, I’m Kai Ryssdal. Welcome back to Make Me Smart, or just welcome. If this is your first time here, thank you, where you been? Here’s what we say around here. None of us is as smart as all of us. That’s the whole drill with this podcast.

Kimberly Adams: And I’m Kimberly Adams. It is Tuesday, which for those of you who are new means it’s time for us to do a deep dive, or shall we say, a deeper dive into one particular topic. And today, we are going to talk about disinformation and the way in which Russia has used it as one of its tools of war in its invasion of Ukraine, and also just the way information warfare is playing out in this conflict.

Kai Ryssdal: Yeah, that last little bit is key, right? Because in addition to the horrible things that are happening actually on the ground, there is a very calculated, on both sides, by the way, information war that is being fought. So some terms: misinformation, information that is not factual, it is just wrong. Disinformation is misinformation that is intentionally spread to be wrong, and to mislead, and to get people to believe things that aren’t so. And as I said, there are things being done on both sides that we sort of need to interrogate, to talk about what’s actually happening and what’s not. And that’s why and how it’s shaping our understanding of this entire war.

Kimberly Adams: So that is what we’re going to talk about today, the state of that information war, and what tech companies and governments say that they’re doing to combat this mis- and disinformation, and whether it will work in the way that we think it will. And so here to make us smart about this is Laura Edelson, misinformation researcher and co-director of the Cybersecurity for Democracy Project at NYU. Hey, Laura.

Laura Edelson: Hey, thanks for having me.

Kimberly Adams: So at this point, if you had to pick, who is winning the information war?

Laura Edelson: Well, the information war is pretty clearly being won by Ukraine right now. Which might be a little bit different from the ground war, depending on what day it is.

Kai Ryssdal: Say more about that, right? Because you’ve got some personalities involved. You’ve got the good guys and the bad guys. I mean, there’s a lot going on in the information war.

Laura Edelson: Yeah, I think it’s really been fascinating over the last couple of weeks watching all this play out, because, you know, historically, Russia is one of the really long-standing, very practiced spreaders of disinformation. They are an actor that we have watched do this for many years now. And I’m sure everyone is familiar with the fact that Russia spread misinformation, disinformation, and attempted to meddle in the 2016 US presidential election via Facebook ads. You know, I think that was not the first or the last time they tried to engage in this. But one of the things that was, quite honestly, fascinating to watch about the first week of this conflict is the fact that, one, the US government was really proactive about making sure that there was factual information out there to the public about what was going on in Russia and what was going on in Ukraine. And I think that created a climate where the sort of normal tricks, the disinformation tricks, of creating a false narrative and clouding the facts and misdirection, it was just much harder for that to take hold, because there was more factual information out there. The other thing that’s been really fascinating to watch play out is just how effective the Ukrainians have been at getting their narrative out there. Obviously, President Zelenskyy has really been an incredibly powerful communicator. And right, it’s not just communication, he has also shown really tremendous personal bravery. And those two things together have, you know, really given the world a very clear picture of who the Ukrainian people are. And they’ve been reinforcing this message via social media in dozens of ways, by making it clear that they are going to fight for their country, but also that they’re human and they’re caring.
And they’ve been showing the second part in lots of ways, from cat videos to videos of Ukrainians helping captured Russian soldiers, you know, call their moms. I mean, I think this has been an incredibly powerful message that we all are receiving and all want to support.

Kimberly Adams: I heard you say narrative several times there. And, you know, this idea of propaganda and probably versions of disinformation and misinformation have been around in warfare forever. But how important is this concept – and how different is this concept of narrative shaping?

Laura Edelson: You know, I think a very old part of war is telling your side’s story, to win public support and to erode the enemy’s morale. But everything is sort of amplified when it comes into the online space. And I think that is really powerful. So one really concrete thing that we saw in that first week: I think if you’d said a month ago that governments like Germany, like Japan, like all of the countries in Europe would be sending Ukraine not just defensive things like helmets, but weapons, I think many people would be very surprised. But again, that first week, because of the narrative that the Ukrainian government made sure was out there, I think that was incredibly powerful for rallying support. And they really were very effective at using online communication channels to do this. Right, everyone has seen President Zelenskyy’s selfie videos, they are incredibly compelling. You know, you mentioned propaganda, and that is absolutely a piece of this. You could definitely call a lot of what the Ukrainian government has been spreading propaganda, but I think they are at least attempting to walk that line of staying on the side of propaganda that is at least somewhat factual. They’ve definitely stepped over that line in a couple of circumstances where I think they’ve gone too far and, you know, spread content that is out of context, or sometimes false. But I think for the most part, they’ve just been really trying to get their story out there. And they’ve been pretty effective at that.

Kai Ryssdal: My guess is that Zelenskyy, and whoever is advising him on this, knows that they’re doing pretty well in the information war. Do you suppose that Putin, and whoever’s advising him, if anyone, knows that they’re getting whacked in the information war?

Laura Edelson: Absolutely. You know, I think you have to think about what these respective governments’ goals are for the information war. Something that was very, very important for Ukraine, just to keep holding on, was to get various kinds of military support from the rest of the world. And they’ve clearly been effective at that. And that’s why a lot of their efforts were aimed at Europe, they were aimed at the rest of us who live in liberal Western democracies, who might be a base of support. And they have also been trying to communicate, of course, to the Russian-speaking diaspora, and to Russians, right, to encourage them to not support the war and to call on their leaders to end this. Russia has a different set of objectives. You know, I think it is really hard for anyone to look at this conflict and say that it was rational for Russia to take this on. Obviously, it has not gone to plan for them. It’s been incredibly costly for them in terms of dollars, and in terms of lives lost, far more so than they probably expected. And so I think that right now, their objectives are more about figuring out how to survive things like sanctions, things like the ongoing reputational damage to their country. So I think they are trying to figure out how they can solidify the relationships they have left, which are based more in, frankly, Asia and Africa. So, you know, if I were Russia, those would be the arenas that I would be really thinking about the information war within.

Kimberly Adams: Can you step back for a second and tell me, tell all of us, what the mis- and disinformation looks like in this conflict? I mean, is it made-up stories, is it, you know, like purposefully false narratives? Like, can you give a little bit more context to that?

Laura Edelson: It’s a little bit of everything, right? One of the things that we know the Russians were planning on was staging an atrocity, right? That’s one end of the spectrum, you know, literally a wag-the-dog scenario where they were planning to create a false video of an atrocity that had been theoretically committed by the Ukrainians, that was entirely made up. And you know, that is something that we were very effective at pre-bunking. The US government was effective with that, I should say. So that’s one end of the spectrum. The other end, frankly, is doing small things like convincing Russian speakers that the war is going better than it is, by doing things like finding videos of Ukrainians saying, you know, “yes, we’re glad you’re here,” and then pulling them out of context, saying this isn’t happening in Donbas, it’s happening in the western part of the country. Right. And there’s a lot of middle ground too. It’s very much around figuring out what are the conclusions that you want people to come away with, and then feeding them content of one kind or another that helps them support that idea.

Kai Ryssdal: Can we talk platforms here for a second? Because we’ve been talking governments, right, the Russians and the Ukrainians? What about the big tech companies? The Facebooks and the Twitters? First of all, what do you think their responsibilities are in a situation like this?

Laura Edelson: That’s such a tough question. You know, I believe that platforms have a responsibility to maintain the spaces that they curate. I think Facebook has a responsibility to root out harmful content, and that includes disinformation that’s on its platform; so does YouTube, so does TikTok. However, this is something that platforms have historically been really, really bad at, and I don’t know that I can really expect that this is the time they’re going to do better. But with that said, I will say that they did take some early steps that were really positive. So I think that the banning of major Russian state media outlets was a really, really important and useful first step that they’ve been hesitant to take in the past, even though, as I said, Russian state media is a really well-known spreader of disinformation and has been for years. So taking those sorts of big steps early was really, really helpful. That said, I am really concerned about whether the major platforms will be able to maintain the attention on this conflict, and on continually rooting out the misinformation that pops up. Because taking those big, bold steps early is relatively easy, but maintaining your monitoring for Russian attacks, that’s going to require investment over the course of months or possibly even years, and we don’t know that platforms are willing to take those steps. And I would hope that they come out and talk about that a little bit more and be a little bit more transparent about what kind of steps they are taking to protect their users and their platforms from the kind of disinformation we’ve seen from the Russians in the past.

Kimberly Adams: Well, then for the rest of us, as we’re scrolling through, what should we be keeping in mind, especially since you’re not super confident the platforms are going to do what they need to do?

Laura Edelson: Yeah, so what I always try to tell people is, first of all, this question is a bit like being asked, “hey, my car doesn’t have a seatbelt, so what should I do to protect myself in a crash?” Well, you shouldn’t have to; your car should just have a seatbelt. But given that it doesn’t, there are some things you can do. First of all, there are certain things that should always send up your radar. If you see a post that is making a really strong appeal to emotion, whether that emotion is anger, or rage, or, you know, joy, I would be just a little bit suspicious of that. Like, if you’re trying to get me to emotionally react rather than think about what you’re saying, I’m going to at least wonder why you’re doing that. And then the next thing I would do, once my radar is up, but probably in general, is look at who is posting whatever you’re seeing. Is it a mainstream media outlet that you think has fact checkers, and you think has processes to verify that information is accurate before they post it? Or is it, like, your uncle? I would check who the source is and just ask myself, do I trust this source? And if it’s something that maybe I question, but I want to know more about, what I might do next is, instead of resharing the post and saying, you know, “what do you think, do you think this is accurate?”, I might just take a screenshot and maybe send it to people who I thought had more context, more information. Rather than sharing that original post, which might inadvertently spread misinformation, if you just take a screenshot, or if you just take down the link, and then maybe ask someone about it, that’s a way you can get more information without necessarily sharing something that you haven’t verified.

Kimberly Adams: Laura Edelson, who is a misinformation researcher and co-director of the Cybersecurity for Democracy Project at NYU. Thank you so much, Laura.

Kai Ryssdal: Thanks a lot.

Laura Edelson: Thanks for having me.

Kai Ryssdal: Super tough problem. Super tough problem, you know, and I think the longer this thing drags on, and I think it actually is gonna drag on for a long time, the information war is going to become even more important, as important as it was in the early days, I think it becomes more important as it drags on. That’s what I think, because that’s about morale, and spirit and support and all of that.

Kimberly Adams: And I think this difference between misinformation and disinformation gets so muddy when people want to be supportive of Ukraine in particular. And so, you know, that whole, what was it, like the Ghost of Kyiv?

Kai Ryssdal: Oh, yeah the fighter jet.

Kimberly Adams: Fighter pilot that was supposedly shooting down all these Russian jets. And it just wasn’t true. Or at least there was no evidence for it. And so people who wanted to be supportive of Ukraine were sharing it and promoting it. And that’s misinformation, but not disinformation, because you didn’t necessarily mean to be sharing factually inaccurate information. And yet, there it is. And careful out there on those internets, friends.

Kai Ryssdal: Yeah, but I’ll tell you what, look, here’s the deal. We have all unknowingly shared misinformation. It just happens, especially in this day and age of one click and it’s out to, you know, the entire internet. And that’s a problem. So if that’s happened to you, let us know. Let us know what you think of this conversation. Let us know how you stay safe and stay smart about misinformation and disinformation, because it’s not getting any less important. That’s all I’m gonna say. 508-827-6278 is the phone number, 508-UB-SMART. You can leave us a voice memo at makemesmart@marketplace.org. We’ll get it on the pod and we will share your thoughts. We will also, by the way, be right back.

Kimberly Adams: Okay, time for the news fix Kai. You should go first because yours is Russia related.

Kai Ryssdal: Mine is Russia related. So here’s the deal. Everybody has heard this by now, by the time this podcast drops, everybody will know that President Biden this morning said we’re not going to import any more Russian energy. About 3% of our oil comes from the Russians, about 8% total of oil and petroleum distillates comes from the Russians. So here’s the deal: that will help push gas prices higher. The reason I wanted to say this is not because all y’all will not have heard this, because you will have, but for something that President Biden said in his announcement this morning. He said, I will do anything I can, I’m paraphrasing, to make sure American consumers don’t have to suffer from Putin’s price hike. “Putin’s price hike” is a direct quote. And I bring that up for two reasons. Number one, you’re going to be hearing “Putin’s price hike” in the November elections, like I can’t even tell you, because the Republicans obviously are going to clobber the president over high gas prices. So that’s number one, the political reality that’s going to happen. The other reality of this, though, that everybody needs to understand, is that there is darn near jack all, save maybe emptying the Strategic Petroleum Reserve tomorrow, that the president can really, really do in the short term to bring down energy prices. He just can’t. Right. And we need to understand that. And we need to acknowledge what that means: gas prices are going to be high, producers are going to climb back in because it’s $130 a barrel now, but that’s going to take some time. And this is the short-term future, and we all need to be clear-eyed about that, I guess, is the deal.

Kimberly Adams: And for those who have not heard it yet, I’ll point to Andy Uhler’s piece on your show, was it yesterday or the day before? It was great because it really laid out why a lot of U.S. producers can’t necessarily maximize tapping into, you know, U.S. oil, because the infrastructure is not there for it. It’s – Andy explained it way better than me. I’m sure it’s going to be in the show notes. But it’s a really useful way to understand why it’s not so easy to just turn on the taps of American oil to balance this out. Okay, so here is my contribution to the “Congress does nothing” narrative, which is that Congress has done something. It took more than 100 years and about 200 failed attempts, but Congress has finally passed an anti-lynching bill, specifically the Emmett Till anti-lynching bill, which criminalizes lynching, making it punishable by up to 30 years in prison. And I want to make sure I get this right: basically, the bill would make it possible to prosecute a crime as a lynching when a conspiracy to commit a hate crime results in death or serious bodily injury. And, you know, it’s such a moment, because there have been so many of these times throughout American history, and the fact that it has been so hard and taken so long to pass this legislation. It’s just a really important moment. And for all of the complaints we have about Congress not doing anything, this is something that they’re doing. Look, I would be astonished if President Biden did not sign it. All indications are that he will, and it passed with almost no opposition through Congress, now…

Kai Ryssdal: Almost is doing a lot of work there. Because how are you possibly opposed to this bill? Could I just ask that question? Sorry, I, I’m not yelling at you. You know what I’m yelling at.

Kimberly Adams: I know what you’re yelling at. Yeah, um, but it’s done. Or it’s just about done. It’s about to become law. And I think it’s just a moment worth marking in American history. So for sure, there’s that.

Kai Ryssdal: Alright, Jayk, let’s go.

Kimberly Adams: Okay, first up voicemail from a listener who had this to say about some of the economic effects of Russia’s invasion of Ukraine.

Jody: Hi, this is Jody in Minneapolis. And I think one thing that is missing from all the discussion about Americans being willing to pay higher gas prices is that there are still many places where the minimum wage is 7.25 an hour, which means they literally cannot buy two gallons of gas. And so this is having a huge impact on folks at the lower end of the income scale. So thanks a lot for everything you do. And hopefully we’ll see more coverage on this in the future.

Kai Ryssdal: Yeah, I mean, that’s absolutely true, right? Almost everything in this economy falls disproportionately upon the lower-income Americans out there, and especially now, because they’re the ones who have to drive farther to work, right? Or don’t have the disposable income to pay. You know, if you’re here in LA and you need to drive your truck to work or something, you’re paying, no joke, $5.49 a gallon. Yeah. So yes, absolutely, to everything that Jody said. For sure. For sure.

Kimberly Adams: Yeah. And we’re in a moment right now where, if you are wealthier and work in a better-paid job, you’re more likely to be able to telework. So when gas prices get too high, you might be able to work from home more often. Which is not available to everyone. So yes. All of the things that Jody said.

Kai Ryssdal: For sure. Okay. Last week, we had an answer to the Make Me Smart question where somebody realized that what people were saying was that they were on tenterhooks, not tenderhooks, and I tried to get all fancy and remember the grammatical term for what that is. Lots of you wrote in. Thank you all. And here’s just one.

Bruce: Hi, this is Bruce, I’m calling from Alameda, California. And I just listened to today, March 1st’s, podcast. At the end, you were looking for the word that describes kind of an alteration, and the word is eggcorn, E-G-G-C-O-R-N, a word or phrase that is a seemingly logical alteration of another word or phrase that sounds similar and has been misheard or misinterpreted. It was actually a dictionary.com word of the day on Friday, February 9th, and it popped into my head, too. Thanks for reminding me.

Kai Ryssdal: Wow. Awesome, Bruce. Thank you. And thank you everybody who wrote in that was a good one.

Kimberly Adams: Yeah, yeah. Okay. I’m looking at the producer note here, which says that you might also have been thinking of, what is that, mondegreen?

Kai Ryssdal: Mondegreen I think is the word, yeah.

Kimberly Adams: Mondegreens, which are words or phrases that are misheard in songs or poems. And there’s a Time story that they’re going to link to on the show page for people who want even more detail on these verbal gymnastics.

Kai Ryssdal: Make me smart, that’s what we do.

Kimberly Adams: We are still looking. Yes. We are still looking for your answers to the Make Me Smart question, which is “what is something you thought you knew, but later found out you were wrong about?” Keep sending those to us as a voice memo to our email at makemesmart@marketplace.org. Or leave us a message at 508-827-6278. That’s 508-UB-SMART.

Kai Ryssdal: Here is today’s answer to that question. And just to protect everybody who listens to this: if you’ve got small kids, it’s not bad, but trust me on this one, you’re gonna want to just, like, turn off the podcast for the next, like, 30 seconds, okay? We’ll be right back.

Bishanna: Hi, my name is Bishanna and I’m calling from Franklin, Tennessee. And something that I thought I knew and later found out I was wrong about is that reindeer were make believe. So I just assumed reindeer, the North Pole. Anything that had to do with Santa Claus was all make believe. And then at 30 years old, I moved to Alaska, and found out reindeer were actually real. I discovered this in a diner when I saw reindeer sausage on my menu. And I asked the server what it was made of. And they said reindeer. I said Yeah, but what’s it really made of? And they were like reindeer. And I said, But reindeer aren’t real. So what’s it made of? And he’s like, uh, reindeer are very real. So yeah, that is something that I thought I knew and later found out I was wrong about.

Kimberly Adams: That’s amazing.

Kai Ryssdal: Well, yeah, no reindeer. Yeah. Like they’re up. They’re up in Finland and Lapland and all that good stuff. Yeah, totally. They’re totally real, totally real.

Kimberly Adams:  I have to admit, when someone first described narwhals to me, I was very skeptical.

Kai Ryssdal: Yeah, there’s all kinds of stuff out there that’s real that you don’t think is. Super quick update on the way out here for those of you who signed up for the Make Me Smart newsletter or any other Marketplace newsletter when we were doing that little giveaway last month. Thank you for signing up. The three of you who have won those t-shirts of me from back in the day, when I was young and handsome and all that stuff, you have been notified already. I’m going to be signing those shirts next week and we’ll get them out to you. But thanks, everybody, for signing up for that newsletter. Tony and the gang worked hard on it and it’s always good stuff. Ellen Rolfes as well. Sorry, Tony. I don’t even know what Tony does these days. Something.

Kimberly Adams: Tony does everything but I don’t think that any more.

Kai Ryssdal: Alright we’re done.

Kimberly Adams: That is it for us today. We will be back tomorrow with Whaddya Want to Know Wednesday.

Kai Ryssdal: Make Me Smart is directed and produced by Marissa Cabrera today with help from Tony Wagner. There’s Tony I know he does something. The team also includes producer Marque Greene and Ellen Rolfes who writes the newsletter. Ellen I’m really really sorry. Tiffany Bui’s our intern.

Kimberly Adams: She’s gonna get you. Today’s program was engineered by Jayk Cherry with mixing by Bekah Wineman. Ben Tolliday and Daniel Ramirez composed our theme music. The senior producer is Bridget Bodnar. Donna Tam is the director of On Demand, and Marketplace’s vice president and general manager is Neal Scarbrough. And you know, I don’t think you should link the young and handsome Kai. Like, you know, I’m sure there are variations in all of those categories.

Kai Ryssdal: Thank you, you’re my new favorite person.

None of us is as smart as all of us.

No matter how bananapants your day is, “Make Me Smart” is here to help you through it all, five days a week.

It’s never just a one-way conversation. Your questions, reactions, and donations are a vital part of the show. And we’re grateful for every single one.

Donate any amount to become a Marketplace Investor and help make us smarter (and make us smile!) every day.

The team

Marissa Cabrera Producer
Bridget Bodnar Senior producer
Tony Wagner Digital Producer
Marque Greene Associate Producer