Marketplace Tech Blogs

Can any person (or robot) put an end to fake news?

Amy Scott and Kristin Schwab Oct 6, 2017

After this week’s mass shooting in Las Vegas, both Google and Facebook promoted false news. Google, for example, featured a story from the anonymous messaging board 4chan, which identified the wrong person as the shooter. It’s a stark example of how quickly fake news can circulate, especially when a story breaks.

How much responsibility should we be putting on tech companies to vet news sources? And how can we spot a fake story ourselves? Marketplace’s Amy Scott talked with Claire Wardle, director of the nonprofit First Draft News, which provides guidance to social media sites on how to find and verify content. Below is an edited transcript of their conversation.

Amy Scott: Is it all by algorithm or are actual humans vetting the stories?

Claire Wardle: Well, many of us probably remember that there was a bit of a hoo-hah last year when Facebook did actually have humans looking at their trending topics. Somebody who had worked there was making the case that perhaps that staff wasn’t as impartial as it should have been. There was an investigation. Mark Zuckerberg looked at it and said it’s safer just to do this algorithmically, and the same with Google. And if we think about scale, all the countries in the world, all the different languages, how you make these kinds of editorial decisions, they would have to hire a large number of staff to do this. So from a scale perspective, algorithms make sense. But as we’ve seen in particular this week, and we’ve seen it a number of times around big news events, humans are the people who need to make these kinds of nuanced decisions around news. Algorithms can take us so far, but on these kinds of sensitive subjects that happen quickly, we need human brains to make these sorts of decisions.

Scott: So how are you advising Facebook and Google and other platforms to combine that robot-with-human approach?

Wardle: What we’ve seen is that the platforms, for the past few years, have talked about themselves as self-cleaning ovens. Yes, not everything is going to be right at the beginning, but over time the crowd will help change the course of the information so we get back on track. The problem is, if we have two, three, even four hours of misinformation circulating, we now have real problems. And I think the other thing we have to recognize is that there are people trying to game these algorithms. They’re doing things like Google-bombing the algorithms, repeating the same word so many times that it shifts a story up the ranking order. So we also have to recognize that there are now forces who are thinking about these algorithms and trying to reverse engineer them to ensure that the information they want to circulate, often disinformation, is what we see at the top.
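To make the “Google bombing” tactic Wardle describes concrete, here is a minimal, hypothetical sketch in Python. It is not how Google actually ranks anything; it simply shows that a ranker scoring pages by raw keyword counts can be shifted by sheer repetition, while a log-dampened score gives repetition diminishing returns. The documents and scoring functions are invented for illustration.

```python
# Illustrative only: a toy ranker, not any real search engine's algorithm.
import math
from collections import Counter

def naive_score(text: str, query: str) -> float:
    """Raw count of the query term -- easy to inflate by repetition."""
    return float(Counter(text.lower().split())[query.lower()])

def dampened_score(text: str, query: str) -> float:
    """1 + log(count), so stuffing the same word has diminishing returns."""
    count = Counter(text.lower().split())[query.lower()]
    return 1 + math.log(count) if count else 0.0

honest = "police identify the las vegas shooter after investigation"
stuffed = "shooter " * 50 + "unverified 4chan post names the shooter"

print(naive_score(honest, "shooter"), naive_score(stuffed, "shooter"))        # 1.0 vs 51.0
print(dampened_score(honest, "shooter"), dampened_score(stuffed, "shooter"))  # 1.0 vs ~4.9
```

Real ranking systems weigh many more signals than this, which is exactly why, as Wardle notes, bad actors invest effort in reverse engineering them.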

Scott: So how do algorithms learn? Is it just a matter of trial and error? Do they have to make a lot of mistakes before they figure out how to better identify real news?

Wardle: Algorithms are based on learning over time and recognizing patterns. So with certain news events, for example hurricanes, we now know that every time there’s a hurricane somebody will tweet an image of a shark swimming up an interstate. We should know that, and we should be very aware that those shark images are fake. But with lots of news events, we haven’t had a precedent, and that’s why news is difficult. This is why newsrooms have to employ many journalists to do these jobs. Within these platforms, I don’t think there’s a recognition of how difficult news is.
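The recurring hurricane-shark photo is the kind of pattern software can memorize. As a hedged sketch of that idea, not anything First Draft or the platforms are known to run, a perceptual hash can flag near-duplicate re-uploads of an already debunked image even after resizing or recompression. The sketch assumes the third-party Pillow and imagehash packages, and the file paths are placeholders.

```python
# Hypothetical sketch: flag re-uploads of known hoax images via perceptual hashing.
# Requires `pip install Pillow imagehash`; the paths below are placeholders.
from PIL import Image
import imagehash

# Perceptual hashes of images debunked in past events (e.g., the hurricane shark).
KNOWN_HOAXES = {
    imagehash.phash(Image.open("hoaxes/hurricane_shark.jpg")): "shark on flooded highway",
}

def matches_known_hoax(path: str, max_distance: int = 8):
    """Return a hoax label if the image is a near-duplicate of a known fake, else None."""
    candidate = imagehash.phash(Image.open(path))
    for hoax_hash, label in KNOWN_HOAXES.items():
        # Subtracting two imagehash values gives the Hamming distance between
        # 64-bit perceptual hashes; a small distance means a near-duplicate.
        if candidate - hoax_hash <= max_distance:
            return label
    return None

print(matches_known_hoax("uploads/viral_storm_photo.jpg"))
```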

Scott: I wonder if we need sort of a reset about what we think of as news. For example, the 4chan thread that identified the wrong person as the shooter in Las Vegas rose to the top of Google’s top stories in its general search results. And that word “story” implies news. Do you think that in some ways it might be better to go back to the days when there were a few respected, well-established news sources appearing on these sites?

Wardle: It’s insane to think that 4chan would have been labeled as a news source. And I do think that labeling is an issue here. How that top carousel box is populated is an issue. And I certainly don’t want 4chan to be in that top carousel. I think Google News, the actual news website, does have a kind of whitelist of sites that are deemed news content. The difficulty with the top carousel the other day, when people were searching for that man’s name, which was the incorrect name, is that it was pulling in from any source of information. There was so little out there, and people on 4chan were pushing this false information, and that’s how the mistake got made. But if we’re going to start talking about blacklisting and whitelisting, in one sense we can say, yes, of course, just NPR and the BBC and The New York Times. But there are incredibly informed bloggers who write really rich and important content on their sites. Does that mean it shouldn’t be surfaced on Google? I think we need to decide: when we say Google, do we mean all the links, or that top carousel, or Google News? And I think there should be a wider conversation in society about what we mean by those different elements of Google, or the equivalent elements on Facebook. Nobody wants to suppress information, but these are not easy problems to solve.
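As a purely illustrative sketch of the whitelisting idea, not Google’s actual system, a curation layer could restrict a “top stories” box to an allowlist of vetted domains while leaving general search results untouched. The domain list and URLs below are placeholders, not a policy recommendation.

```python
# Hypothetical allowlist filter for a "top stories" carousel; domains are placeholders.
from urllib.parse import urlparse

ALLOWLISTED_DOMAINS = {"npr.org", "bbc.com", "nytimes.com"}

def is_allowlisted(url: str) -> bool:
    """True if the URL's host (minus a leading 'www.') is on the allowlist."""
    host = urlparse(url).netloc.lower()
    return (host[4:] if host.startswith("www.") else host) in ALLOWLISTED_DOMAINS

candidates = [
    "https://www.npr.org/2017/10/06/vegas-coverage",
    "https://boards.4chan.org/pol/thread/123456",
]
print([url for url in candidates if is_allowlisted(url)])  # only the npr.org link survives
```

Wardle’s point about informed bloggers is exactly the cost of such a filter: anything off the list disappears, however good it is.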

Scott: We the readers of news also bear some responsibility. I mean what role do you think just basic media literacy has in this equation? What should we all be on the lookout for when we’re evaluating a piece of news?

Wardle: So unfortunately we’re now in a situation where our information streams are increasingly polluted. We need to be skeptical of the information we’re receiving, and particularly skeptical of our own emotional responses. If something makes us feel smug or angry or upset, we should be even more critical of what we’re seeing, because what actually happens is the reverse: our critical brains don’t kick in. So we should be looking and checking, doing Google reverse image searches if we see images from the hurricanes. We should be researching the information we see. But I do worry a little bit about putting the onus onto users. I do think the platforms themselves need to be doing more to think through these challenges. If someone sends me an email saying “oh my god, the name of the shooter was X,” and I put that into Google and it comes up as a top story, I’m going to trust Google, because we would assume they had done that kind of editorial filtering. So yes, we as consumers need to be more savvy, but at the same time, I would really like to see the platforms thinking more critically about the information they’re surfacing around big breaking news events, where we know the information is going to be problematic, particularly at the very beginning. And in those moments, that’s when you bring in those editorial teams and say we need extra pairs of eyes looking at this.

Scott: And we’ve seen how hard it is to unring that bell once it’s been rung. When someone shares fake news, are they really going to share the correction?

Wardle: Well, yes, we absolutely know that no matter how many times you say something is false, the correction doesn’t travel anywhere near as fast or as wide as the original piece of false content.
