Could fake news on Facebook have swung the presidential election one way or the other?
The debate is raging, and despite denying that false information on Facebook influenced voting, the company announced this week that it’ll try to cut off the financial incentives for creating and publishing false information on its site. Facebook will update its ad policies, which already ban deceptive or misleading content, to explicitly include fake news.
And Google, which accidentally promoted a false story at the top of its search results (for “final election count”) just a day or two after the election, said it will also ban fake news outlets from using its AdSense advertising network.
Will the measures work? Maybe. Cutting off the money is a pretty good disincentive — but only if the policies are actually enforced. And enforcing the policies suggests that both Google and Facebook have some kind of technology for determining whether news is fake, and that’ll lead to a host of questions about why they don’t just stop it from disseminating in the first place. And none of that will change the fact that both Google and Facebook have an incentive to keep promoting interesting, click-worthy, shareable content in order to serve their bottom lines, and that doesn’t always lead to the highest quality stuff, as I think we’ve amply discovered.
But how did we get here in the first place?
Let’s start by saying that in the history of modern media, there aren’t many parallels to the way Google and Facebook distribute news.
Neither is, strictly speaking, a news publisher — a distinction that Facebook CEO Mark Zuckerberg has clung to with increasingly white knuckles for years. (Fun fact: the question of whether Facebook is a media company actually started cropping up around 2012, right before Facebook’s IPO, when the company vociferously rejected any association with the financially shaky “media” label.)
But both are in the business of distributing information, attaching advertising to that information and deriving revenue from the resulting interactions. The only difference is that they don’t generate the information themselves. Nevertheless, Google and especially Facebook are on the hook for what those stories say — partly as a matter of reach and power, yes. Facebook reaches 67 percent of adults in the United States, according to the Pew Research Center, and two-thirds of those adults get some or even all of their news there.
But the other issue is that each company has its own way of tampering with the information you see in order to make money.
Google ranks search results using its PageRank algorithm, among other signals, in the hope of making those results more accurate and trustworthy. And Facebook’s News Feed algorithm similarly scores and sorts stories to determine what you see when you’re browsing the site, with the goal of assembling a collection of posts that keeps you there as long as possible.
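Neither company publishes its ranking code, but the link-voting idea at the heart of the original PageRank paper is simple enough to sketch. What follows is a toy illustration only — the function, the tiny `web` graph and the parameter choices are assumptions for demonstration, not Google’s actual system, which blends hundreds of additional signals:

```python
# Heavily simplified PageRank sketch: links act as "votes" that pass
# authority from page to page, iterated until the scores settle.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline score (the "random surfer"
        # occasionally jumps to an arbitrary page)...
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        # ...and the rest of each page's current score is split evenly
        # among the pages it links to.
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling pages simply pass nothing on here
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# A page that attracts the most link authority floats to the top.
web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # "c" — linked to by both "a" and "b"
```

Run on the toy graph, the page with the most incoming links wins — which is also the dynamic that lets a heavily shared fake story outrank a sober correction.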
And in both cases, the more you use the sites and the more ads you click, the more money the sites make.
Plus, up until the changes announced this week, Google and Facebook have allowed publishers to sell ads on their news sites, whether the information is true or not. So there’s also been a clear financial incentive for people to create and distribute any story that will get clicks, sell ads on those stories and reap the benefits.
And thus, the election.
American consumers have proven highly engaged with news about the 2016 presidential election — particularly with news about Donald Trump. According to marketing startup Keywee, stories about Trump got about eight times more likes than stories about Hillary Clinton; the Trump news bonanza has delivered record traffic and revenue for media outlets ranging from the New York Times to NPR to, well, Facebook.
So there was even more incentive than usual to create fake stories that might make money — and if you had an interest in the outcome of the election, so much the better. And now we’re wondering whether these amplification algorithms, these engagement tools, these mysterious chunks of code, might have actually influenced the outcome of an extremely close race.
According to Mark Zuckerberg, that’s “a pretty crazy idea.” According to Google CEO Sundar Pichai, it’s a solid maybe.
He told the BBC on Tuesday that given the extremely tight margins of the election, anything that could swing the vote of “one in a hundred voters” is worth considering. Pichai said Google’s goal shouldn’t be to debate the influence of news, but to work hard on improving its tools to create maximum accuracy and get rid of incentives to game the system.
For both sites, interestingly, the biggest reason to clean up the news comes down to trust, which is exactly the thing that keeps reputable news sources from lying about or overly distorting the news. The thinking is (or at least has been) that if you lie too much, people will stop trusting you and they’ll go somewhere else.
Facebook is skating close to that edge. It’s a pretty crazy idea that Facebook can remove itself entirely from a conversation about the influence of the news it presents, and Mark Zuckerberg should stop pretending to be a disinterested purveyor of interesting links.
Facebook has hired and then fired human curators to present news, and it has regularly censored everything from photos of breastfeeding women (actually, really just women, a lot, like really, really a lot) to photos of burn victims to historical war photos to posts about controversial political issues and potentially even conservative news stories writ large — all while apparently exempting Donald Trump’s posts from its hate speech policies.
The reason the News Feed algorithm exists is to encourage people to visit Facebook and use it as a constant and, ultimately, reliable source for information. Google is its biggest competitor. If Facebook can’t get its act together and clean up the information on its site, we’ll eventually start to see it as untrustworthy and go back to good old search engines. Best-case scenario? Facebook and Google get locked in a truth war. Hey, a person can dream.