Update, 9/21/17: Facebook announced it will turn over more than 3,000 Russian-bought election ads to the Senate and House intelligence committees. According to the New York Times, the company had previously shared some examples of the ads to Congressional staffers, but not the entire ad selection. In a video posted to his Facebook account, Mark Zuckerberg said, “While the amount of problematic content that we found so far remains relatively small, any attempted interference is a serious issue.” The original story is below.
Facebook is under scrutiny again after it revealed that it sold around $100,000 of ads during last year’s presidential election to a Russian firm with a history of peddling pro-Kremlin propaganda. Plenty of concern had already been raised over Facebook enabling alleged Russian interference in the U.S. electoral process through the spread of “fake news” by bot accounts, and now comes the news of potentially manipulative ad buying.
In the latest episode of Make Me Smart, Kai and Molly discussed the implications of potentially having Facebook CEO Mark Zuckerberg testify on Capitol Hill. While Facebook admitted it sold the ads, as The Intercept’s Sam Biddle noted, the admission raised more questions than it answered. Among other things, we still don’t know the content of the ads in question or how many people these ads reached.
Zuckerberg’s own statements on how vital a role his company plays in addressing misinformation and “fake news” have changed over time. He at first dismissed the idea that Facebook had any pull in influencing the election, calling it “pretty crazy” two days after Donald Trump won the presidency. According to Zuckerberg, “Voters make decisions based on their lived experience.”
But less than a month later, in an about-face, Zuckerberg went from shrugging off Facebook’s influence to acknowledging some responsibility in a Nov. 18 post. In what may have been an attempt at calming backlash directed at Facebook, Zuckerberg wrote that “there is more work to be done” and listed the different ways he planned to minimize the amount of misinformation on the social networking site in the future. Some of the methods outlined in the post included allowing users to more easily flag stories, working with third-party fact-checking organizations and cracking down on spam.
Facebook followed through on its promise shortly after. In December 2016, it rolled out new features to help report hoaxes and launched fact-checking partnerships with ABC News, FactCheck.org, the Associated Press, Snopes and PolitiFact.
“We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully,” wrote Adam Mosseri, Facebook’s VP of product management for News Feed, the scrolling home page that lists status updates, photos, videos, and news articles. “We’ve focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain.”
Since Zuckerberg’s Nov. 18 post acknowledging Facebook’s role in spreading misinformation and its attempts to curb it, Facebook has taken additional steps to address the spread of misinformation on the site, even starting the Facebook Journalism Project, a collaboration with news organizations.
But how serious and how effective all of these methods are might only be known to Facebook. As Politico noted, fact-checkers partnering with Facebook have no way of knowing whether their work is decreasing or increasing the spread of the stories flagged to them, because no internal data is being released. While Mosseri said in April that Facebook has seen less “fake news” on the site, the company has provided no evidence. It remains to be seen whether Facebook has really cracked down on the type of misinformation that proliferated during the U.S. presidential election, or whether it is paying lip service to appease critics.