Though the CEO insists that Facebook is not a media company and that blaming Facebook for the outcome of the US election is "pretty crazy," he has promised to fight misinformation on the site.
"The bottom line is: we take misinformation seriously," he wrote in a November 19 Facebook post.
"Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We've been working on this problem for a long time and we take this responsibility seriously. We've made significant progress, but there is more work to be done."
Projects are already underway to improve the site's ability to classify misinformation before it even reaches the public, and to make it easier for Facebook users to report it after the fact.
The company also says it plans to learn from fact-checking organizations, is exploring ways to label stories that have been flagged as false when people share them, and is raising the bar for the quality of stories that appear as "related articles" under links in users' news feeds.
Zuckerberg also noted that misinformation is often driven by financially motivated spam. “We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.”
"Historically, we have relied on our community to help us understand what is fake and what is not,” Zuckerberg said. “Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others – like people sharing links to myth-busting sites such as Snopes – to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it's much less likely to spread."
However, he said, Facebook believes in giving users a voice, “which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.”
Facebook is not the only company attempting to fight misinformation: Google also announced earlier this week that it would restrict advertising on websites it believes to be misleading.
"Moving forward, we will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher's content, or the primary purpose of the web property," Google said in a statement.
However, Rawstory argues that simply exposing individuals to "real" news does not necessarily make them better informed. Rather, preexisting political identities prevent people from accepting information that clashes with what they want to believe. Outrageous claims by President-elect Donald Trump about voter fraud, for example, were extensively fact-checked and reported on. That did not make Trump supporters believe the fact-checkers. Nor did it seem to prepare supporters of Hillary Clinton for an election loss, even though FiveThirtyEight gave her only a 71% chance of winning – far from a sure thing, Rawstory points out.
Perhaps the issue is not exposing users to fake news, but simply exposing them to like-minded others. Facebook makes it very easy to find fellow partisans, reinforcing the political ideas users already hold – regardless of the evidence.
Other websites say the price of letting the big three internet gatekeepers – Google, Facebook and Twitter – conduct large-scale censorship is too high. A widely circulated list of "fake news" websites includes a number of alternative journalism outlets whose reporting is not necessarily false, The Anti-Media cautioned in a report. Cutting off their funding and distribution, it argued, would be tantamount to taking out an entire movement.