In light of the US election and the Brexit referendum, Facebook has faced allegations that fake stories spread on its site influenced the election of President-elect Donald Trump.
On Saturday, November 12, Facebook CEO Mark Zuckerberg dismissed accusations that fake news on the social media platform influenced the US election. In a post on his personal Facebook page, he insisted that 99% of news links shared on the site are legitimate.
"After the election, many people are asking whether fake news contributed to the result, and what our responsibility is to prevent fake news from spreading. These are very important questions and I care deeply about getting them right. I want to do my best to explain what we know here.
"Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics.
"Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other," Zuckerberg wrote.
Two days later, Facebook announced that it had updated its advertising policies to make explicit that its ban on misleading content applies to fake news. In other words, Facebook hopes to prevent fake news sites from earning money through Facebook advertising.
Glad Facebook got around to cutting off supply of fake news and Twitter stopped producing alt right hate bots before anything bad happened.
— Josh Marshall (@joshtpm) 16 November 2016
That's two responses to fake news in three days.
Keeping the pressure up, a new YouGov poll has found that 72% of Britons believe that social media companies like Facebook should take responsibility for fake news shared on their sites.
Sorry Mark Zuckerberg, but the vast majority of people DO think Facebook should be removing fake news stories https://t.co/RfauKDXBnD pic.twitter.com/zJmb34bXQu
— YouGov (@YouGov) November 15, 2016
So, why is it so hard for social media to police fake news?
In an August 2013 blog post, Facebook engineer Lars Backstrom explained that it comes down to how Facebook's News Feed ranking algorithm works.
Without Facebook's algorithm, thousands of potential stories a day from friends and people a user follows could appear in each user's feed.
"With so many stories, there is a good chance people would miss something they wanted to see if we displayed a continuous, unranked stream of information," Backstrom said.
Facebook's algorithm boils this down to around 300 stories that it "prioritizes" using factors like how often you interact with a friend, page, or public figure; how many likes, shares, and comments individual posts have received; and how much you have interacted with that kind of post in the past.
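To make that concrete, here is a minimal Python sketch of that kind of engagement-based ranking. Everything in it is an illustrative assumption: the weights, field names, and scoring formula are invented for this example, and Facebook's actual algorithm has never been published in detail.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str    # the friend, page, or public figure who posted it
    kind: str      # e.g. "photo", "video", "link"
    likes: int
    shares: int
    comments: int

def score(post: Post, author_affinity: dict, kind_affinity: dict) -> float:
    # Toy relevance score combining the three signals the article lists.
    # The weights (10, 5, 0.01) are made up for illustration.
    engagement = post.likes + 2 * post.shares + 3 * post.comments
    return (10 * author_affinity.get(post.author, 0.0)   # interaction with the author
            + 5 * kind_affinity.get(post.kind, 0.0)      # taste for this type of post
            + 0.01 * engagement)                         # overall popularity

def rank_feed(candidates: list, author_affinity: dict, kind_affinity: dict,
              limit: int = 300) -> list:
    # Boil thousands of candidate stories down to the ~300 that get shown.
    ranked = sorted(candidates,
                    key=lambda p: score(p, author_affinity, kind_affinity),
                    reverse=True)
    return ranked[:limit]

# Hypothetical example: a viral hoax from a page the user engages with
# outranks an accurate story from a source they rarely interact with.
feed = rank_feed(
    [Post("HoaxPage", "link", likes=90000, shares=40000, comments=25000),
     Post("RealNews", "link", likes=1200, shares=300, comments=150)],
    author_affinity={"HoaxPage": 0.9, "RealNews": 0.1},
    kind_affinity={"link": 0.5},
)
print([p.author for p in feed])  # ['HoaxPage', 'RealNews']
```

Note that nothing in this toy score measures whether a story is true: personal affinity and popularity alone decide visibility, which is precisely the property critics point to.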
The consequences are far-reaching.
Critics of this kind of filtering say that it creates an artificial news bubble: public discourse today is not as diverse as it should be, because influential news sources like Facebook ensure we mainly see what we want to see, and hide what we don't.
This kind of "echo chamber" is significant. Throughout the 2016 US election and this year's UK Brexit referendum campaign, political pundits liked to use words like "polarized" or "more divided than ever."
It's possible that social media's news bubble is contributing to this.
A Pew Research study last year found that 61% of US millennials used Facebook as their most common source for political news. That's 17 points higher than the next most consumed source, CNN, at 44%. It's undeniable that, for millions of people, Facebook is not just about sharing personal baby photos and inane cat videos; it's a legitimate news source.
Among Millennials, Facebook far exceeds any other source for political news http://t.co/rwsOUHEzt1 pic.twitter.com/7Uu5Md30oL
— Joel Pavelski (@joelcifer) June 1, 2015
Mark Zuckerberg has repeatedly insisted that Facebook is doing enough to prevent fake news from being shared, and thereby "legitimized," on his platform.
However, many remain unconvinced.
Moving forward, Facebook could try to emulate Google News with more sophisticated vetting algorithms.
Most social media news consumers (64%) only use one social networking site to get news https://t.co/0Vjf8YiJEy pic.twitter.com/vLRjCDdlmV
— PewResearch Internet (@pewinternet) November 15, 2016
In October, Google News started attaching a "fact check" label to dubious stories and linking to trusted sites that debunked them — all achieved by an algorithm.
Or it could go the human route and, like more traditional news sources, hire journalistic editors to vet stories.
In the meantime, Facebook earns around 80% of its revenue from advertising. So, regardless of the veracity of news stories, the more we share, the more it earns.