In the very near future, Facebook will prompt users to fill out quality surveys asking whether they are familiar with a news source and, if so, whether they believe the source to be credible. Facebook will then use the survey data to prioritize the ranking of news sources in the platform's News Feed. News sources that consistently receive low ratings from users will be penalized, in ways that are not yet known. The change will apply only to US users, although Facebook plans to extend the effort internationally if it proves successful.
"There's too much sensationalism, misinformation and polarization in the world today," Zuckerberg said in a Facebook post. "Social media enables people to spread information faster than ever before, and if we don't specifically tackle these problems, then we end up amplifying them. That's why it's important that [Facebook] News
Feed promotes high quality news that helps build a sense of common ground."
The new Facebook initiative comes just one week after Zuckerberg announced that the social media leader would begin showing fewer unpaid posts from publishers and other brands on its site, so as to prioritize what he termed "meaningful" interactions between friends and family.
"I announced a major change to encourage meaningful social interactions with family and friends over passive consumption. As a result, you'll see less public content, including news, video, and posts from brands. After this change, we expect news to make up roughly four percent of newsfeed — down from roughly five percent today. This is a big change, but news will always be a critical way for people to start conversations on important topics," Zuckerberg posted Friday.
However, according to Facebook executive Adam Mosseri, ranking sources based on user trust is uncharted territory and may not be as easy as it sounds.
"This is an interesting and tricky thing for us to pursue because I don't think we can decide what sources of news are trusted and what are not trusted, the same way I don't think we can't decide what is true and what is not," Mosseri said, cited by the Wall Street Journal.
Some news publishers have expressed concern over Facebook's move.
Neil Patel, publisher of the conservative site Daily Caller, was quoted by the New York Times as saying, "For a company that wields this much power to make these kind of decisions with zero transparency really scares me."
Nicco Mele, director of the Shorenstein Center on Media, Politics and Public Policy at Harvard University, noted that relying on user opinion to make credibility decisions is risky, at best.
"You may end up with reality television," Mele noted.
In his post, Zuckerberg detailed why he is relying on the judgment of the Facebook community.
"The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking."
"We decided that having the community determine which sources are broadly trusted would be most objective," Zuckerberg said.