A Wall Street Journal report from October 21 detailed how some Facebook staffers threatened to quit unless they were allowed to delete some of the Republican presidential candidate's posts on the site, particularly posts about Muslims in America and his proposed ban on Muslims entering the country.
They took issue with a December 7, 2015 post about Trump’s then-position that Muslims should be banned completely from entering the United States until the country could "figure out what is going on." The post linked to a page on the candidate’s website saying that "25% of [Muslims] polled agreed that violence against Americans here in the United States is justified as a part of the global jihad" and suggested that Muslims in America want to enact punishments such as beheadings.
"Without looking at the various polling data, it is obvious to anybody the hatred is beyond comprehension," Trump said in the statement.
Zuckerberg himself ruled that it would be "inappropriate to censor the candidate," according to the Wall Street Journal, ultimately quashing the dissent among some staffers who claimed Trump was receiving special treatment because of his position. Though many staffers agreed with the decision, others complained that Trump was being allowed to "directly attack people," in violation of the company's stated policy.
Facebook's community standards on hate speech say the company removes speech that directly attacks people based on race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender or gender identity, or disabilities or diseases. The standards add: "Organizations and people dedicated to promoting hatred against these protected groups are not allowed a presence on Facebook."
On October 21, Facebook published a blog post about changes to its standards. "In the weeks ahead, we’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest — even if they might otherwise violate our standards. We will work with our community and partners to explore exactly how to do this, both through new tools and approaches to enforcement. Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them."
The Guardian suggested that the change brings Facebook closer to being the media company its founder insists it is not. "Two news items on Friday suggest that Facebook is instituting editorial standards analogous to those of a newspaper – and that Zuckerberg has the final say in matters of editorial judgment," the report noted.
USA Today reports that in an emailed statement on the matter, Facebook said, "When we review reports of content that may violate our policies, we take context into consideration. That context can include the value of political discourse."
"Many people are voicing opinions about this particular content and it has become an important part of the conversation around who the next US president will be,” the statement continued. “For those reasons, we are carefully reviewing each report and surrounding context relating to this content on a case by case basis."
Zuckerberg has also recently had to step in to defend high-profile Trump supporter Peter Thiel's continued place on Facebook's board of directors. Earlier in October, Forbes reported that Zuckerberg said Thiel's presence contributes to "diversity."
"I want to quickly address the questions and concerns about Peter Thiel as a board member and Trump supporter," the CEO wrote in a post that seems to have been for Facebook employees. "We can't create a culture that says it cares about diversity and then excludes almost half the country because they back a political candidate. There are many reasons a person might support Trump that do not involve racism, sexism, xenophobia or accepting sexual assault." He continued to say that he understood that there are "strong views" about this presidential election, and that Facebook must be about "giving everyone the power to share our experiences."