According to documents obtained by The New York Times, roughly 1,400 pages of guidelines are used to direct more than 7,500 moderators as they approve or reject content. However, not all of these pages are accurate or up to date.
Moderators were led to remove fund-raising appeals for volcano victims in Indonesia because a co-sponsor of the drive was on Facebook's internal list of banned groups. In Myanmar, a paperwork error allowed an extremist group accused of fomenting genocide to stay on the platform for months. In India, moderators were mistakenly told to take down comments critical of religion.
The company confirmed the authenticity of the documents, according to the Times, but said the rulebook is intended for training. The employees, speaking to the paper on condition of anonymity, said the rules are in fact applied in their daily work.
Another problem is that the guideline slides are written for English-speaking moderators who rely on Google Translate, with no guidance on local linguistic or cultural context. One moderator said there is a rule to approve any post if it is written in a language that no one available can read.
Monika Bickert, Facebook’s head of global policy management, said that the primary goal was to prevent harm and that to a great extent, the company had been successful.
“We have billions of posts every day, we’re identifying more and more potential violations using our technical systems,” Bickert said. “At that scale, even if you’re 99 percent accurate, you’re going to have a lot of mistakes.”
Facebook has been accused repeatedly throughout the year of a lack of transparency in its handling of users' data, including allowing third parties access to the personal data of tens of millions of people without their consent.