Internal Documents Show Facebook Knew it Was Used to Incite Violence in Ethiopia

© REUTERS / Erin Scott — Facebook Chairman and CEO Zuckerberg testifies at a House Financial Services Committee hearing in Washington
Documents shared with CNN as part of a US Securities and Exchange Commission investigation show that Facebook knew it was being used to incite violence in Ethiopia and did little to combat it.
According to the documents, Facebook employees repeatedly raised concerns over the company’s failure to stop the proliferation of misinformation and violence-inciting posts in Ethiopia and other at-risk countries. Facebook, through a spokesperson, insists it has done enough to curb violence on the platform, while rights groups and whistleblowers argue its efforts are underfunded and quasi-performative.
Ethiopia has been gripped by a civil war over the past year, and militias have used Facebook to call for violence against ethnic minorities. Politicians have also used the platform to stoke divisiveness. Facebook has come under similar fire for its role in Myanmar’s genocide of the Rohingya.
While Facebook’s efforts to police hate speech in America have been criticized, its efforts in the rest of the world are likely far worse. According to Facebook, the platform has 1.84 billion daily users and 2.8 billion monthly users. Close to three-fourths of all its users are outside of North America and Europe.
Documents indicate the company has failed to expand its staff and local-language resources fast enough to keep pace with its explosive global growth and to catch hate speech and violence-inciting posts. Facebook whistleblower Frances Haugen believes the events in Myanmar and Ethiopia are not one-offs, but rather a sign of things to come.
"I genuinely fear that a huge number of people are going to die in the next five to ten years, or twenty years, because of choices and underfunding," Haugen said.
"The raw version [of Facebook] roaming wild in most of the world doesn't have any of the things that make it kind of palatable in the United States, and I genuinely think there's a lot of lives on the line -- that Myanmar and Ethiopia are like the opening chapter," she added.
A string of whistleblowers and reports have consistently said that Facebook prioritizes profitability over safety. Facebook has long defended its platform by suggesting it is not its duty to police what its users say. However, it has also made investments to help catch and curb problematic speech and bad actors on the platform.
Through a spokesperson, Facebook said that it has invested "$13 billion and have 40,000 people working on the safety and security on our platform, including 15,000 people who review content in more than 70 languages working in more than 20 locations all across the world to support our community.”
To put those numbers in context, Facebook reported revenues of $85.9 billion in 2020 and claims 1.84 billion daily users. The $13 billion investment represents roughly 15% of its 2020 revenue, and it is unclear whether that figure is cumulative over several years or represents new spending.
Facebook suggesting that 15,000 people can adequately review the content of 1.84 billion daily users is incredibly optimistic. That would make each reviewer responsible for the posts of roughly 122,666 people every day. Even if reviewing each person’s posts took only one second, each of the 15,000 reviewers would need about 34 hours per day to cover 1.84 billion daily users.
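The back-of-the-envelope arithmetic above can be verified with a short script; the user and reviewer counts are the figures cited in the article, and the one-second-per-user review time is the article's own illustrative assumption:

```python
# Sanity check of the content-moderation workload figures cited above.
daily_users = 1_840_000_000   # daily active users, per Facebook's reporting
reviewers = 15_000            # content reviewers, per Facebook's spokesperson
seconds_per_user = 1          # the article's illustrative assumption

users_per_reviewer = daily_users / reviewers
hours_per_reviewer_per_day = users_per_reviewer * seconds_per_user / 3600

print(f"{users_per_reviewer:,.0f} users per reviewer per day")   # ~122,667
print(f"{hours_per_reviewer_per_day:.1f} hours of review per day")  # ~34.1
```

The calculation confirms the article's point: even at an implausibly fast one second per user, the review workload exceeds the hours in a day.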
The civil war in Ethiopia was not caused by Facebook, but the platform’s ability to rapidly disseminate dangerous speech and the company’s reluctance to invest in adequate safeguards could have led to the loss of human life.