Facebook Allowed COVID-19 Misinformation to Flourish on Its Platforms, Research Says

Meta, the company previously known as Facebook, has recently weathered a storm of bad press stemming from a trove of internal documents leaked by former employee Frances Haugen, which show that the social media company failed to shield its users from harmful online content.
Sputnik
Facebook and Instagram allowed scepticism about coronavirus vaccines and misinformation about the disease to spread across their platforms, with misleading content circulating via accounts that have collectively gained 370,000 followers over the past year, according to new research by credibility assessment website NewsGuard, cited by the Guardian.
Among such content are posts in Facebook groups that claimed that children are being “murdered by the experimental jab they’re being pressured to take” and Instagram accounts that promoted a documentary by Andrew Wakefield, an embattled doctor who was struck off the UK medical register after he suggested that the MMR vaccine was linked to autism.
In total, NewsGuard said it has monitored some 20 sites since September last year, which gained a total of 372,670 followers over that period. The monitor's report also included references to prominent anti-vaxxers and advocates of alternative medicine such as Robert F Kennedy Jr and Joseph Mercola.
Kennedy was banned from Instagram, but his Facebook page, along with Mercola's Instagram account, is said to have gained more than 140,000 followers since February.
The full report, according to The Guardian, has been sent to the World Health Organization.
Alex Cadier, UK managing director for NewsGuard, condemned Facebook and Instagram for failing to protect their users from harmful and misleading content.
“The company’s engagement-at-all-costs mantra means that viral and divisive sources of misinformation continue to flourish, despite warnings from NewsGuard and the clear danger posed to users," the Guardian quoted him as saying. "Facebook gave itself a new name but their promotion of misinformation remains the same.”
A spokesperson for Meta said that the company was taking action against misinformation while promoting vaccination. Meta said it has "removed more than 20m pieces of harmful misinformation" and "banned more than 3,000 accounts, pages and groups for repeatedly breaking our rules."
"We’re also labelling all posts about the vaccines with accurate information and worked with independent fact-checkers to mark 190m posts as false", the spokesperson said.
Fending off accusations that it had failed to tackle fake news and harmful content on its platform, Facebook announced its rebranding in late October, changing the company's name to Meta. According to Mark Zuckerberg, the company's chief executive, the rebrand reflects the company's focus on the "metaverse", not just the social media platform.
Many observers, however, saw the rebrand as an attempt to repair the social media platform's reputation after it was undermined by 'The Facebook Papers', a series of stories by major news organisations based on internal documents leaked by whistleblower Frances Haugen, who used to work for Facebook as a data analyst.
'The Facebook Papers' shed light on how the social media platform fostered misinformation and harmful content. Zuckerberg, however, dismissed the stories as a coordinated attempt by news organisations and whistleblowers to create "a false image" of his company.
On 29 October, Meta announced that it would increase its support for COVID-19 vaccination efforts for children on its apps.