Facebook Products Under 'Reputational Review' After Whistleblower, Media Question Safety
23:21 GMT 06.10.2021 (Updated: 18:21 GMT 03.11.2022)
A man passes a Facebook screen
© AP Photo / Martin Meissner
On Tuesday, former Facebook employee Frances Haugen testified before a US Senate panel, urging the creation of a new regulatory framework that demands increased transparency. The whistleblower warned that inaction would allow the profit-focused social media giant to continue obscuring internal research on the known dangers of its products.
In light of Haugen's testimony and related safety concerns, executives at the tech company have reportedly stalled a number of new product rollouts and are now conducting a series of "reputational reviews" to forecast public response to the products and to ensure these developments do not adversely impact children.
More than a dozen individuals have been tasked with conducting the reviews, the Wall Street Journal reported on Wednesday, citing people familiar with the matter.
Facebook has remained a hot topic in mainstream US media since the Wall Street Journal published its 'Facebook Files' series, which examined the social media platform's divisive algorithm, questionable moderation practices and even its possible link to drug cartels and human trafficking groups.
Haugen revealed during a '60 Minutes' appearance on Sunday that she was the individual who provided the WSJ with a number of internal documents from Facebook.
During the interview, the former Facebook employee notably declared that the "version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world."
Haugen expounded on that assertion in her written testimony to the US Senate Subcommittee on Consumer Protection, Product Safety, and Data Security.
Former Facebook employee and whistleblower Frances Haugen testifies during a Senate Committee on Commerce, Science, and Transportation hearing entitled 'Protecting Kids Online: Testimony from a Facebook Whistleblower' on Capitol Hill, in Washington, U.S., October 5, 2021
© REUTERS / Matt McClain
She detailed that, after she joined Facebook in 2019, she witnessed several instances in which Facebook ignored issues highlighted by internal research and chose profit over safety, even in cases involving "vulnerable groups, like teenage girls."
"The result has been a system that amplifies division, extremism, and polarization — and undermining societies around the world. In some cases, this dangerous online talk has led to actual violence that harms and even kills people," Haugen testified. "In other cases, their profit optimizing machine is generating self-harm and self-hate."
Haugen has called on lawmakers to set up an oversight board to address Facebook's lack of transparency, and provide regulators and the public with more insight into the known dangers of the platform and its products.
Zach Vorhies, a former senior software engineer at YouTube and Google turned whistleblower, told Sputnik News that Haugen's testimony will likely result in regulatory changes for Facebook because, unlike in past controversies, mainstream media coverage is behind the story this time.
"It’s my assumption that the media is making a big deal out of this person because they have plans to use her to increase social media regulation in the near future," Vorhies said.
Facebook's algorithms were highlighted by the whistleblower as one of the key issues affecting users. According to a memo obtained by the WSJ, a team of Facebook data scientists sounded the alarm over an algorithm that disproportionately boosted hateful voices.
"Our approach has had unhealthy side effects on important slices of public content, such as politics and news," the team wrote. "This is an increasing liability."
A separate report echoed political concerns about the algorithm's "long-term effects on democracy" in the US.
Facebook CEO Mark Zuckerberg has maintained that Facebook values the safety of its users, as evidenced by the platform's recent implementation of the Meaningful Social Interaction (MSI) change to the Facebook news feed, which has brought "fewer viral videos and more content from friends and family," according to the CEO's lengthy Tuesday statement.
"Is that something a company focused on profits over people would do?" Zuckerberg stated.
Vorhies argued that Facebook's focus on MSI in fact contributes to the platform's lack of live human management and its reliance on artificial intelligence and automated surveillance.
"Engineers that increase MSI are promoted. Changes at Facebook will include cultural changes with how engineers are directed," Vorhies noted. "If changes are made, engineers at Facebook will be managed less by driving improvements in the MSI and more by management."
More Facebook-related hearings are expected to take place in the near future, according to congressional aides cited by the WSJ. Some senators have reportedly reached out to Facebook for further information, while other lawmakers are weighing whether to subpoena internal documents and data.