Google-owned YouTube is not only failing to live up to the company's stated intention of limiting the spread of hateful diatribes and misinformation. The world’s second-most visited website is also complicit in actively pushing such “disturbing” video content via its recommendation algorithm, according to a report by the Mozilla Foundation published on 7 July.
From conspiracy theories about the 9/11 terror attacks on the US and the ongoing coronavirus pandemic, to promotion of so-called “white supremacy” and inappropriate “children's” cartoons, YouTube’s algorithm was implicated in recommending some 71 percent of the content flagged as objectionable, according to the research.
The nonprofit found that a majority of problematic videos were recommended by the video-sharing platform's algorithm. Social media platforms like YouTube have long resisted calls to share information about their algorithms, citing user privacy.
However, to address the growing body of evidence suggesting that social media recommendation algorithms amplify the spread of misinformation and violent content, in 2020 Mozilla invited users to take part in a crowdsourced study.
The nonprofit launched RegretsReporter, a browser extension and research project, to probe the extent to which YouTube’s algorithm can drive users toward more extreme content.
A total of 37,380 YouTube users were enlisted, volunteering data about the so-called “regrettable experiences” they had had on the platform and flagging 3,362 videos from 91 countries between July 2020 and May 2021.
According to the report, after analysing the data, Mozilla arrived at three main findings.
Firstly, the most frequently reported “regret” categories were misinformation, violent or graphic content, hate speech, and spam/scams.
Secondly, the recommendation algorithm itself was singled out as the principal problem, with over 70 percent of the reports flagging videos that had been recommended to volunteers by YouTube’s automatic recommendation system.
Finally, non-English speakers were deemed the most affected, as the rate of YouTube Regrets was 60 percent higher in countries that do not use English as a primary language. Countries such as Brazil, Germany and France ranked particularly high on the list, the study showed.
Covid-19 pandemic-related content was especially prevalent among the videos flagged in non-English languages.
“YouTube’s algorithm is working in amplifying some really harmful content and is putting people down disturbing pathways. It really shows that its recommendation algorithm is not even functioning to support its own platform policies, it's actually going off the rails,” Brandi Geurkink, the foundation's senior manager of advocacy, was cited by Politico as saying.
Going further than merely “diagnosing” the problem, Mozilla offered recommendations as guidance for YouTube and other internet platforms. These included enabling researchers to audit recommendation systems, with platforms urged to publish information about how those systems work.
Furthermore, the report calls on policymakers to require YouTube to create tools enabling independent scrutiny of its recommendation algorithm.
In response to the report, a YouTube spokesperson said:
"The goal of our recommendation system is to connect viewers with content they love and on any given day, more than 200 million videos are recommended on the homepage alone." It was added that YouTube "constantly" seeks to improve users' experience and has launched 30 different changes to reduce recommendations of harmful content in the last year.