Radio Sputnik discussed with Mona Elswah, a research assistant at the prestigious Oxford University, a recent initiative by Google to employ an estimated 10,000 moderators to identify and remove inappropriate video content from YouTube.
Google will employ thousands of people to weed out extremist, violent and pornographic content on its video sharing website YouTube that is said to endanger children, YouTube CEO Susan Wojcicki wrote in an article published by the Daily Telegraph.
Radio Sputnik discussed the initiative with Elswah, who is currently working at the storied university's Computational Propaganda Project.
"It's a very important step for YouTube to do it," she told Sputnik. "The amount of videos on Youtube is enormous. If one person would decide to watch every video on Youtube, it would take 3,600 years to watch all of them."
And that figure presumably does not account for sleeping, eating and whatever other activities the hypothetical Methuselah would need to complete the task.
According to Elswah, machine learning tools are unfit for the complexity of the job.
"How would a machine understand if a cartoon character is violent or extremist," she speculated. "That's why we need human reviewers to update the machine learning algorithms."
Elswah expects the subtlety of content to increase, making the search for inappropriate video troublesome even for human moderators.
It is very difficult, according to the researcher, to evaluate the scale of the problem, as users easily come across videos whose titles sound innocuous but whose content can be alarming, making it hard for children or parents to know what they're getting into.
Making people register with an authentic user ID is not a solution either, Elswah said, as the move would limit freedom of expression.
"It doesn't work like that. It would just kill every essence of the internet," she asserted.
When asked about balancing inappropriate content against freedom of expression, Elswah argued that the tech company must develop stricter and more elaborate rules than those currently in place.
YouTube must educate users, she added. Google must "teach children and even adults that they actually affect lives with video they are posting," she said.
Live streaming is even more complicated, as many streams have depicted crime, rape, suicide and other horrific acts.
"You can't expect what that person is going to stream," she acknowledged.
Elswah suggested there must be appropriate penalties in place to deter users from posting inappropriate content. The research assistant does not propose legal punishment, but argues that the threat of account deactivation would deter most users, causing them to "think twice" before streaming something inappropriate.