Earlier this week, The Intercept published two internal TikTok moderation documents that instructed staff on the platform’s content policies pertaining to users with certain political views, users from certain socio-economic classes, individuals with disabilities and other categories.
According to the first document, which dealt with users’ physical features, moderators were instructed to flag those with an “abnormal body shape, chubby, have obvious beer belly, obese, or too thin.” Also included were elderly users and those with “too many wrinkles,” as well as users with “facial deformities” and “other disabilities.”
Under the “reason” column, which was meant to explain how TikTok higher-ups arrived at the policy, the document noted that in these cases “the character himself/herself is basically the only focus of the video, therefore, if the character’s appearance or the shooting environment is not good, the video will be much less attractive, not worthing to be recommended to new users.”
Moderators were also instructed to be on the lookout for shooting environments that were considered to be “shabby and dilapidated, such as, not limited to: slums, rural fields (rural beautiful natural scenery could be exempted), dilapidated housing, construction sites, etc.” According to the handout, those environments are “not suitable for new users for being less fancy and appealing.”
The second document outlined TikTok’s livestreaming policies and account ban procedures, addressing topics ranging from hate speech and pornography to content that may go against the national interests of mainland China.
The handout noted that streams that attempt “to shame/degrade individuals or groups on certain attributes such as disability, gender, color, sexual orientation, nationality, ethnics, beliefs” could receive a monthlong ban.
According to The Intercept, the platform also has a number of “shadow accounts” run by TikTok employees posing as everyday users in order to influence the spread of certain content.
The social media platform has previously been criticized for allegedly providing the Chinese government with users’ personal information and suppressing political statements on topics such as the Hong Kong protests. However, the newly public documents provide a detailed look at how such content moderation decisions were made.
Additionally, the platform previously acknowledged to the German digital rights blog Netzpolitik that it had implemented a number of policies to protect users believed to be “highly vulnerable to cyberbullying.” One of those policies required moderators to flag videos of users who appeared to have “facial disfigurement,” “autism” or “Down syndrome.”
According to a TikTok spokesperson who spoke with the outlet, the guidelines were implemented at the application’s inception “to counteract bullying on TikTok.”
“This approach was never intended to be a long-term solution and although we had a good intention, we realized that it was not the right approach,” the platform’s spokesperson said.
Josh Gartner, a TikTok spokesperson, told The Intercept that “most of” the livestream policies listed in the documents “are either no longer in use, or in some cases appear to never have been in place.” However, the outlet noted, he did not provide specifics.
As for the rules dealing with users’ looks or disabilities, Gartner said the “policies mentioned appear to be the same or similar” to those published by Netzpolitik. He asserted that these guidelines were representative of TikTok’s efforts “at preventing bullying, but are no longer in place, and were already out of use when The Intercept obtained them.”