After all the recent controversies around data privacy and censorship, TikTok’s leadership won’t have enjoyed seeing this intro to a story on news site Netzpolitik. “Leaked documents reveal how TikTok hid videos of people with disabilities. Queer and fat users were also pushed out of view.”
The story concerns a list of ‘special users’ whose posts’ reach was capped by moderators. That sounds very bad, even though TikTok’s argument is that it was trying to protect these users rather than hide them away.
TikTok’s moderation guidelines from the time this policy was in place show, according to Netzpolitik, that reach should be limited for users who were “susceptible to harassment or cyberbullying based on their physical or mental condition”. This included restricting videos from people with disabilities to their own country. Meanwhile, users whom a moderator decided (sometimes in as little as 30 seconds) had autism, Down syndrome or ‘facial disfigurement’ could be placed on an ‘Auto_R’ list, ensuring their videos were no longer recommended to other users once they had reached a few thousand views.
TikTok’s response to the article was to stress that the policy had been adopted “at the beginning” of the app’s life. “This approach was never intended to be a long-term solution and although we had a good intention, we realised that it was not the right approach,” said a spokesperson.