Another day, another reason for TikTok’s policy and public relations teams to open their inboxes and grimace.
Friday’s reason was an open letter sent to the company’s head of safety, Eric Han, by a group of charities including Friends of the Earth, Global Witness, Media Matters for America, the NSPCC and the Molly Rose Foundation.
The latter is the suicide-prevention charity set up in the wake of the 2017 death of British teenager Molly Russell and the role social media played in harming her wellbeing, which is directly relevant to the focus of the letter.
“We write to you as concerned researchers, activists and parents regarding the damaging effect of your platform’s content algorithm on the mental health and well-being of children,” it explained. “We believe it is your responsibility to take swift and decisive action to address this issue.”
The charities want TikTok to strengthen its moderation policies around harmful eating-disorder and suicide content; work with mental health experts and organisations on those policies; provide more resources and support to young users who may be struggling with these issues; and report regularly on the steps it is taking on all of this.
TikTok has responded by telling the Guardian that “our community guidelines are clear that we do not allow the promotion, normalisation or glorification of eating disorders, and we have removed content mentioned in this report that violates these rules” – and that it seeks “to engage constructively with partners who have expertise on these complex issues, as we do with NGOs in the US and UK”.
In related news, TikTok has told UK communications regulator Ofcom that it blocks around 180,000 suspected underage (i.e. under 13 years old) accounts in Britain every month.