TikTok pays $5.7m in FTC settlement over children’s privacy


Early on in our coverage of the social-video app Musical.ly, we warned that its large audience of pre-teens represented a big risk for the company, in terms of staying on the right side of children’s privacy legislation like COPPA in the US. Yesterday, that particular hammer dropped – on Musical.ly, which TikTok’s parent company Bytedance acquired before merging it with TikTok last year.

The company has agreed to pay $5.7m to settle allegations by the US Federal Trade Commission (FTC) that it illegally collected personal information from children – “the largest civil penalty ever obtained by the Commission in a children’s privacy case” according to the FTC. The app’s operators must now remove all videos made by children under the age of 13, and comply with the COPPA rules going forward.

(Here’s our usual reminder that Musical.ly the app had no relation to us, Music Ally the music-industry knowledge company.)

The FTC was blunt in its verdict. “The operators of Musical.ly – now known as TikTok – knew many children were using the app but they still failed to seek parental consent before collecting names, email addresses, and other personal information from users under the age of 13,” said FTC chairman Joe Simons. “This record penalty should be a reminder to all online services and websites that target children: We take enforcement of COPPA very seriously, and we will not tolerate companies that flagrantly ignore the law.”

Two FTC commissioners, Rohit Chopra and Rebecca Kelly Slaughter, added more criticism of Musical.ly in a separate statement about its “disturbing” data collection. “In our view, these practices reflected the company’s willingness to pursue growth even at the expense of endangering children,” they said.

So what will TikTok do now? Besides launching a new series of videos on user safety, “we’ve now implemented changes to accommodate younger US users in a limited, separate app experience that introduces additional safety and privacy protections designed specifically for this audience,” announced the company in its own blog post. “The new environment for younger users does not permit the sharing of personal information, and it puts extensive limitations on content and user interaction.”

The new features went live yesterday, and will see children unable to share videos of their own; comment on other people’s videos; message other users; or maintain their own profile or followers. Instead, they’ll be able to watch “curated content” and “experiment” with TikTok’s creative features. We suspect this won’t go down well with the average 11-year-old, so TikTok’s challenge will be how to tackle underage users trying to get around the restrictions by accessing the full app.

There was some good news for TikTok yesterday. App analytics company Sensor Tower claims that “the app has just crossed the one billion mark for worldwide installs on the App Store and Google Play, including its lite versions and regional variations” – and this doesn’t include installs from non-Google Android app stores in China.

“Approximately 663 million of these installs occurred in 2018. To put this into perspective, the Facebook app was installed an estimated 711 million times last year and Instagram saw about 444 million new downloads,” added Sensor Tower. It claims that 25% of TikTok’s downloads to date have come from India – and last month, 43% of its new users were from that country, which is rolling out its own legislative crackdown on these kinds of social apps.

TikTok is huge; it has some creative features and culture; and it’s working with artists and labels in some very interesting ways. But dealing with its challenge around children’s privacy (and safety on the platform – just this week, UK children’s charity the NSPCC suggested this app category is a “hunting ground” for adult abusers) is absolutely essential. The FTC settlement also puts any social-video app with a mixed-age audience on notice that these issues can’t be shoved under the carpet.

Written by: Stuart Dredge