TikTok has been fined €345m by Ireland’s privacy regulator the Data Protection Commission (DPC) over its processing of children’s personal data in the EU.
The case focused on the default profile settings for children on TikTok, its age-verification system and its transparency information. Children, in this case, are defined as 13-17 year-olds, with the DPC’s investigation covering the period between 31 July and 31 December 2020.
The DPC announced its final decision on Friday, after a dispute resolution process with the European Data Protection Board (EDPB), which made its own announcement that day too. You can also read the full 126-page decision for the details of the investigation.
“Social media companies have a responsibility to avoid presenting choices to users, especially children, in an unfair manner – particularly if that presentation can nudge people into making decisions that violate their privacy interests,” said EDPB chair Anu Talus.
TikTok has responded with a statement. “We respectfully disagree with the decision, particularly the level of the fine imposed,” its spokesperson told Music Ally. “The DPC’s criticisms are focused on features and settings that were in place three years ago, and that we made changes to well before the investigation even began, such as setting all under 16 accounts to private by default.”
The company has also responded to the decision and fine at greater length in a blog post by Elaine Fox, its head of privacy in Europe. It goes into detail about what TikTok has done to address each of the issues raised in the investigation, while also flagging up its efforts to remove accounts held by underage users – that’s under-13s.
(Fox said nearly 17m such accounts were removed globally in the first three months of 2023 alone. This number is published on a quarterly basis on the transparency section of TikTok’s website. The Q1 report offers backdated figures too: TikTok removed 46.4m accounts suspected to be under the age of 13 in 2021, then 78.4m in 2022.)
€345m is a big fine, although for context, reports earlier this year suggested that TikTok’s global revenues in 2022 were not far shy of $10bn. This is not the company’s first brush with regulators over children’s privacy though.
Earlier this year TikTok was fined £12.7m by the UK’s Information Commissioner’s Office (ICO) for misusing children’s data – in that case, focusing on under-13s. Back in 2019, meanwhile, it paid $5.7m in a settlement with the US Federal Trade Commission (FTC) in a case relating to its predecessor app Musical.ly.
However, TikTok is just one of several tech firms to have had its knuckles rapped by regulators this year. Amazon paid $25m in a settlement with the FTC over a case involving children’s privacy and its Alexa voice assistant.
Meanwhile, Meta has copped €390m and €1.2bn fines from Ireland’s DPC over its handling of adults’ data under the EU’s GDPR, and Spotify was fined €5m by Swedish regulator IMY over its own GDPR deficiencies – although here too the focus was adults, not children.
In 2022, Epic Games paid $520m in two settlements with the FTC over Fortnite, including $275m for violating the COPPA children’s privacy regulations. That year also saw Instagram fined €405m by the DPC over its handling of the data of children aged 13-17.
It’s striking how often the response of these companies is a variation on ‘this investigation covered how we used to do things; we’ve already fixed those problems; and it’s not how we operate now. Look at all the marvellous things we’re doing now to keep our users safe!’ Even if fixing those problems and doing those marvellous things have been spurred by the initial announcement of a regulator’s investigation.
All of which is to say: the takeaway from this story isn’t simply ‘TikTok had a children’s privacy problem’. It’s bigger than that: it’s about a range of technology companies falling foul of regulations designed to protect consumers, being pulled up on them a few years later, and promising that they’re already doing better. At least until the next probe.
It’s all relevant to the music industry. Partly because, as music streaming continues to grow, DSPs will increasingly be under the microscope too: Spotify’s €5m fine is an early example of that.
But in the wider social-media landscape, music companies use these platforms to advertise and market; and our stars help to drive fans (teens and under-13s included) to those services too. We’re part of that ecosystem, even if we’re not currently the ones on the hook for privacy violations, so it’s important that we keep abreast of the regulatory issues around it.