There may be more trouble ahead (or at least more music to be faced and regulator tunes to be danced to) for TikTok and some of its social-video peers.
The UK’s media and telecoms regulator Ofcom has announced new guidance aimed at providing “better protections from harmful online videos”. The emphasis is on videos that are hateful, violent or disturbing, or racist, with Ofcom also publishing research suggesting that this material has been encountered by 32%, 26% and 21% of users of these platforms respectively.
Ofcom wants the services to provide clear terms and conditions on what’s not acceptable to upload; have easy reporting and complaint processes; and have “robust age verification in place” if they are hosting pornographic content.
“Ensuring an age-appropriate experience on platforms popular with under-18s” is also on Ofcom’s to-do list. Platforms that fail its tests could be fined up to £250k, or even suspended entirely in the UK.
This only affects platforms established (as businesses) in the UK, though: TikTok, Snapchat, Twitch and OnlyFans fall into that category, but Facebook and YouTube do not – they are regulated in Ireland instead.