Social app TikTok is under a *lot* of scrutiny in 2019 over the younger end of its community. In February, the company agreed to pay $5.7m in a settlement with the US Federal Trade Commission over violations of children’s privacy laws, before blocking access to its app for children aged under 13 in the US.
Now the company is adding safeguards for users who are older than 13 but not yet adults. It is upgrading its app’s ‘Restricted Mode’, which, when activated, attempts to ensure that “inappropriate content for minors will be filtered from their For You feed”. The feature is enabled via a password that remains valid for 30 days.
TikTok says that its filter is “powered by AI” – so it’s sure to be watched closely by regulators and children’s-safety campaigners, who are well aware that past filters of this kind (YouTube Kids, for example) have come in for criticism when inappropriate content slipped through.
In separate news, TikTok has also added some new ‘Screen Time Management’ features. Users were already able to set a limit of two hours a day within the app, but now they can choose from a wider range of limits: 40, 60, 90 and 120 minutes per day. Again, it’s password-protected, so parents can (if their child lets them, of course) make sure the limit holds.