Earlier this year, the Wall Street Journal published a much-discussed analysis of TikTok’s recommendation algorithm – specifically, its potential to lead people down worrying rabbit-holes on topics including depression and election conspiracies.
It seems that article was taken seriously at TikTok: the company has just announced plans to “interrupt repetitive patterns” in its #ForYou feed, particularly where those patterns can “inadvertently reinforce a negative personal experience for some viewers”.
The company says that it is “testing ways to avoid recommending a series of similar content – such as around extreme dieting or fitness, sadness, or breakups – to protect against viewing too much of a content category that may be fine as a single video but problematic if viewed in clusters”.
Users will also be able to specify words and hashtags for content they don’t want to see in their #ForYou feed. The changes sound sensible, but as ever, their full impact won’t be known until they roll out – and are analysed by independent observers.