TikTok is the only good social media app. At least, it used to be. From 2019 until 2021, the algorithm was magic. It had a way of surfacing videos you didn’t know you were interested in until you saw them. For me, that meant synchronised dances, vegan recipes and elaborate songs created from internet drama. Then came the Amber Heard and Johnny Depp defamation trial.
For weeks on end this summer, my TikTok feed was hijacked by weird videos valorising Depp and offering violently hateful commentary about his ex-wife. It didn’t matter how many times I clicked “not interested” or swiped past without watching; more would appear. Whether it came from real fans, opportunists or bots, the pro-Depp content was relentless.
When the case ended, so did the videos. But my algorithm has never fully recovered. Instead of jokes, I’ve landed on a side of TikTok that’s full of protein powder recommendations and gory surgery videos. It’s like a 1990s lads’ mag come to life.
Clinging to memories of the good times, I’ve spent the past few weeks trying to retrain my algorithm. This, it turns out, is not easy. It’s not just a matter of clicking “like” on the videos that interest you. You have to speed past the ones you don’t like and save the ones that you do.
TikTok’s algorithm is supposed to be particularly insightful because the app harvests so much data. The company, which is owned by China’s ByteDance, looks closely at the length of time a user spends watching a video. It adds that to information it collects about their location, age, gender and search history. There is a lot of content to scrutinise too. Videos can be shot on smartphones and uploaded easily. They are short, so more of them can be watched in one sitting. The fact that all this data is collected by a Chinese-owned company has raised frequent security concerns in the US.
Yet what keeps more than a billion users like me hooked is that TikTok also drops in random videos now and again instead of boring users by showing the same sort of content. In the US, TikTok is watched for longer than YouTube, according to data taken from Android phone users by analytics company App Annie. Rival social media companies have given the app the ultimate compliment by trying to bolt TikTok-like formats on to their own platforms. For Instagram, it is Reels. For YouTube, it’s Shorts.
As I found out, however, that randomness can also be the source of problems. Unlike Twitter or Instagram, which focus attention on content from people you choose to follow, TikTok serves up strangers. That can leave feeds open to unpleasant videos. Last year, the company said that it would try to improve this and stop users being taken down rabbit holes of upsetting content. But I suspect plenty of creators have become more adept at working the algorithm and pushing certain content forward.
The final step I took to improve my TikTok algorithm was to block creators whose videos I was being shown repeatedly and to filter out certain words. It’s labour intensive but it seems to have worked. The gross-out videos have been replaced with remixes of a kid talking about how much he likes sweetcorn.
Still, the TikTok glory days of endless jokes and cleverly inventive videos seem to be coming to an end – on my feed at least. Once celebrities and advertisers moved in, the quality of videos on the app declined. Even after undertaking an algorithm repair job, I may need to find a new way to waste my time.
Elaine Moore is the FT’s deputy Lex editor