In a previous version of the TikTok app, vaping, smoking, and tobacco videos were not easy to find. When users searched for terms like “tobacco” and “juul,” they were redirected to a special section with educational information about substance use, followed by a curated feed of videos promoting mental health and solidarity with people who are struggling.
For example, when Fortune searched the term “vape” in that version of the app, TikTok returned a video of a woman telling viewers that jogging is “the best medicine there is” when she is having a “bad mental health day.”
But now, in the current version of TikTok (29.6), searches for “vape” and other tobacco-related terms return videos from TikTok users like "Itshhutsonn."
“Suck on it,” Itshhutsonn says in a video demonstrating his technique for inhaling from a vape pen. “You pull a drag on it and hold it,” he explains, advising viewers to “keep holding it” and “don’t let it out.”
The clip is just one of an endless stream of similar videos on TikTok that Fortune discovered recently, revealing a clear change in the social media company's efforts to censor (or in this case, not censor) potentially harmful content. Searching terms like “vape,” “hookah,” and “cigarettes,” as well as tobacco and nicotine brands including “skoal,” “zyn,” and “juul,” yields a treasure trove of content that normalizes and glorifies tobacco and nicotine use, often featuring young users.
TikTok now says it made a mistake: the tobacco videos should never have been blocked.
A spokesperson for TikTok told Fortune the mental health-related videos should not have been there. A technical problem affecting older versions of the app had mistakenly displayed the videos about jogging and mental health instead of clips like Itshhutsonn's vape pen demonstration.
It might seem like an odd admission, but the company says its policy regarding tobacco has not changed.
So long as user videos don’t involve “the trade” of tobacco, alcohol, or drugs, TikTok will not block them. TikTok will, however, preface the results of these searches with a resource center that links out to content about substance abuse (which users can simply scroll beyond). And for minors, the company says it won’t show any such content at all.
As Fortune discovered in its own experiments, though, TikTok’s treatment of tobacco-related content often strays from its stated policies, and the supposed glitches affecting its handling of tobacco content raise questions about the company’s practices for controlling what users see on its service.
Josh Golin, the executive director of FairPlay, a nonprofit dedicated to children's wellbeing online, believes the change in videos that TikTok showed users searching for tobacco suggests that the company is—or was—aware of the potential implications of tobacco-focused content on the app.
"It's hard to argue that this is all just happening by mistake, right? Because the fact that they have the resource center—there is a clear indication that they understand that this search is sensitive, and does need some kind of moderation," Golin said.
Golin expressed skepticism when told that TikTok ascribed the change to a technical glitch. "It doesn't even matter if it's a technical glitch," he said. "If they don't have the ability to prevent serving harmful content on these searches until they're alerted to it by a reporter, then clearly, they're not doing their due diligence or being responsible in any way."
The findings represent another worrying aspect of the social media app used by 200 million Americans at a time when it is already under increased scrutiny for its privacy practices and its influence on the millions of teenagers who use it every day. Last month, the U.S. Surgeon General issued a report that explicitly calls on policymakers to develop health and safety standards around “substance abuse” content for children on social media.
“It’s deeper than Googling something like ‘tobacco’ because the algorithm feeds you more content. So if kids are searching for ‘tobacco’ on TikTok, they’re constantly being fed similar content,” says Fareedah Shaheed, CEO and founder of children-focused cybersecurity company Sekuva. “It means that your kid is going to be inundated with content that is harmful.”
TikTok's weak safeguards for underage users
TikTok, like Facebook, Twitter, Snap, and other platforms, accepts neither advertising nor user accounts from tobacco companies.
“Our Community Guidelines make clear that we do not allow the trade of alcohol, tobacco products or drugs, nor do we allow content showing young people possessing those products. In order to support our community, we also redirect searches on these topics to in-app resources that provide access to expert advice and support,” a company spokesperson told Fortune.
That said, the platforms have no oversight of influencer brand deals, so it’s plausible that tobacco companies have inked covert agreements with individual creators. Jon-Patrick Allum, an assistant professor of research in population and public health sciences at the University of Southern California’s Keck School of Medicine who studies tobacco content on TikTok, believes this is the case. He suggests that a TikToker who goes by FreezerTarps may be sponsored by the nicotine pouch company Zyn: his channel is almost exclusively dedicated to promoting Zyn, and he sells merchandise that works Zyn into the branding of large American universities. One example is a design for the “University of Zynnesota,” featuring the mascot Goldy Gopher in a burgundy “Z” sweater. (The user did not respond to repeated queries from Fortune about his affiliation with Zyn.)
“When you look at TikTok, and you see the average age of users, and the fact that the majority of content that each individual user is exposed to is discoverable, you really get this idea that a lot of the content that is tobacco-related can be shaping the attitudes and behaviors of young people, especially when it’s normalized,” says Allum.
The TikTok spokesperson said that the platform restricts underage users from viewing content that shows excessive consumption of alcohol, tobacco, or other substances.
However, multiple Fortune employees, Josh Golin of FairPlay, and researchers at the University of Southern California’s Keck School of Medicine found virtually no difference between the tobacco use content shown to accounts belonging to underage users and to adults. “The primary thing this would do is normalize vaping and tobacco use,” says Golin, who used an account belonging to a teen to search tobacco terms on TikTok. “This is something you would want to be careful with—on what your search results are returning to a minor.”
It’s worth noting that after a number of calls and emails from Fortune about the issue, TikTok appears to have suppressed some content glorifying or depicting tobacco use in searches for tobacco-related terms on an account registered to a minor.
TikTok's low margin for error
TikTok is hardly alone among social media platforms whose practices are under the microscope. In early May, the Federal Trade Commission accused Facebook parent company Meta of violating users’ privacy by failing to control communications in its Messenger Kids app, and it plans to bar the company from monetizing children’s data. But with calls to ban TikTok in the U.S. growing louder, the service has a particularly low margin for error.
What TikTok lacks in favor among legislators, it makes up for in popularity among young Americans. In 2022, TikTok was the most-used social media app among children in the United States, who spent an average of 113 minutes on the platform every day, per Statista. The company recently rolled out a feature that notifies underage users when they’ve used the platform for 60 minutes in one day, though Time reports that kids are largely unmoved by the feature.
Similarly, e-cigarettes and other tobacco products are addictive substances that test teens and parents alike. The 2022 National Youth Tobacco Survey found that more than 3 million middle and high school students currently use tobacco, with 16.5% of high school students using tobacco products. Alarmingly, another 2022 study found that 10.5% of youth e-cigarette users consume the products within five minutes of waking up every day.
Those who oppose TikTok in the United States have made the case that the platform’s Chinese owners are filling the minds of American youth with quick dopamine hits from senseless short-form video. As evidence, they point out that the app’s Chinese equivalent, Douyin (TikTok itself is not available in China), surfaces STEM content, limits younger users to 40 minutes of use per day, and is unavailable to them between 10 p.m. and 6 a.m., per CNN.
While TikTok and its executives have dedicated extensive airtime and resources to assuring parents that the company has kids’ best interests at heart, the claim is belied by the company’s spotty track record with tobacco.