The Department of Justice and the Federal Trade Commission sued social media platform TikTok and its parent company, ByteDance, on Friday, accusing the wildly popular app of collecting and retaining personal information about millions of children under the age of 13.
According to the complaint, TikTok has knowingly let kids create accounts on the social media platform, make and share videos, and communicate and interact with adults. Moreover, the DOJ said, TikTok didn’t delete children’s accounts even after parents asked it to. The government is seeking to “put an end to TikTok’s unlawful massive-scale invasions of children’s privacy,” the complaint states.
“TikTok knowingly and repeatedly violated kids’ privacy, threatening the safety of millions of children across the country,” said FTC Chair Lina Khan. “The FTC will continue to use the full scope of its authorities to protect children online—especially as firms deploy increasingly sophisticated digital tools to surveil kids and profit from their data.”
Authorities said the Children’s Online Privacy Protection Act (COPPA) bars website and app operators from collecting information such as email addresses and preferences from children younger than 13 unless they have express permission from parents. Congress enacted COPPA in 1998 to keep websites from retaining children’s personal information, and the law applies to any website or online service directed at children, the complaint states. The U.S. sued TikTok’s predecessor, Musical.ly, in 2019 for violating COPPA, fining it $5.7 million and ordering the company to destroy the personal information of users under 13. ByteDance bought Musical.ly in 2018 and merged it with TikTok, but the combined company still had to comply with the 2019 court order, the DOJ said.
“The Department is deeply concerned that TikTok has continued to collect and retain children’s personal information despite a court order barring such conduct,” said Acting Associate Attorney General Benjamin C. Mizer. “With this action, the Department seeks to ensure that TikTok honors its obligation to protect children’s privacy rights and parents’ efforts to protect their children.”
In a statement to Fortune, a TikTok spokesperson denied the accusations.
“We disagree with these allegations, many of which relate to past events and practices that are factually inaccurate or have been addressed,” said the spokesperson in an email. “We are proud of our efforts to protect children, and we will continue to update and improve the platform. To that end, we offer age-appropriate experiences with stringent safeguards, proactively remove suspected underage users, and have voluntarily launched features such as default screen-time limits, Family Pairing, and additional privacy protections for minors.”
ByteDance is estimated to be worth $225 billion, and as of 2024 more than 170 million people in the U.S. were using TikTok, the complaint states. In 2022, 61% of teenagers who reported using TikTok were 13 or 14 years old.
The FTC noted that TikTok’s own employees raised red flags about internal policies. When children’s accounts weren’t deleted, one compliance employee warned, “We can get in trouble … because of COPPA.”
The complaint states that these communications, from a September 2021 online chat, indicate an awareness of a problem that may have persisted since as far back as July 2020. Another employee in the same chat said she encountered the issue frequently: “I run across usually like 3-4 accounts [like that] a day.”