Meta Platforms has opposed the U.S. Federal Trade Commission's (FTC) attempt to revise its 2020 privacy settlement, saying it voluntarily reported two issues in its children's chat application to the agency and received no complaints from parents.
In a filing on Thursday, Facebook's parent company said it had disclosed the bugs in its Messenger Kids app in July 2019 and had allocated $5.5 billion to its privacy program and related initiatives.
The case relates to Facebook's 2020 privacy settlement, which prohibits the exploitation of minors' data and extends restrictions on facial recognition technology.
The FTC is seeking to strengthen the terms of the settlement, alleging that Meta misled parents about protections for children.
A US appeals court ruled in March that Meta cannot prevent the FTC from reopening an investigation into Facebook's privacy practices despite its arguments that it has already paid a $5 billion fine and implemented various safeguards.
Facebook introduced Messenger Kids in 2017 for children under 13, who are below the minimum age required for a Facebook account. The app includes additional privacy measures, such as parental approval for each friend request.
Meta's filing on Thursday said fewer than 2,000 children were affected by the bug, which the company said it fixed within 24 hours, according to Bloomberg.
While fixing the chat-related bug, Meta employees found a similar issue that allowed users to add unapproved contacts to video calls, the company said in its statement. That second bug persisted for three months on Apple devices and a few days on Android devices before being addressed, according to the filing.
Meta, which owns WhatsApp, Instagram and Facebook, has also launched a series of legal challenges against the FTC, including a federal lawsuit arguing that the agency's structure and procedures are unconstitutional, in an attempt to stop the reopening of its 2020 privacy settlement.
The company previously said the FTC's proposed changes would "curtail Meta's development of new products, superintend Meta's corporate governance, and impair Meta's ability to serve its users and advertisers."
Despite resistance from the FTC, Meta began developing a version of Instagram tailored for children in 2021. The project was halted, however, after substantial criticism from child advocates and media reports highlighting the potentially harmful effects of the company's apps on teenagers' mental health.