Meta estimates about 100,000 children using Facebook and Instagram receive online sexual harassment each day, including “pictures of adult genitalia”, according to internal company documents made public late Wednesday.
The unsealed legal filing includes several allegations against the company, based on internal Meta presentations and employee communications obtained by the New Mexico attorney general’s office. The documents describe an incident in 2020 when the 12-year-old daughter of an Apple executive was solicited via IG Direct, Instagram’s messaging product.
“This is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store,” a Meta employee fretted, according to the documents. In testimony to the US Congress late last year, a senior Meta employee described how his own daughter had been solicited via Instagram; his efforts to fix the problem were ignored, he said.
The filing is the latest development in a lawsuit brought by the New Mexico attorney general’s office on 5 December, which alleges that Meta’s social networks have become marketplaces for child predators. Raúl Torrez, the state’s attorney general, has accused Meta of enabling adults to find, message and groom children. The company has denied the suit’s claims, saying it “mischaracterizes our work using selective quotes and cherry-picked documents”.
Meta issued a statement in response to Wednesday’s filing: “We want teens to have safe, age-appropriate experiences online, and we have over 30 tools to support them and their parents. We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.”
A 2021 internal presentation on child safety was also referenced in the lawsuit. According to the suit, one slide stated that Meta is “underinvested in minor sexualization on IG, notable on sexualized comments on content posted by minors. Not only is this a terrible experience for creators and bystanders, it’s also a vector for bad actors to identify and connect with one another.”
The complaint also highlights Meta employees’ concerns over child safety. In a July 2020 internal Meta chat, one employee asked: “What specifically are we doing for child grooming (something I just heard about that is happening a lot on TikTok)?” According to the complaint, he received a response: “Somewhere between zero and negligible.”
Meta’s statement also says the company has taken “significant steps to prevent teens from experiencing unwanted contact, especially from adults”.
The New Mexico lawsuit follows a Guardian investigation in April that uncovered how Meta is failing to detect or report the use of its platforms for child trafficking. The investigation also revealed how Messenger, Facebook’s private messaging service, is used by traffickers to communicate while buying and selling children.
Meta employees discussed the use of Messenger “to coordinate trafficking activities”, noting that “every human exploitation stage (recruitment, coordination, exploitation) is represented on our platform”, according to documents included in the suit.
Yet an internal 2017 email describes executive opposition to scanning Facebook Messenger for “harmful content” because it would place the service “at a competitive disadvantage vs other apps who might offer more privacy”, the lawsuit states.
In December, Meta received widespread criticism for rolling out end-to-end encryption for messages sent on Facebook and via Messenger. Encryption hides the contents of a message from everyone except the sender and the intended recipient by converting text and images into unreadable ciphertext that is unscrambled only on receipt. Child safety experts, policymakers and law enforcement have argued that encryption obstructs efforts to rescue child sex-trafficking victims and to prosecute predators. Privacy advocates, meanwhile, praised the decision for shielding users from surveillance by governments and law enforcement.
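For readers curious about the mechanics, the sketch below shows the basic public-key pattern behind end-to-end encrypted messaging, using the open-source PyNaCl library. The keys and message are invented for illustration; this is a minimal sketch of the general technique, not Meta’s actual protocol, which is considerably more elaborate.

```python
# Minimal sketch of end-to-end encryption using PyNaCl (libsodium bindings).
# Illustrative only: keys and message are made up for this example.
from nacl.public import PrivateKey, Box

# Each party holds a private key; only the public halves are ever shared.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their own private key and the recipient's public key.
ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"meet at noon")

# In transit, the platform sees only unreadable ciphertext.
print(ciphertext.hex())

# Only the recipient's private key (paired with the sender's public key)
# can unscramble the message on receipt.
plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
print(plaintext)  # b'meet at noon'
```

Because the platform operator never holds the private keys, it cannot read the ciphertext it relays, which is the property at the centre of the dispute between privacy advocates and child safety experts.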