Meta called on US lawmakers on Wednesday to regulate Google and Apple’s app stores to better protect children, the same day that the Senate began investigating Meta’s failures to shield children using its platforms.
In a blogpost titled "Parenting in a Digital World Is Hard. Congress Can Make It Easier", Antigone Davis, Meta's global head of safety, called for federal legislation that would require app stores to notify parents whenever a child between the ages of 13 and 16 downloads an app, and to seek the parents' approval. Children under 13 are already prohibited from creating accounts and downloading apps without a parent's go-ahead.
Meta's blogpost does not mention Google or Apple by name, but the two companies operate the world's biggest smartphone app stores: the Play Store for Android and the App Store for the iPhone's iOS. Any legislation regulating children's app downloads would target both.
Davis argued that there was "a better way" to govern smartphone and internet usage than laws that require a parental thumbs-up for a child to create a social media account. Utah, for example, began requiring in March that parents of people under 18 consent to their use of TikTok, Instagram, Facebook and other apps, in order to preserve, in the words of the state's governor, Spencer Cox, "the mental health of our youth".
Davis’s call was published the same day that the Senate judiciary committee sent a letter to Mark Zuckerberg, Meta’s CEO, requesting that he “provide documents related to senior executives’ knowledge of the mental and physical health harms associated with its platforms, including Facebook and Instagram”. The letter asks for the documents by 30 November. Neither Google nor Apple provided statements by press time.
After the publication of this story, Meta issued a statement: "We've always said that we support internet regulation, particularly when it comes to young people. That said, we're concerned about what is amounting to a patchwork of divergent laws across a variety of US states. Laws that hold different apps to different standards in different states will leave teens with inconsistent online experiences." The company maintains it has supported legislation establishing "clear industry standards" for parental supervision and age verification since 2021.
The National Society for the Prevention of Cruelty to Children blasted Meta in a response: “It is pleasing to see that regulation of social media is already focusing the minds of top tech executives on protecting children. But Meta has sat on its hands while knowing the harm their platforms can cause children and is now seeking to pass the buck when they should be getting their own house in order.”
The Senate committee's preliminary investigation comes a week after a former high-level Meta employee testified before its members about the harm Instagram can do to children, his own daughter among them. He said Meta's leaders ignored his concerns when he raised them internally.
Arturo Bejar, a former engineering director at Instagram, told the panel of senators: “I appear before you today as a dad with first-hand experience of a child who received unwanted sexual advances on Instagram.”
The same subject was a central focus of the testimony of another Meta whistleblower, Frances Haugen, who leaked internal documents to the US government showing how company executives had ignored warnings about the detrimental effects of social media use on teenage girls. She testified before Congress in October 2021.