On 6 April, Maryland became the first US state to pass a “Kids Code” bill, which aims to prevent tech companies from engaging in predatory data collection on children and from using design features that could cause them harm. Vermont’s legislature held its final hearing on its own Kids Code bill on 11 April, ahead of a full vote. The measures are the latest in a salvo of proposed policies that, in the absence of federal rules, have made state capitols a major battlefield in the war between parents and child advocates, who lament that there are too few protections for minors online, and Silicon Valley tech companies, which protest that the proposed restrictions would hobble both business and free speech.
Known as Age-Appropriate Design Code or Kids Code bills, these measures call for special data safeguards for underage users online as well as blanket prohibitions on children under certain ages using social media. Maryland’s measure passed with unanimous votes in its house and senate.
In all, nine states across the country – among them Maryland, Vermont, Minnesota, Hawaii, Illinois, New Mexico, South Carolina and Nevada – have introduced and are now hashing out bills aimed at improving online child safety. Minnesota’s bill passed out of a house committee in February.
Lawmakers in multiple states have accused lobbyists for tech firms of deception during public hearings. Tech companies have also spent roughly a quarter of a million dollars lobbying against the Maryland bill, to no avail.
Carl Szabo, vice-president and general counsel of the tech trade association NetChoice, spoke against the Maryland bill at a state senate finance committee meeting in mid-2023, introducing himself as a “lifelong Maryland resident, parent, [spouse] of a child therapist”.
Later in the hearing, a Maryland state senator asked: “Who are you, sir? … I don’t believe it was revealed at the introduction of your commentary that you work for NetChoice. All I heard was that you were here testifying as a dad. I didn’t hear you had a direct tie as an employee and representative of big tech.”
For the past two years, technology giants have been lobbying directly in states looking to pass online safety bills. In Maryland alone, tech giants racked up more than $243,000 in lobbying fees in 2023, the year the bill was introduced. Google spent $93,076, Amazon $88,886 and Apple $133,449 that year, according to state disclosure forms.
Amazon, Apple, Google and Meta hired in-state lobbyists in Minnesota and sent employees to lobby directly in 2023. In 2022, the four companies also spent a combined $384,000 on lobbying in Minnesota, the highest total up to that point, according to the Minnesota campaign finance and public disclosure board.
The bills require tech companies to take a series of steps aimed at safeguarding children’s experiences on their websites, including assessing their products’ “data protection impact”. Companies must configure all default privacy settings provided to children by online products to offer a high level of privacy, “unless the covered entity can demonstrate a compelling reason that a different setting is in the best interests of children”. They must also provide privacy information and terms of service in clear language children can understand, and offer responsive tools to help children – or their parents or guardians – exercise their privacy rights and report concerns.
The legislation leaves it to tech companies to determine whether users are underage and does not require verification with documents such as a driver’s license. Companies could infer a user’s age from the data profiles they already hold, or rely on self-declaration, in which users enter their birth date – a practice known as “age-gating”.
Critics argue the process of tech companies guessing a child’s age may lead to privacy invasions.
“Generally, this is how it will work: to determine whether a user in a state is under a specific age and whether the adult verifying a minor over that designated age is truly that child’s parent or guardian, online services will need to conduct identity verification,” said a spokesperson for NetChoice.
The bills’ supporters argue that users of social media should not be required to upload identity documents since the companies already know their age.
“They’ve collected so many data points on users that they are advertising to kids because they know the user is a kid,” said a spokesperson for the advocacy group the Tech Oversight Project. “Social media companies’ business models are based on knowing who their users are.”
NetChoice – and by extension, the tech industry – has several alternative proposals for improving child safety online. They include digital literacy and safety education in the classroom for children to form “an understanding of healthy online practices in a classroom environment to better prepare them for modern challenges”.
At a February hearing on a proposed online child safety bill, NetChoice’s director Amy Bos argued that parental controls offered by social media companies, and interventions such as parents taking away their children’s phones after too much screen time, were better courses of action than regulation. Asking parents to opt in to protecting their children rarely achieves wide adoption, though: Snapchat and Discord told the US Senate in February that fewer than 1% of their under-18 users had parents who monitored their online behavior using parental controls.
Bos also ardently argued that the proposed bill breached first amendment rights. Her testimony prompted a Vermont state senator to ask: “You said, ‘We represent eBay and Etsy.’ Why would you mention those before TikTok and X in relation to a bill about social media platforms and teenagers?”
NetChoice is also promoting the bipartisan Invest in Child Safety Act, which it says would give “cops the needed resources to put predators behind bars”, highlighting that less than 1% of reported child sexual abuse material (CSAM) violations are investigated by law enforcement because of a lack of resources and capacity.
However, critics of NetChoice’s stance argue that more needs to be done proactively to protect children from harm in the first place, and that tech companies should take responsibility for ensuring safety rather than placing it on the shoulders of parents and children.
“Big Tech and NetChoice are mistaken if they think they’re still fooling anybody with this ‘look there not here’ act,” said Sacha Haworth, executive director of the Tech Oversight Project. “The latest list of alleged ‘solutions’ they propose is just another feint to avoid any responsibility and kick the can down the road while continuing to profit off our kids.”
All the state bills have faced opposition from tech companies, whether through strongly worded statements or in-person lobbying by the firms’ representatives.
Other tech lobbyists needed similar prompting to disclose their tech patrons while testifying at hearings on child safety bills – if they notified legislators at all. A registered Amazon lobbyist who has spoken at two hearings on New Mexico’s version of the Kids Code bill said he represented the Albuquerque Hispano Chamber of Commerce and the New Mexico Hospitality Association; he never mentioned the e-commerce giant. At the same Vermont hearing where Bos’s motives and affiliations were questioned, a representative of another tech trade group did not disclose that his organization is backed by Meta – arguably the company that would be most affected by the bill’s stipulations.
The bills’ supporters say these speakers deliberately conceal who they work for to make their messaging more persuasive to lawmakers.
“We see a clear and accelerating pattern of deception in anti-Kids Code lobbying,” said Haworth of the Tech Oversight Project, which supports the bills. “Big tech companies that profit billions a year off kids refuse to face outraged citizens and bereaved parents themselves in all these states, instead sending front-group lobbyists in their place to oppose this legislation.”
NetChoice denied the accusations. In a statement, a spokesperson for the group said: “We are a technology trade association. The claim that we are trying to conceal our affiliation with the tech industry is ludicrous.”
These state-level bills follow attempts in California to introduce regulations aimed at protecting children’s privacy online. The California Age-Appropriate Design Code Act is based on similar legislation from the UK that became law in October. The California law, however, was blocked from taking effect in late 2023 by a federal judge, who granted NetChoice a preliminary injunction, citing potential threats to the first amendment. Rights groups such as the American Civil Liberties Union also opposed the measure. Supporters in other states say they have learned from the fight in California, pointing out that language in their own bills has been updated to address the concerns raised in the Golden State.
The online safety bills come amid increasing scrutiny of Meta’s products for their alleged roles in facilitating harm against children. Mark Zuckerberg, its CEO, was told he had “blood on his hands” at a January US Senate judiciary committee hearing on digital sexual exploitation. Zuckerberg turned and apologized to a group of assembled parents. In December, the New Mexico attorney general’s office filed a lawsuit against Meta for allegedly allowing its platforms to become a marketplace for child predators. The suit follows a 2023 Guardian investigation that revealed how child traffickers were using Meta platforms, including Instagram, to buy and sell children into sexual exploitation.
“In time, as Meta’s scandals have piled up, their brand has become toxic to public policy debates,” said Jason Kint, CEO of Digital Content Next, a trade association focused on the digital content industry. “NetChoice leading with Apple, but then burying that Meta and TikTok are members in a hearing focused on social media harms sort of says it all.”
A Meta spokesperson said the company wants teens to have age-appropriate experiences online and has developed more than 30 child safety tools.
“We support clear, consistent legislation that makes it simple for parents to manage their teens’ online experiences,” said the spokesperson. “While some laws align with solutions we support, we have been open about our concerns over state legislation that holds apps to different standards in different states. Instead, parents should approve their teen’s app downloads, and we support legislation that requires app stores to get parents’ approval whenever their teens under 16 download apps.”