Australia’s online safety regulator has given porn websites, social media companies, search engines and others in the tech industry six months to come up with rules to prevent children from accessing adult content.
Using powers under the Online Safety Act, the eSafety commissioner will require industry bodies to come up with a new code to prevent children from seeing content rated R18+ and above on their services or devices.
The codes will cover app stores, apps, websites including porn sites, search engines, social media services, hosting services, internet service providers, instant messaging, multiplayer gaming and online dating services.
As part of the code, companies will be expected to make “reasonable efforts” to check users’ age, to have opt-out default safety measures such as safe search and parental controls, and to allow users to filter or blur unwanted sexual content.
The move comes two months after the Albanese government announced a $6.5m trial of age assurance technology in May’s budget.
The rules will be designed by groups including the Digital Industry Group, the Communications Alliance, Interactive Games and Entertainment Association and the Australian Mobile Telecommunications Association.
The eSafety commissioner, Julie Inman Grant, said that requiring technology companies in different sectors to work on the code would mean that “there isn’t a single point of failure”.
“The larger porn sites actually have fairly robust age verification provisions in place [but] there are going to be rogue porn sites all over the internet that are never going to comply,” she said.
Inman Grant said age checks would happen via safeguards on smartphones, and through app stores, before children could get to those sites.
While some of the larger adult sites have methods of age assurance in place, some have bristled at regulator attempts abroad to enforce age verification. Pornhub blocked access to users in Texas after the state government passed a law requiring companies to verify the ages of users.
On Friday Meta told a parliamentary committee that it believed that age checks were best left to app stores run by Apple and Google, which are best placed to protect user privacy.
A spokesperson for Meta said its platforms already prohibit pornography, sexually explicit content and links to external pornographic sites.
“We have technology that proactively finds and removes such graphic content. Additionally, we restrict the recommendation of sexually suggestive content in places like Instagram Explore and Reels, especially to teens, who by default have the strictest recommendation settings applied to what they see.”
The tech companies are expected to hold a public consultation on the proposed codes, and a draft of the code must be submitted to the commissioner by October, with final codes due by 19 December.
The commissioner will then decide whether to accept them, at which point they will become enforceable in mid-2025.
Whether the process will result in a new code is not certain. After seeking industry input on terrorist content and child abuse material in 2022, the eSafety commissioner brought in two mandatory standards instead.
As Guardian Australia reported last month, the commissioner also opted against requiring end-to-end encrypted communications services to scan for such content if it would weaken their services.