
In a tightening of its push to safeguard minors online, Australia’s online safety regulator eSafety has added Twitch to its social media ban on users under 16. The popular streaming platform must now take “reasonable steps” to comply or face fines of up to AUD 49.5 million ($32m or £25m).
Starting Dec. 10, 2025, ten platforms will be required to take reasonable steps to prevent users under the minimum age of 16 from holding accounts: Twitch, Kick, Instagram, Threads, Facebook, Snapchat, X, YouTube, Reddit, and TikTok. It’s worth noting that users can still use Twitch and the other banned platforms for content that doesn’t require an account.

According to TechCrunch, a Twitch spokesperson confirmed that existing under-16 accounts in Australia will be deactivated from Jan. 9, 2026. Users outside Australia can still sign up for an account from age 13, but anyone under the legal age of adulthood in their region must have a parent or guardian’s consent.
According to eSafety’s assessment, Twitch qualifies as an “age-restricted social media platform” on the grounds that it has “the sole or significant purpose of online social interaction with features designed to encourage user interaction, including through livestreaming content.”
“Twitch is a platform most commonly used for livestreaming or posting content that enables users, including Australian children, to interact with others in relation to the content posted,” eSafety wrote in a press release. It also announced that Pinterest doesn’t meet the same criteria, since online social interaction isn’t a significant purpose of the service, and is therefore exempt for now.
eSafety has conducted platform assessments to help industry and families prepare for Australia’s new social media minimum age rules taking effect on Dec. 10, although final determinations on whether a service qualifies as age-restricted ultimately rest with the courts. It expects all online platforms operating in Australia to understand and comply with their legal obligations and has provided a self-assessment tool to support this process.
On the surface, Australia’s move appears reasonable, given the clear risks online platforms pose to young teens, from harmful content to cyberbullying and addictive behaviours. Even outside such restrictions, most major platforms officially require users to be at least 13 to sign up for an account, but there are often ways to bypass this requirement. eSafety aims to push these platforms to enforce stricter rules, potentially through verification via government IDs, face or voice recognition, or age inference from online activity.
However, a rigid ban may push kids towards more hidden online spaces rather than promoting safer, guided use. It also places most of the burden on tech companies, rather than on broader education and parental responsibility.

Several other countries have moved in a similar direction on social media safety, with varying degrees of enforcement. In Denmark, for example, the government has proposed a ban on social media use for children under 15. But here’s the catch: limited exceptions would allow 13- and 14-year-olds access with a parent’s approval. This measure is not yet in force, though.
In France, a 2023 law introduced a “digital age of majority” of 15, requiring platforms to verify users’ ages and obtain parental consent for anyone younger. Last month, Members of the European Parliament proposed an EU-wide ban on social media access for children under 16, aiming to enforce stricter age verification and increase platform accountability. A hearing on the proposal is scheduled for later this month.
In the United States, eight states have already enacted laws restricting minors under 18 from using social media without parental consent, despite First Amendment court challenges.
Australia’s regulatory approach is notably proactive and comprehensive. Its success, however, will depend on balancing enforcement with stronger digital literacy programs and better support for families, not on simply restricting access.