Apple is introducing a new iMessage feature in Australia that will allow children to report nude images and videos sent to them directly to the company, which could then report the messages to police.
The change comes as part of Thursday’s beta releases of the new versions of Apple’s operating systems for Australian users. It extends the communication safety measures that have been turned on by default for users under 13 since iOS 17 but are available to all users. Under the existing safety features, an iPhone automatically detects images and videos containing nudity that children might receive or attempt to send in iMessage, AirDrop, FaceTime and Photos. The detection happens on-device to protect privacy.
If a sensitive image is detected, the young user is shown two intervention screens before they can proceed, and given the offer of resources or a way to contact a parent or guardian.
With the new feature, when the warning comes up, users will also have the option to report the images and videos to Apple.
The device will prepare a report containing the images or videos, as well as messages sent immediately before and after the image or video. It will include the contact information from both accounts, and users can fill out a form describing what happened.
The report will be reviewed by Apple, which can take action on an account – such as disabling that user’s ability to send messages over iMessage – and also report the issue to law enforcement.
Apple said it planned to roll out the feature initially in Australia in the latest beta update, and that it would be released globally in the future.
The timing of the announcement, and the choice of Australia as the first region to receive the feature, coincides with new codes coming into force: by the end of 2024, tech companies will be required to police child abuse and terror content on cloud and messaging services that operate in Australia.
Apple had warned that the draft code would not protect end-to-end encryption and would leave the communications of everyone who uses the services vulnerable to mass surveillance. The Australian eSafety commissioner ultimately watered down the codes, allowing companies that believe compliance would break end-to-end encryption to demonstrate alternative actions they would take instead to tackle child abuse and terror content.
Apple has faced strong criticism from regulators and law enforcement across the globe over its reluctance to compromise end-to-end encryption in iMessage for law enforcement purposes. Apple abandoned plans to scan photos and videos stored in its iCloud product for child sexual abuse material (CSAM) in late 2022, eliciting further rebuke. Apple, WhatsApp and other advocates for encryption say that any backdoors to encryption endanger users’ privacy globally.
The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) had accused Apple of vastly undercounting how often CSAM appears in its products, the Guardian revealed in July.
In 2023, Apple made just 267 reports of suspected CSAM on its platforms worldwide to the National Center for Missing & Exploited Children (NCMEC) – vastly fewer than other tech giants, with Google reporting more than 1.47m and Meta more than 30.6m, according to the NCMEC’s annual report.
• In the US, call or text the Childhelp abuse hotline on 800-422-4453 or visit their website for more resources and to report child abuse. For adult survivors of child abuse, help is available at ascasupport.org. In the UK, the NSPCC offers support to children on 0800 1111, and to adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.