Experts are warning that tech companies need to step up their efforts further to stop the spread of child sexual abuse material (CSAM) on their platforms.
A parliamentary inquiry is investigating the capability of law enforcement to deal with the exploitation of children and examining the role technology providers have in assisting authorities.
The inquiry was established during the last parliament before being disbanded. It has now been revived, with the Federal Police, the eSafety Commissioner's office, non-government organisations (NGOs) and government agencies set to appear at a public hearing in Canberra on Tuesday.
'Transparency on the company's terms'
Acting eSafety Commissioner Toby Dagg will be appearing at Tuesday's hearing and says there is an urgent need for tech platforms to be more transparent with authorities.
"We know that there's a lot of good work being done in the tech industry, but transparency has been very much on the company's terms," he said.
"They've elected to create their own narratives for how they report on what they're doing to counter child sexual exploitation on their platforms.
"But there are big gaps in our knowledge, big gaps in terms of how we understand specific tools to be used and we have some gaps as well in terms of understanding how companies are investing in trust and safety.
"We have to be confident… that the industry actually is meeting the bar, that those kinds of standards are being attended to and extracting information from companies through compulsory transparency reporting mechanisms is one way to do that."
The commissioner's office recently issued Apple, Meta, Microsoft, Snap and Omegle with legal notices ordering them to disclose how they are addressing the proliferation of child sexual exploitation material on those platforms.
"It's happening right in front of us, it's happening right in the open, I think our instinct is to assume that it's happening on the dark web where we see so much other awful activity taking place," Mr Dagg said.
"That's why we're trying to shine a spotlight into those parts of the Internet that we all know as being part of our daily experience, to start to turn that tide and to really hold companies to account and show both what's failing, but also highlight what's working."
What is actually working?
The acting commissioner said tech companies are getting better at detecting accounts that may be used to share CSAM.
"Rather than just focusing on the detection of known child abuse material companies are investing in collecting behavioural signals that would point to a particular account as being suspicious or used for exploitative or abusive purpose," he said.
"We should expect companies to really take a forensic approach to examining how their messaging services, how their platforms more broadly are being used to facilitate the exploitation of children online."
Mr Dagg said programs like PhotoDNA — which create a unique signature for an image that can then be matched against signatures of other images — are fairly widely used to detect when known CSAM is uploaded and shared online.
"Though not nearly as widely as we expect it should," he said.
Project Karma CEO Glen Hulley will also appear at Tuesday's hearing. His organisation works with communities to provide education, investigate cases, and rescue and support victims of child sexual exploitation.
Mr Hulley said tech companies have made significant strides in tackling the spread of CSAM.
"There's a long way to go but compared to what it was like in 2012/2013, it was the Wild West," Mr Hulley said.
Project Karma works with Meta — the parent company of Facebook, Instagram and WhatsApp — as part of its trusted partner program, which gives the organisation a direct line to monitor for and report abuse material.
"(Tech companies have) done a lot of things, for instance, bringing in algorithms, AI, things that are always working in the background so that when this content comes onto their servers through a messaging platform or through an actual post on one of their news feeds or whatever, quite often that software or that AI algorithm will detect it automatically and remove it from the system," he said.
But while there have been improvements, both the acting commissioner and Mr Hulley said there's more that tech companies and the government can be doing.
Room for improvement
Acting Commissioner Dagg said while platforms have improved over the years at detecting when certain CSAM images and videos are being re-uploaded, there needs to be more consistent use of these technologies.
"There's been an inconsistent application of some of the tools that we know effective at detecting, for example, known child sexual abuse material," he said.
And companies need to dedicate more resources to finding out when new material is being shared.
"That's where we want to see a lot more investment and a lot more commitment," he said.
"We really want industry to step up to ensure that their platforms aren't being used as a vector to transmit that kind of content."
A former police officer, Mr Hulley agreed that companies need to be more transparent with law enforcement.
"I work closely with a lot of people in law enforcement and I hear the same things all the time from them about trying to get information from some of these big tech companies, it is quite difficult," Mr Hulley said.
Beyond technology companies, Mr Hulley sees a major problem in legislative constraints that leave non-government organisations disconnected from authorities, with both sides sometimes duplicating investigations and struggling to share intelligence.
"We're all operating in our little silos, that information isn't cross checkable and the with same law enforcement, they've got no idea what what we're seeing and vice versa."
He'd like to see governments make changes to foster collaboration between authorities and organisations like his.
"That is another step that needs to be looked at in how we can better manage intelligence across private and public sector."