Ministers should commit to a new version of the Online Safety Act which strengthens regulation in order to better protect children, a charity has said.
The Molly Rose Foundation has warned that the current implementation of the Act by the new online safety regulator, Ofcom, has been risk-averse and unambitious, and has exposed structural weaknesses in the legislation which it says need to be fixed.
In a report published to mark one year since the Act was passed, the charity said it was concerned that Ofcom’s draft regulatory plans were not robust enough in holding tech firms to account, and did not truly grasp the size and scale of online threats, including suicide and self-harm content.
It also suggested placing a new duty of candour on tech firms, which would require them to disclose information to the regulator and be open and proactive when new online harms emerge.
The Online Safety Act is due to start coming into effect next year, and places new duties on social media platforms to protect users, particularly children, from harmful content, with large fines for those who fail to abide by the rules.
Ofcom is currently drafting new codes of practice across a range of policy areas and content types, which platforms will be required to follow.
The Molly Rose Foundation was set up by the family of Molly Russell, who ended her life in November 2017 at the age of 14 after viewing harmful content on social media.
But Molly’s father Ian, who is the chair of the charity, said the rules still needed more work.
“Almost seven years after Molly’s death, we urgently need ministers to finish the job, with a strengthened Online Safety Act that makes clear measurable harm reduction is the North Star of this regime,” he said.
“While I firmly believe regulation is the best way to protect children from preventable harm, the reality is that timid regulation will cost lives.
“Ofcom has so far failed to grasp the nettle and respond decisively to preventable online harm.”
The charity’s chief executive, Andy Burrows, said: “By committing to strengthen the Online Safety Act, ministers can give confidence to parents and the country at large that credible, effective and decisive change is on the way.
“The Government should commit to a set of clear, effective changes that can build on the landmark Act and deliver the strong regulatory regime that our young people need and deserve.”
Alongside its report, the Foundation has published new research which it says shows parents and adults broadly support strengthening the online safety rules.
It said 84% of parents and 80% of adults backed a new version of the Act to bolster the regime, with 89% of adults saying they would like to see it introduced in the first two years of this Parliament.
Since coming to power in July, the Labour Government has already strengthened the Act: Technology Secretary Peter Kyle announced in September that the sharing of revenge porn was being upgraded to a priority offence, meaning platforms will now have to take proactive steps to remove it.
At the time, Mr Kyle said he was also “open-minded” about broadening the powers of the Act, including possibly placing criminal liability on named senior managers at social media firms in the event of severe breaches.
“I’m open-minded as to what powers need to evolve into the future and where liability rests,” he said.
“But I want it to be proportionate and I want it to be effective – I’m not interested in finger-pointing at people unnecessarily.
“What I want to do is drive and incentivise behaviour change among any company that has access to British society, so that it benefits society and that any risks are mitigated as much as possible.
“Any company that puts these principles first and foremost in a tangible way will find us a Government that is totally on their side and will partner with them to make sure that every British citizen can benefit from their products, but also the jobs and wealth that is created from them.
“But those that don’t prioritise those principles will find us an ever assertive force when it comes to keeping people safer.”
In response to the Molly Rose Foundation report, an Ofcom spokesperson said: “We agree that it’s time for tech firms to take action to protect their users, especially children.
“The regulations we will finalise in the coming months, once we have finished seeking the views of children, parents and bereaved families, will be the most comprehensive put forward by any regulator in the world.
“And we’re confident they’ll deliver a step change in children’s online safety.
“Children must be protected from seeing pornography, suicide and self-harm material, including by using highly effective age checks.
“Algorithms must not promote harmful content to children.
“While we’ve already seen some tech firms taking steps in the right direction, once the new duties start to come into force from December, they’ll have to do far more.
“And we won’t hesitate to take enforcement action if they fall short.”
In a further statement, Technology Secretary Mr Kyle said: “The Online Safety Act lays the foundations for a safer internet and in the coming months will protect against illegal content and harmful material for children.
“This Government will be watching closely to ensure the protections make the difference they promised.
“For too long safety has been an afterthought as technology is unleashed on our society; my mission is to turn this tide so safety is baked in from the start.
“We are already building on the Act; earlier this week we introduced new data laws that will help researchers gather critical evidence about online harms.
“This will be essential to informing our future action in protecting everyone online.”