The Guardian - UK
Technology
Dan Milmo, global technology editor

Molly Russell: how family are helping shift narrative on online safety

Molly Russell, who died in November 2017. Photograph: Family handout/PA

The online safety bill’s progress through parliament has been paused, but it is hard to see that delay lasting much longer after the conclusion of the Molly Russell inquest.

The regulatory landscape for the online world is undergoing significant change in the UK and Molly Russell’s family have contributed to that shift after becoming prominent campaigners for improved internet safety.

The bill has specific provisions for protecting children and Molly’s father, Ian, called for it to be introduced “urgently” after the hearing.

According to one figure closely involved in the development of the online safety bill, the family has played a crucial role in making the case for the landmark legislation. “The Russell family have made an unavoidable case for the online safety bill,” says Beeban Kidron, a crossbench peer who sat on the joint parliamentary committee that scrutinised the bill.

Kidron paid tribute to Ian Russell, a 59-year-old TV director who has become an important voice on internet safety. The family has set up the Molly Rose Foundation, which is dedicated to connecting under-25s with the mental health support they need.

“Ian Russell’s campaigning ensured we heard evidence never before heard in open court,” says Kidron.

The two-week inquest detailed how Molly was able to view content related to suicide, depression, self-harm and anxiety on Instagram and Pinterest, with a child psychiatrist witness telling the hearing that the posts seen by the teenager were not safe.

Algorithms, which curate a user’s online experience, recommended 34 Instagram accounts to Molly that were either “sad or depressive related”, while Pinterest sent a message to Molly’s email address recommending “10 depression pins you might like”. A Pinterest executive admitted the material in those emails was “the type of content that we wouldn’t like anyone spending a lot of time with”.

The online safety bill places a duty of care on tech companies to shield children from harmful content and systems. They must conduct a professional risk assessment of the potential dangers posed to children by their platforms, and produce proposals to mitigate those risks.

Ofcom, the communications watchdog, will vet those proposals and monitor the companies’ adherence to them. Breaches of the bill can be met with fines of up to £18m or 10% of a company’s worldwide revenue. Instagram’s owner, Meta, recorded a turnover of $118bn (£106bn) last year.

The bill’s progress through parliament has been paused but it is expected to resume in late October with the child safety provisions staying intact, if not strengthened. Liz Truss has said she wants the bill amended to ensure greater protections for free speech, but measures to protect the young will stay. Anyone who has sat through the two-week inquest would find it hard to imagine any other outcome. On Friday the culture secretary, Michelle Donelan, committed to the bill and described it as “the answer” to preventing such a tragedy occurring again.

On Thursday, the children’s commissioner for England expressed fears that the Molly Russell case could be repeated, after research showed 45% of children aged eight to 17 have seen harmful content online, including material promoting self-harm and suicide.

William Perrin, a trustee of the Carnegie UK charity, says the bill will “go a very long way to addressing a fundamental problem, which is that social media platforms do not have in place proper systems and processes to protect children from harm”.

Another significant change to child internet safety is already in place, under the age-appropriate design code (AADC), which was introduced last year. It is a regulation that prevents websites and apps misusing children’s data, including in ways that are “detrimental” to their wellbeing.

Kidron, the architect of the AADC, says an example of detrimental use would be using data generated by a child’s online activity to steer them down harmful content rabbit holes.

“The code was not in place at the time that Molly died, when Meta was profiling her behaviour to deliver detrimental material to her on an industrial scale. What they did then is now a contravention of the code.”

• In the UK and Ireland, Samaritans can be contacted on 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org.
