Online safety experts will struggle to sound the alarm about harmful content if landmark legislation does not allow independent researchers to access data from social media platforms, campaigners have warned.
The government is being urged to adopt amendments to the online safety bill enabling researchers to access platform data in order to monitor harmful material. Access would be overseen by Ofcom, the communications watchdog, and would protect user privacy.
The amendments tabled by the Conservative peer James Bethell are supported by groups including the Center for Countering Digital Hate (CCDH) and the children’s safety charity the NSPCC. They are also backed by the Molly Rose Foundation, established by the family of Molly Russell, a 14-year-old who took her own life after viewing social media content related to depression, suicide and self-harm.
In an open letter to the government, the campaign groups said the bill was in “serious peril” of passing without the power to make social media platforms more transparent. The bill is passing through the Lords, with outstanding amendments put forward by peers due to be debated on Thursday as it enters the final stages of its legislative progress.
“We ask the government to support the data access amendments, supporting the researchers, academics and experts who work in real-time to empower and inform the British public about the online harms that affect their lives,” said the letter.
The letter adds that online safety researchers are already being stymied by Twitter, which is charging £33,000 a month for access to data that was previously free.
Imran Ahmed, the chief executive of CCDH, said that without Lord Bethell’s amendment “we might well end up with less transparent, less accountable and less safe social media platforms”.
The Bethell amendment would require platforms to follow a code of practice, set by Ofcom, governing independent researchers’ access to platform data and the kinds of harm they can investigate – with platforms then providing the requisite data for researchers to examine those issues. Researchers would also have to meet standards on independence and privacy before being appointed by Ofcom.
Concerns about the amendment include the mandating of access to sensitive data and the potential for unintended consequences of handing over such information to researchers.
A government spokesperson said: “The online safety bill is currently undergoing detailed scrutiny in the House of Lords, with a wide range of issues being debated. The government will continue to consider arguments made by peers at committee stage, and in the proceeding stages of the bill’s passage.”
Other amendments put forward in the Lords include strengthening online age checks to ensure children are not exposed to pornography, and requiring Ofcom to issue a code of practice on preventing violence against women and girls, which social media platforms would have to follow when implementing their duties under the bill.
The prospect of tougher age-checking measures has alarmed some platforms. The Wikimedia Foundation, which hosts Wikipedia, has warned it will not carry out age verification if required to do so by the act, raising the possibility it will fail to comply with the legislation. The bill requires sites such as Wikipedia to actively prevent children from encountering pornographic material, and in its current form refers to age verification as one possible tool for doing so.
Rebecca MacKinnon, the Wikimedia Foundation’s vice-president of global advocacy, told the Guardian that Wikipedia would be “out of compliance” with the act if it was required to monitor users’ ages, because of the organisation’s pledge to collect minimal data on users.
“The Wikimedia movement is creating and running platforms for very specific educational and public interest purposes. We’re nonprofit. We do not collect user data,” she said.