United Kingdom leaders are pushing forward with a massive online censorship bill that, thanks to the lobbying of a group of lawmakers over the weekend, has been made significantly harsher with threats of imprisonment for tech platform managers who run afoul of the complicated regulations.
The Online Safety Bill has been under construction in the U.K. Parliament and various government committees for nearly a year. The massive bill (the current version spans 260 pages) establishes "duty of care" responsibilities for tech platforms to keep what the government deems "online harms" (which is broader than just violent or pornographic content) out of the view of children.
The bill flat-out forces platforms to serve as censors or face significant fines—£18 million ($22 million) or 10 percent of a company's global revenue, whichever is higher. The rules will be overseen by the Office of Communications (Ofcom), the U.K. government's media regulator.
Reason has warned about this bill in the past, particularly given the country's willingness to use the law to punish people who say things the government deems "offensive." Last year, Joseph Kelley, a citizen of Glasgow, Scotland, tweeted an insult about an elderly military veteran who had become a "national inspiration" for his resilience in accepting the coronavirus lockdowns (and who later died of COVID-19 complications because he was unable to be vaccinated for health reasons). Kelley was prosecuted under a British law against "grossly offensive" messages and sentenced to 150 hours of community service.
Over the weekend, a group of conservative lawmakers succeeded in forcing changes to the Online Safety Bill that make it even more punishing, and they were willing to sink the whole bill to get their way. Brits probably would have been better off if they had. Instead, the bill is now being amended to add a potential two-year prison sentence for managers of platforms who ignore enforcement orders from Ofcom to remove or censor content, or to otherwise make sure children don't have access to it.
To be clear, just in case what happened to Kelley doesn't spell it out, we're talking about censorship above and beyond pornography or content that would fall under unprotected speech, even under U.K. law. Matthew Feeney, a former Reason editor and Cato scholar, now serves as head of tech and innovation at the U.K.'s Centre for Policy Studies. The think tank warns that the bill would "do far more harm than good in its current form, and should either be scrapped or seriously amended," and that was before the current proposed changes to make it even harsher.
Feeney explains that while the bill has been amended to remove a contentious provision allowing orders to censor "legal but harmful" content online, those powers remain in the bill for any web space where children may have access. That will create a massive compliance problem that will almost certainly encourage platforms to censor all sorts of speech. Using the example of online discussion of suicide, Feeney writes:
It might initially seem obvious what youth suicide promotion content looks like. But a video that promotes suicide can be harmful in one context (e.g encouraging young people to end their lives) and helpful in another (e.g a mental health charity showcasing the content as part of a seminar series on youth mental health). A social media company seeking to allow the latter but not the former would likely remove both if NC2 makes its way into the final version of the legislation. The result would be fewer online resources available for children struggling with suicidal thoughts, eating disorders, harassment, abuse, and other difficult issues that require online firms to adopt nuanced and flexible content moderation practices in order to be effective.
In other words, to reduce the risk of earning the ire of Ofcom regulators, tech platforms under this law are likely to censor so broadly that even information genuinely helpful to minors gets removed simply because it touches a controversial subject.
Oh, and what is harmful content? Whatever the U.K. secretary of state for digital, culture, media, and sport decides. The bill's text literally lets one government official declare content harmful to minors and thereby require social media platforms to take steps to keep children from being exposed to it.
Who the bill may affect is not entirely clear. Wikimedia Foundation Vice President of Global Advocacy Rebecca MacKinnon said she's concerned that the bill will hamper the work of Wikipedia, which is dependent on volunteer editing. The BBC further notes that while people might think this is all about Twitter, Facebook, and social media platforms, there are many more ways for people to communicate with each other online. Would a multiplayer game have to monitor its players' speech? Would anybody operating on a private online server be held responsible for what users share with each other?
"The cure is very much worse than the disease," Feeney observes. We have already seen many examples of social media platforms broadly censoring content about various controversial subjects, often to please government regulators. The Online Safety Bill will make it government policy to tell tech platforms what they can allow people to talk about, using the excuse that children might be exposed to inappropriate content. The bill chills speech, and it should be stopped.