Freedom of speech issues are unavoidable when you’re running a social network—and they’re often less than clear-cut.
Just ask Mark Zuckerberg, who says he regrets giving in to alleged pressure from the Biden administration to censor Facebook and Instagram content during the COVID pandemic. Writing to U.S. House Judiciary Chair Jim Jordan (R-Ohio), the Meta CEO said the White House had in 2021 “repeatedly pressured our teams for months to censor certain COVID-19 content, including humor and satire” and “we made some choices that, with the benefit of hindsight and new information, we wouldn't make today.”
“I believe the government pressure was wrong, and I regret that we were not more outspoken about it,” Zuckerberg wrote.
Is it right to err on the side of heavy-handedness when moderating content during a raging pandemic that features life-threatening disinformation? There’s no easy answer to that ethical dilemma, and different legal systems would lean different ways. Is a U.S. administration pushing its luck by urging such heavy-handedness, given the constraints of the First Amendment? Maybe; it depends on whether the government tried to legally compel Meta or merely to influence it (as seems to have been the case).
Either way, Jordan and other House Republicans are calling the letter a “big win for free speech.”
Which brings us back to the subject of Telegram and its CEO Pavel Durov, who has been detained for questioning in France, in relation to the vast amount of criminal activity that takes place on the platform. Elon Musk and many others are characterizing Durov as a free speech martyr, but, as I wrote yesterday, it’s hard to judge those claims without looking at the specific allegations against the Russian-French-Emirati-Saint-Kitts-and-Nevis billionaire.
We have those now, as Paris prosecutors published the potential charges late yesterday (by the end of Wednesday, Durov must either be indicted or released).
There are a dozen potential charges, several of which relate to Durov’s alleged “complicity” in crimes that take place on Telegram: possessing and distributing child sexual abuse material, drug dealing, organized fraud, and what appears to be the distribution of hacking tools.
As with Section 230 of the U.S. Communications Decency Act, EU law gives service providers like Telegram immunity from liability for illegal content on their platforms, as long as they don’t know about it and act quickly to remove it once made aware of it.
Unfortunately for Durov, he is also suspected of complicity in “web-mastering an online platform in order to enable an illegal transaction in [an] organized group.” If true, that would mean he knew about and indeed encouraged the criminal activity on Telegram, making it a feature rather than a bug. He is also suspected of “criminal association with a view to committing a crime” and of laundering organized-crime proceeds.
These very serious potential charges take the case in quite a different direction from what most people would recognize as a free-speech debate. However, the same can’t be said for three items on the prosecutors’ list that say Durov is suspected of breaking French laws around providing and importing cryptographic tools.
A quick reminder: Telegram may pitch itself as a secure messaging service, but its end-to-end encryption system is homegrown; it isn’t on by default; it can’t be activated for Telegram’s popular group chats; and Telegram has never published its server-side code. In the words of cryptography guru Matthew Green, Telegram’s encryption is only “quite possibly” secure.
But the French authorities don’t like that encryption, because Telegram apparently never notified them about it. French law allows the free use of encryption, but demands that importers, exporters, and suppliers of cryptographic systems first notify the French Cybersecurity Agency, at which point they are bound to secrecy about their dealings with the agency. Some argue that this is so the authorities can demand a backdoor in the system, allowing them to read protected messages, though it’s worth noting that the government has repeatedly failed to win the legal right to do so.
The prosecutors’ list also says Durov is suspected of “refusal to communicate, at the request of competent authorities, information or documents necessary for carrying out and operating interceptions allowed by law.” This could be a reference to undermining encryption, though the French legal expert Florence G’sell reads it as being about “refusing to disclose user information that was requested in the course of criminal investigations.”
Encryption is a privacy issue and, by extension, a free-speech issue—if people can’t communicate in private, they don’t feel free to fully express themselves. So this case does touch on free speech. However, as G’sell noted in an X thread, the potential charges around encryption are “relatively minor” and, overall, the case is really about “providing technical means for criminal activities, rather than social media regulations.”
One point in closing: It’s also worth remembering that Telegram was once the target of a ban in Durov’s native Russia, until it wasn’t, and the reasons for the rapprochement between Durov and the Kremlin remain opaque. Many opposition activists in Russia don’t trust Telegram, saying their conversations are monitored by state security.
Is this a free-speech case? Partly—but certainly not as clearly as what Zuckerberg has had to deal with. Is it wise to make Durov the poster boy for the defense of free speech? I wouldn’t bet on it.
More news below.
David Meyer
Want to send thoughts or suggestions to Data Sheet? Drop a line here.