A French court charged Telegram CEO Pavel Durov with a series of crimes, alleging he was “complicit” in a litany of illegal activities that took place on the platform. Authorities accused Durov of aiding in drug trafficking, the spread of child sexual abuse material, and organized crime. Durov is also charged with obstructing the investigation into his company.
The 39-year-old’s arrest is among the first instances in which the CEO of a major online platform is being held personally and criminally liable for content hosted on that platform. Telegram’s function as a secure, encrypted messaging service, combined with the company’s aversion to content moderation, made it a breeding ground for the illegal behavior alleged by French prosecutors, legal experts explained.
Other online platforms and social media companies may have faced similar challenges with unsavory, even illegal, activity on their platforms, but never to the extent that Telegram did, according to Megan Squire, a deputy director at the Southern Poverty Law Center who studies online extremism. “Telegram is an outlier in a lot of ways,” she said.
Telegram operates differently from other platforms because it has very little content moderation and a reputation for being uncooperative with law enforcement and watchdogs. Durov has said that child sexual abuse material and incitement to violence are prohibited on Telegram.
Online extremist groups “love the lack of content moderation,” said Squire, who uses software to track 30,000 extremist groups on Telegram. “It’s kind of a free-for-all there.”
Durov’s arrest and the crackdown on Telegram seemed to shatter what was often considered a cornerstone of online regulation: Platforms are not publishers. This delineation is meant to protect platforms from being held responsible for illegal activity that occurs on their services without their knowledge. However, it also means platforms have a responsibility to take measures to stop any lawbreaking once they become aware of it, according to Alan Walter, a technology lawyer at the Paris-based law firm Walter-Billet. That’s where Telegram ran afoul of French authorities.
“If you do not cooperate with French authorities when they are formally requesting that you disclose the information related to some posted content, then you are just obstructing the French police or French legal system,” Walter told Fortune. “So this is a criminal offense, per se.”
Telegram did not respond to a request for comment. In a statement posted to its Telegram channel on Sunday, before Durov was charged, the company said he had “nothing to hide” and that it followed EU law. “It is absurd to claim that a platform or its owner are responsible for abuse of that platform,” the company wrote.
For all the challenges other online platforms and social media companies may have had with content moderation, they were rarely as obstinate as Telegram. Durov’s arrest was “due to the specificity of the way Telegram operates,” said Ahmed Baladi, co-chair of the cybersecurity practice of the law firm Gibson Dunn, who works out of the Paris office.
In addition to its lack of content moderation, Telegram was often unresponsive to law enforcement, refusing to acknowledge subpoenas and court orders. Those requests wound up in a dedicated email inbox that was rarely checked, according to the Wall Street Journal.
Other major platforms such as X and Meta never took as hands-off an approach as Telegram, according to Squire. She attributed that companywide ethos to Durov’s “extreme tech libertarianism.”
“That’s different from what we see with some other tech platforms,” Squire said.
Telegram has been somewhat more responsive to pushes from Apple and Google to moderate its content or else risk being removed from their mobile app stores. In 2018, Apple removed Telegram for a day from its App Store over what it said was a failure to curtail child pornography on the app. Telegram then adjusted its content moderation policies.
But even that stands in stark contrast to other platforms that dedicate vast resources to content moderation, including teams of human moderators who sift through some of the most egregious content posted on their sites. In the U.S., social media companies have also shown themselves to be somewhat responsive to government inquiry, testifying before Congress in January.
Those companies all behaved above board, meaning their CEOs were safe from any criminal proceedings, according to Walter. “I don’t see Zuckerberg in custody next month,” he said.