The arrest of the Telegram CEO Pavel Durov in France this week is extremely significant. It confirms that we are deep into the second crypto war, in which governments are systematically seeking to prosecute developers of digital encryption tools because encryption frustrates state surveillance and control. While the first crypto war in the 1990s was led by the United States, this one is being led by the European Union — now a regulatory superpower in its own right.
What these governments are insisting on, one criminal case at a time, is no less than unfettered surveillance over our entire digital lives.
Durov, a Russian-born French citizen, was arrested in Paris on Saturday and has since been indicted. You can read the French accusations here. They include complicity in drug possession and sale, fraud, child pornography and money laundering. These are extremely serious crimes — but note that the charge is complicity, not participation. The meaning of that word “complicity” seems to be revealed by the last three charges on the sheet: Telegram has been providing users with a “cryptology tool” unauthorised by French regulators.
In other words, the French claim is that Durov developed a tool — a chat program that lets users switch on some privacy features — used by millions of people, and some small fraction of those millions used the tool for evil purposes. Durov is therefore complicit in that evil, not just morally but legally. This is an incredibly radical position. It is a charge that could be laid against almost every piece of digital infrastructure developed over the past half century, from Cloudflare to Microsoft Word to TCP/IP.
There have been suggestions (for example by the “disinformation analysts” cited by The New York Times this week) that Telegram’s lack of “content moderation” is the issue. There are enormous practical difficulties with having humans, or even AI, effectively moderate millions of private and small-group chats. But the implication here seems to be that we ought to accept — even expect — that our devices and software are built for surveillance and control from the ground up: both the “responsible technology” crowd and law enforcement believe there ought to be a cop in every conversation.
It is true that Telegram has not always been a good actor in the privacy space, denigrating genuinely secure-by-design platforms like Signal while granting its own users only limited privacy protection. Telegram chats are not end-to-end encrypted by default, which leaves users exposed to both state surveillance and non-state criminals. Wired magazine has documented how the Russian government has been able to track down users over their apparently private Telegram conversations. For that matter, it would not be surprising to learn that there are complex geopolitical games being played here between France and Russia.
But it would be easier to dismiss the claims made against Durov as particular to Telegram, or as dependent on some specific action of Durov as an individual, if he were alone in being targeted as an accomplice to criminal acts simply because he developed privacy features for the digital economy.
The Netherlands has imprisoned the developer Alexey Pertsev, holding him responsible for the malicious use of Tornado Cash, the cryptocurrency privacy tool he developed. Again, Pertsev was not laundering money himself; he built a tool to protect every user’s privacy. The United States has arrested the developers of Samourai Wallet, a Bitcoin privacy product, also for facilitating money laundering.
The arrest of Durov suggests that the law enforcement dragnet is being widened from private financial transactions to private speech. If it is a crime to build privacy tools, there will be no privacy.
Should developers of platforms like Telegram be held responsible for how users use them? Let us know your thoughts by writing to letters@crikey.com.au. Please include your full name to be considered for publication. We reserve the right to edit for length and clarity.