Government Surveillance: The Curious Case of Pavel Durov

  • 13 Jun 2025
  • 3 Mins Read
  • by Anne Ndungu

In August 2024, Telegram founder Pavel Durov arrived at Paris’s Le Bourget Airport, en route to Finland, only to be met by French police. He was arrested on the spot and held in solitary confinement for four days. His detention in a small, windowless cell with a thin mattress and constant overhead light made headlines worldwide. All his devices were seized, and he was allowed only calls to an assistant to arrange legal help. Over 10 million people signed a petition for his release, although major human rights groups remained publicly silent.

French authorities accused Telegram of failing to answer judicial requests and thereby facilitating serious crimes, including child pornography, drug trafficking, and organised crime, by letting offenders use the encrypted platform. The case illustrates a growing trend of holding tech founders partly responsible for crimes committed on their platforms. Durov, however, insists that France never served a binding court order until after his detention.

Since 2023, Telegram’s Belgium-based team has processed all valid Digital Services Act (DSA) requests, handing over only IP addresses and phone numbers under a judge’s signature. Private messages remain end-to-end encrypted, and Telegram cannot, and does not, offer backdoors to any government.

Durov holds Russian, French, Saint Kitts and Nevis, and Emirati passports. He became a naturalised French citizen in August 2021 under the “Foreign Emeritus” programme, which honours non-residents who boost France’s global standing. He is now under “judicial control” and cannot freely leave France: he is allowed only supervised trips (e.g., to Dubai) and was recently denied entry to the U.S. while investigations continue.

Durov’s case raises questions about state power and platform privacy, the misuse of encryption-backdoor laws, and the precedent of personally prosecuting tech founders for user-generated content. Governments worldwide are pushing for “backdoors” into encrypted apps; France even considered banning end-to-end encryption last year. Such measures would weaken overall security, exposing every user, law-abiding or not, while criminals simply switch to other tools.

According to Durov, Telegram applies uniform moderation globally: content is removed or restricted only when a valid court order demands it. There is no country-specific takedown list, nor pre-emptive political censorship. Channels appear only if users deliberately subscribe, avoiding algorithmic steering. Since his arrest, Telegram has tightened its moderation practices, removing features like “People Nearby”, improving its responses to reports of illegal content, and sharing data with law enforcement when legally required.

Durov recently alleged that the head of France’s foreign intelligence service, the Direction Générale de la Sécurité Extérieure (DGSE), Nicolas Lerner, pressured him to censor Telegram channels, specifically targeting Romanian conservative voices, and that he refused. The DGSE denies any politicised intent, stating that its engagement with Durov was to uphold responsibilities concerning terrorism and child-abuse prevention.

Despite all this, secret filings in Berlin’s courts have revealed that French intelligence had access to Telegram’s internal data, an unprecedented level of surveillance involvement. According to the filings, the DGSE tapped Telegram’s infrastructure across Europe: France used Europe’s mutual legal assistance treaties (MLATs) to secure “read-only” API tokens for Telegram’s EU servers, which allowed agents to access subscriber lists, query IP logs, and cross-reference SIM-registration metadata without touching encrypted message content.

Now, German judges demand full transparency and proportionality: any foreign-sourced data must be backed by clear court orders and a documented chain of custody, or it will not be admitted. This sets a high bar for any government seeking covert access to encrypted platforms.

As quantum computers advance and the use of artificial intelligence (AI) grows, encryption methods must evolve towards quantum-secure algorithms.

Yet the biggest privacy risk today comes from undisclosed “zero-day” exploits (e.g., Pegasus spyware) that can infect devices without the user’s knowledge.

Durov says he rarely uses a phone: he carries no SIM card and prefers laptops or tablets for work, finding that minimal mobile use reduces both distraction and vulnerability.

The outcome of this case will influence future rules on digital privacy, state surveillance, and the liability of tech executives worldwide.

(Material for this article was sourced from Pavel Durov’s interview with Tucker Carlson, which premiered on June 9, 2025.)