Telegram has announced it will begin providing users’ IP addresses and phone numbers to authorities who present search warrants or valid legal requests. The update to its privacy policy is intended to “discourage criminals,” according to CEO Pavel Durov, who made the announcement in a post on the platform.
Durov emphasized that the move targets only a small fraction of users involved in illicit activities, stating, “While 99.999% of Telegram users have nothing to do with crime, the 0.001% involved in illicit activities create a bad image for the entire platform, putting the interests of our almost billion users at risk.”
This marks a significant shift for Durov, the Russian-born co-founder of Telegram, who was detained last month in France and later charged with enabling criminal activities on the platform, including the spread of child abuse images and drug trafficking. He was also charged with failing to comply with law enforcement. Durov denies the charges, calling his arrest “surprising” and “misguided.”
Telegram has faced mounting criticism over its handling of illegal content, including child pornography, misinformation, and terror-related material. That criticism has been exacerbated by a feature allowing groups of up to 200,000 members, far larger than the 1,000-member limit on Meta-owned WhatsApp.
The platform recently came under fire for hosting far-right channels that contributed to violence in English cities. Separately, Ukraine banned Telegram on state-issued devices due to concerns over Russian influence.
The arrest of Durov has sparked debate about free-speech protections on the internet, particularly regarding whether Telegram remains a safe space for political dissidents. John Scott-Railton, a senior researcher at the University of Toronto’s Citizen Lab, noted that the policy shift is causing alarm in regions like Russia, Belarus, and the Middle East, where Telegram has been a platform for people to share political views.
Critics are questioning whether Telegram will now cooperate with repressive regimes, though the company has not clarified how it will handle such demands in the future.
Cybersecurity experts point out that while Telegram has previously removed certain groups, its moderation of extremist and illegal content remains far weaker than that of other social media platforms. Prior to the policy change, Telegram reportedly provided user information only in cases involving terror suspects.
Durov stated that Telegram is now using a “dedicated team of moderators” and artificial intelligence to conceal problematic content in search results. However, experts like Daphne Keller from Stanford University’s Center for Internet and Society warn that merely hiding such content may not meet legal requirements in Europe and elsewhere. She argued that Telegram needs to remove illegal material entirely and notify authorities, particularly in cases of child sexual abuse content.
Keller also questioned whether Telegram’s new policy would satisfy law enforcement agencies seeking deeper insights into users’ communications and activities. “It sounds like a commitment that is likely less than what law enforcement wants,” she said.