Telegram has made significant changes to its moderation policies after the arrest of CEO Pavel Durov in France for allegedly failing to control illegal activity on the platform. The messaging app, long known for its lax approach to content moderation, has now extended the reach of its moderators to include private chats, allowing users to flag illegal content for review. This marks a notable shift from the app’s previous stance, which largely treated private groups and chats as off-limits.
The policy change, quietly reflected in an update to Telegram’s FAQ page, comes in the wake of mounting pressure from French authorities, who allege that the platform has become a haven for criminal activity. Durov was arrested in France last month over allegations that he allowed illegal content to proliferate on the app. He has denied the accusations, calling them unfounded, but has since been placed under formal investigation and remains barred from leaving the country while the case proceeds.
In a Telegram post earlier this week, Durov acknowledged that the app’s rapid expansion has “made it easier for criminals to abuse our platform,” and he vowed to introduce reforms to better police content. The new moderation policy could reshape Telegram’s reputation, as it has often been criticized for allowing all types of content—including illegal activities—without intervention.
By expanding moderation to private chats, Telegram is signaling a major shift in its operations, a move that may ease concerns from regulators and critics who have long viewed the platform as resistant to oversight. The change reflects growing scrutiny of how messaging apps handle illicit activity amid regulatory crackdowns, particularly in Europe.
Despite Durov’s legal troubles, Telegram remains one of the most popular messaging platforms worldwide, with roughly 950 million users by the company’s own recent count. The new rules may mark a turning point in how the app manages its vast user base while addressing increasing regulatory demands.