NAIROBI, Kenya - After years of resistance, Telegram has taken a major step toward tackling the spread of child sexual abuse material (CSAM).
The messaging app, known for its staunch focus on privacy, announced a partnership with the Internet Watch Foundation (IWF), a globally recognized body fighting online child exploitation.
This decision comes in the wake of mounting criticism and legal pressures, including the arrest of Telegram’s founder, Pavel Durov, in France earlier this year.
Telegram’s shift to collaborate with the IWF marks a significant departure from its previous stance. The app had long refused to join child protection initiatives, earning criticism for inadequate moderation of harmful content.
But pressure intensified after investigative reporting exposed Telegram as a platform enabling not just CSAM, but also drug sales, cybercrime, and fraud.
The arrest of Durov in August at a Paris airport was a turning point. French authorities accused him of failing to cooperate with law enforcement over illicit activities on Telegram.
Since then, the company has announced several changes, including sharing IP addresses and phone numbers of offenders with police, disabling problematic features like “people nearby,” and committing to publishing transparency reports—a standard practice among tech giants but one Telegram had previously resisted.
Derek Ray-Hill, interim CEO of the IWF, described Telegram’s decision as “transformational” but emphasized the road ahead. “This is just the first step in a much longer journey to make the platform safer for everyone,” he said.
For years, Telegram has marketed itself as a bastion of user privacy, often placing itself alongside encrypted messengers such as WhatsApp and Signal.
However, critics argue its approach has inadvertently enabled misuse. While Telegram claims to remove hundreds of thousands of pieces of abuse material monthly using internal systems, joining the IWF enhances its ability to block and detect known CSAM.
The move does more than bolster content moderation; it also draws fresh attention to long-standing questions about Telegram's security claims.
Despite often being perceived as an end-to-end encrypted service, Telegram applies end-to-end encryption only to its optional "secret chats." Ordinary chats use server-client encryption, meaning messages remain accessible on Telegram's servers and are potentially more vulnerable to interception.
Durov, a Russian-born entrepreneur with multiple citizenships, has pledged to overhaul Telegram’s moderation policies, stating he aims to turn the app “from an area of criticism into one of praise.”
Telegram’s newfound commitment to content moderation reflects broader pressures on tech companies to balance privacy with safety.
By partnering with the IWF, Telegram joins the ranks of major platforms taking action against CSAM, setting a precedent for how privacy-focused apps can address harmful content.
The IWF, one of the few organizations authorized to search for and remove child sexual abuse material, provides an evolving database to help online platforms detect and block such content.
This partnership could strengthen Telegram’s tools to prevent abuse while maintaining user trust.
With nearly 950 million users worldwide, Telegram’s shift signals a significant step toward accountability. The app remains especially popular in regions like Russia, Ukraine, and Iran, where its privacy features are highly valued.