Ofcom investigating Telegram over child sexual abuse material concerns

The UK media regulator has launched an investigation into Telegram over concerns it may be failing to prevent child sexual abuse material (CSAM) being shared.

Ofcom said on Tuesday it was investigating the popular messaging service after gathering evidence suggesting CSAM was present and being shared on the platform.

Under the Online Safety Act, user-to-user services operating in the UK must have systems in place to prevent individuals from encountering CSAM and other illegal content, as well as mechanisms to tackle it – or risk huge fines for breaches.

Telegram said in a statement that it “categorically denies Ofcom’s accusations”.

“Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with [non-governmental organisations],” it told the BBC.

The company added: “We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy.”

The investigation is part of a wider crackdown by Ofcom on services it suspects could be flouting the UK’s sweeping online safety requirements – including toughened-up rules for tech firms to tackle CSAM, which it is illegal to possess or share in the UK.

“Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities,” said Suzanne Cater, director of enforcement at Ofcom.

She added that while there had been progress in tackling CSAM on smaller services, including file-hosting and sharing platforms, the issue “extends to substantial platforms too”.

Children’s charity the NSPCC welcomed Ofcom’s Telegram probe.

“Recent NSPCC research revealed around 100 child sexual abuse image offences are being recorded by police every day,” said Rani Govender, its associate head of policy.

“The scale of this abuse is stark and we strongly welcome Ofcom ramping up action to tackle it, including opening this investigation into Telegram.”

The probe was also welcomed by the Internet Watch Foundation (IWF), which works to identify and remove CSAM online, including on Telegram.

IWF communications director Emma Hardy said the organisation shared concerns about “bad actor networks” on the platform, and “that not enough is being done to prevent known, detected, child sexual abuse imagery from being distributed”.

She said that while Telegram had taken some measures, for these “to be truly effective, they need to do more”.

This, Hardy said, should see safeguards expanded across Telegram, including to chats users can protect with end-to-end encryption.

Broader action

Ofcom said it launched its probe into Telegram after being contacted by the Canadian Centre for Child Protection over the alleged presence and sharing of CSAM on the messaging app.

It said it had also begun investigations into the services Teen Chat and Chat Avenue over potential grooming risks raised through its work with child protection agencies.

“Teen-focused chat services are too easily being used by predators to groom children,” Cater said.

“These firms must do more to protect children, or face serious consequences under the Online Safety Act.”

The firm behind Teen Chat told the BBC it disagreed with Ofcom’s position.

It said systems such as active human moderation, illegal content reporting and chat filters made “platform conditions less than ideal for [child sexual abuse and exploitation] to take place on”.

But the firm added that while it worked with Ofcom and did its best to prevent illegal activity, “at some point there is only so much small websites like ours can do”.

“We are waiting to hear back from [Ofcom] and will work with them further but we have almost reached the limit of what can be reasonably expected from a platform such as ours,” it noted.

The Act’s illegal content duties, which took effect in March 2025, require so-called user-to-user services like messaging apps and social networks to prove they are tackling “priority illegal content”.

This includes CSAM, terrorism, grooming and extreme pornography.


Ofcom has issued several fines to providers accused of failing to comply with its duties for illegal content or age checks.

It has the power to fine companies £18m or 10% of their global revenues – whichever is higher – where it finds non-compliance.

But its rules and enforcement actions have been met with resistance by some firms.

US message-board 4chan has recently mocked the regulator’s threats of action and fines with hamster memes.

Ofcom said on Tuesday that one file-sharing service it had contacted with concerns about its systems to deal with illegal content had made “material improvements” to comply with its duties.

Additional reporting by Tamzin Kraftman


