Ofcom investigating Telegram over child sexual abuse material concerns
The UK media regulator has begun examining Telegram following allegations that the messaging app may not be doing enough to stop the spread of child sexual abuse material (CSAM). Ofcom cited evidence indicating that CSAM was being circulated on the platform, prompting its inquiry. Under the Online Safety Act, platforms operating in the UK must implement measures to prevent users from accessing CSAM and other prohibited content, along with systems to address such material when it appears; otherwise, they could face substantial penalties.
Telegram’s response to Ofcom’s probe
“Telegram firmly rejects Ofcom’s claims,” stated the company in a release. “Since 2018, we have significantly reduced the public sharing of CSAM through advanced detection tools and collaboration with non-governmental groups.” The firm also expressed surprise at the investigation, suggesting it might be part of a wider campaign targeting platforms that prioritize free speech and privacy.
Broader enforcement actions by Ofcom
This investigation is part of Ofcom’s ongoing effort to ensure compliance with the UK’s comprehensive online safety laws. The regulator has previously targeted smaller platforms for not adequately addressing illegal content, including file-sharing services. However, Cater emphasized that the issue affects larger platforms as well. “Child sexual exploitation causes profound harm, and ensuring platforms combat this is a top priority,” she noted.
Recent research by the NSPCC highlighted that police record approximately 100 CSAM-related offenses daily. The charity applauded Ofcom's decision, stating it aligns with its mission to protect children. "We support increased action against platforms that allow the circulation of abusive content," said Rani Govender, the organization's policy head.
Ofcom’s investigation into Telegram was triggered after the Canadian Centre for Child Protection raised concerns about CSAM presence. The regulator has also launched inquiries into Teen Chat and Chat Avenue over potential grooming risks identified through partnerships with child protection authorities. “Teen-focused chat apps are frequently exploited by predators,” Cater warned. “These companies must demonstrate stronger safeguards or face consequences under the Online Safety Act.”
Online Safety Act’s requirements and penalties
Effective March 2025, the Online Safety Act imposes obligations on user-to-user services to prove they are addressing "priority illegal content," such as CSAM, terrorism, and extreme pornography. Non-compliance can lead to fines of up to £18 million or 10% of global revenue, whichever is greater. While Ofcom has previously penalized several providers, some companies have criticized the regulations; 4chan, for instance, recently mocked the fines with hamster-themed memes.
Ofcom confirmed that one file-sharing service it engaged with made notable improvements to meet its content standards. The regulator continues to expand its scrutiny of digital platforms, aiming to hold them accountable for safeguarding users from harmful material.