
Ofcom cracks down on harmful content with new safety regulations to protect children online



Ofcom has released the final version of its landmark online safety regulations, heralding what it calls ‘transformational new protections’ for children navigating the digital world.

These measures, which form part of the broader Online Safety Act, are intended to significantly reduce children’s exposure to harmful online content.

Under the new rules, platforms must make critical changes by 25 July 2025, including implementing robust age verification checks and adapting algorithms to filter out content that may be harmful to children. Sites that host pornography, or content that encourages self-harm, suicide, or eating disorders, are among those required to take stronger action to restrict access by under-18s.

Failure to comply could result in substantial fines or, in extreme cases, court orders to block access to services in the UK.

John Lucey, VP EMEA North of Cellebrite, commented: “Ofcom’s new legislation for child safety is a vital step to protect our society from harmful content being disseminated by malicious perpetrators, demanding more robust protections from websites and platforms to screen their content. With the ongoing increase of digital content, organisations need to get out in front of this issue and ensure that their platforms are safe and protected.”

“Technology has a transformative role to play in child safety, helping to analyse, investigate and identify suspicious behaviour and malicious content through data. Our Pathfinder tool, for example, reveals the connections between data to streamline casework and enable the efficient analysis of crucial information to advance investigations.”

“This is vital to eradicate harmful content on the internet and digital devices, protecting children, but also the investigators hunting down these perpetrators by identifying and filtering harmful materials.”

The codes mandate over 40 practical measures aimed at enhancing online safety for children. These include adjusting algorithms to limit exposure to harmful content, enforcing robust age verification for restricted material, and swiftly removing harmful content.

Platforms must also simplify terms of service for younger users, offer opt-outs from potentially unsafe group chats, and ensure accessible support for children who encounter distressing content. Additionally, they are required to appoint a named individual responsible for children’s safety and conduct annual reviews of child safety risks.

Reactions to the announcement have been mixed. The NSPCC welcomed the move as “a pivotal moment for children’s safety online,” while also urging further action—especially around encrypted private messaging, which remains difficult to monitor.

Dame Melanie Dawes, Chief Executive of Ofcom, described the introduction of the new Codes as a “gamechanger,” underscoring that the protections are legally binding. “If companies want to serve the British public—particularly young people—they must fundamentally change how their services operate,” she said in an interview on BBC Radio 4’s Today programme.

As the new guidelines await parliamentary approval, they mark a defining moment in the UK’s effort to make the internet a safer space for children—a commitment that will now be enforceable by law.
