Ofcom Publishes First Online Safety Act Codes of Practice

The UK’s online safety regulator, Ofcom, has taken a significant step forward in enforcing the Online Safety Act by publishing its first Codes of Practice to tackle illegal online harms. These new regulations set out clear expectations for online platforms, search engines, and other digital services to identify and remove illegal content proactively.

Online service providers now have until March 16, 2025, to complete risk assessments and must begin implementing the required safety measures immediately afterwards. Providers that fail to comply could face substantial fines (up to 10% of global revenue or £18 million, whichever is higher) or even site-blocking orders in the UK.

What Do the New Online Safety Codes Require?

Ofcom’s new guidance outlines a range of responsibilities that platforms must meet, including:

🔹 Senior Accountability – Companies must appoint a senior executive responsible for ensuring compliance.
🔹 Stronger Content Moderation – Platforms must properly fund and staff moderation teams.
🔹 Proactive Risk Assessments – By March 2025, all providers must assess the risks of illegal content on their platforms.
🔹 Better Reporting Mechanisms – Platforms must provide easily accessible tools for users to report illegal content.
🔹 Stricter Action Against Criminal Content – Companies must use automated detection tools, such as hash-matching, to remove child sexual abuse material, and must take down terrorist-related accounts (a brief sketch of hash-matching follows this list).
🔹 Increased User Controls – Larger platforms must provide tools for users to block or mute accounts and disable comments.
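
Hash-matching, one of the automated detection tools named in Ofcom's codes, works by comparing the hash of an uploaded file against a database of hashes of known illegal images, such as those maintained by the Internet Watch Foundation. The minimal Python sketch below illustrates the general idea only: the function names and placeholder hash value are hypothetical, and real deployments use perceptual hashes (for example, Microsoft's PhotoDNA) that still match after resizing or re-encoding, rather than the exact cryptographic hash shown here.

```python
import hashlib

# Hypothetical stand-in for a database of hashes of known illegal
# images (e.g., as supplied by the Internet Watch Foundation).
# The value below is an illustrative placeholder only.
KNOWN_ILLEGAL_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes) -> bool:
    """Flag an upload whose hash matches the known-illegal list.

    Note: an exact cryptographic match fails if even one byte of the
    file changes, which is why production systems use perceptual
    hashing instead of plain SHA-256.
    """
    return sha256_of(upload) in KNOWN_ILLEGAL_HASHES

# Screen a file at upload time, before it is published.
if __name__ == "__main__":
    sample = b"example image bytes"
    print("blocked" if should_block(sample) else "allowed")
```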

This represents a major shift from the previously unregulated online space, putting greater responsibility on platforms to make the internet safer.

The Government and Ofcom’s Perspective

The government has welcomed these rules as a “material step change in online safety”, emphasizing that tech companies must now take proactive action.

Technology Secretary Peter Kyle stated:

“This government is determined to build a safer online world… These laws mark a fundamental reset in society’s expectations of technology companies. I expect them to deliver and will be watching closely to make sure they do.”

Ofcom’s Chief Executive, Melanie Dawes, echoed this, stating:

“For too long, sites and apps have been unregulated, unaccountable, and unwilling to prioritise people’s safety over profits. That changes from today.”

Concerns: Do the Codes Go Far Enough?

Despite the progress, campaigners have expressed disappointment that the initial codes do not fully tackle harmful content.

🔸 Lack of Action on Self-Harm and Suicide Content – The Molly Rose Foundation criticized the measures for failing to address suicide-related content effectively, arguing that this delay puts lives at risk.

🔸 Concerns Over Encrypted Messaging – There is ongoing controversy over whether encrypted messaging services (such as WhatsApp and Signal) should be required to scan private messages for harmful content, which some argue could undermine privacy for all users.

🔸 Slow Implementation Timeline – Some legal experts point out that, 14 months after the Online Safety Act became law, these codes are only the first step, with further consultations not expected until 2025.

How This Links to the Online Safety Alliance’s Training and Awareness Work

At the Online Safety Alliance (OSA), we firmly believe that regulation alone is not enough—education is a critical component of online safety.

For several years, our training for staff and students in schools has been raising awareness of online risks, ensuring that young people and educators:

✅ Understand the signs of online grooming, exploitation, and harmful content
✅ Know how to report illegal or harmful material
✅ Feel empowered to navigate the online world safely

With the new Ofcom rules putting pressure on tech companies, the role of education is more important than ever—young people must know how to identify risks, protect themselves, and seek help when needed.

What Happens Next?

🔹 March 16, 2025 – The deadline for online platforms to complete risk assessments.
🔹 March 17, 2025 – Companies must begin implementing Ofcom’s safety measures.
🔹 Spring 2025 – Further consultations on child safety, self-harm content, and crisis response protocols.

As these changes take effect, all eyes will be on Ofcom’s enforcement and whether tech firms comply. Will these new rules truly make the internet safer, or are further reforms needed?

At the OSA, we will continue to provide guidance, training, and resources to help schools, educators, and young people stay informed and protected in an ever-evolving digital world.

Sources:

  • Ofcom publishes Illegal Harms Codes of Practice | Computer Weekly
  • Ofcom publishes first set of new online safety rules
  • Ofcom releases first codes of practice ahead of Online Safety Act | UKTN
  • Q&A: Ofcom, the Online Safety Act, and codes of practice for social media | The Standard
  • Social media platforms have work to do to comply with Online Safety Act, says Ofcom | The Guardian