
Countdown to a safer life online

Published: 17 October 2024
  • Ofcom provides progress update on implementing the UK’s Online Safety Act
  • From December, tech firms must start to take action

Two months out from online safety laws coming into force, Ofcom has warned tech firms they could face enforcement action if they don’t comply with new duties when the time comes. 

Today, Ofcom has provided an update on its progress in implementing the Online Safety Act and set out what to expect over the next year.

Implementing the Act

The Online Safety Act was passed in October 2023. When fully in force, it will place new legal duties on platforms available in the UK. Before we can enforce these duties, we are required to consult publicly on codes of practice and guidance.

In the space of six months, we consulted on our codes and guidance for illegal harms, pornography age verification and children’s safety, and submitted our advice to Government on the thresholds that would determine which services will be ‘categorised’ and subject to additional duties.[1]

We have also been speaking to many tech firms – including some of the largest platforms as well as smaller ones – about what they do now and what they will need to do next year.

Change is already happening…

Ofcom has already secured better protections on UK-based video-sharing platforms: OnlyFans and other adult sites have introduced age verification; BitChute has improved its content moderation and user reporting; and Twitch has introduced measures to stop children seeing harmful videos.

Meta and Snapchat have made changes that we proposed in our illegal harms consultation to protect children from grooming. These include changes on Instagram, Facebook and Snapchat to help prevent children being contacted by strangers, and Instagram’s ‘Teen Accounts’, which limit who can contact teens and what they can see.

These are positive steps, but many platforms will have to do far more when the Online Safety Act comes into force.

…but more to do next year

Parliament set us a deadline of April 2025 to finalise our codes and guidance on illegal harms and children’s safety. We will finalise our illegal harms codes and guidance ahead of this deadline. Our expected timings for key milestones over the next year – which could change – are as follows:[2]

  • December 2024: Ofcom will publish its first edition illegal harms codes and guidance. Platforms will have three months to complete their illegal harms risk assessments.
  • January 2025: Ofcom will finalise children’s access assessment guidance and guidance for pornography providers on age assurance. Platforms will have three months to assess whether their service is likely to be accessed by children.
  • February 2025: Ofcom will consult on best practice guidance on protecting women and girls online, earlier than previously planned.
  • March 2025: Platforms must complete their illegal harms risk assessments, and implement appropriate safety measures.
  • April 2025: Platforms must complete children’s access assessments. Ofcom will finalise children’s safety codes and guidance. Companies will have three months to complete their children’s risk assessments.
  • Spring 2025: Ofcom will consult on additional measures for second edition codes and guidance.
  • July 2025: Platforms must complete children’s risk assessments, and make sure they implement appropriate safety measures.

We will review selected risk assessments to ensure they are suitable and sufficient, in line with our guidance, and seek improvements where we believe firms have not adequately mitigated the risks they face.

Ready to take enforcement action

Ofcom has the power to take enforcement action against platforms that fail to comply with their new duties, including imposing significant fines where appropriate. In the most serious cases, Ofcom will be able to seek a court order to block access to a service in the UK, or limit its access to payment providers or advertisers.

We are prepared to take strong action if tech firms fail to put in place the measures that will be most impactful in protecting users, especially children, from serious harms such as those relating to child sexual abuse, pornography and fraud.

“The time for talk is over. From December, tech firms will be legally required to start taking action, meaning 2025 will be a pivotal year in creating a safer life online.

“We’ve already engaged constructively with some platforms and seen positive changes ahead of time, but our expectations are going to be high, and we’ll be coming down hard on those who fall short.”

- Dame Melanie Dawes, Ofcom’s Chief Executive

Notes to editors

  1. Categorised services will have extra obligations under the Act, such as giving users more tools to control what they see, ensuring protections for news publisher and journalistic content, preventing fraudulent advertising, producing transparency reports, and consistently applying their terms of service. At the time of writing, we await Government confirmation of the thresholds for categorisation. Once these are confirmed, we will prepare the register of categorised services. We expect the first transparency reports to be published by services in 2025 and other duties to come into force in 2026.
  2. The expected timetable for implementing the Online Safety Act is set out in the key milestones listed above.
