Enforcing the Online Safety Act: Scrutinising illegal harms risk assessments

Published: 3 March 2025
  • Ofcom launches enforcement programme to assess industry compliance with first set of duties under UK online safety laws
  • Certain large services as well as small but risky sites must submit illegal harms risk assessments to the regulator by 31 March
  • Ofcom warns platforms that any failure to provide a sufficient response, on time, could result in enforcement action

The providers of a number of online services – including large platforms and smaller sites that may present particular risks to users – have been required by Ofcom to submit their illegal harms risk assessments by 31 March, failing which they may face enforcement action.

Risk assessments are fundamental to keeping users safer online. In order to put in place appropriate safety measures to protect people, especially children, providers must first understand how harm could take place on their platforms, and how their user-base, features and other characteristics could increase those risks of harm.

The first set of duties on sites and apps in scope of the UK’s Online Safety Act came into force when Ofcom published its illegal harms codes of practice and guidance on 16 December 2024. From that point, providers had three months to carry out a ‘suitable and sufficient’ illegal content risk assessment, in line with Ofcom’s guidance.[1]

Specifically, they must determine how likely it is that users could encounter illegal content on their service, or, in the case of user-to-user services, how they could be used to commit or facilitate certain criminal offences. Providers must also make and keep a written record of their risk assessment, including details about how it was carried out and its findings.[2]

Ready to enforce against non-compliance

To assess and monitor industry compliance with these illegal content risk assessment duties under the Act, Ofcom has today launched an enforcement programme.

One of our first priorities is to scrutinise the compliance of sites and apps that may present particular risks of harm from illegal content due to their size or nature – for example because they have a large number of users in the UK, or because their users may risk encountering some of the most harmful forms of online content and conduct, including child sexual exploitation and abuse, terrorism, hate crimes, content encouraging or assisting suicide, and fraud.

Ofcom has therefore issued formal information requests to providers of a number of services today – including the largest social media platforms, as well as smaller but risky sites. These requests set a deadline of 31 March for providers to submit the records of their illegal harms risk assessments to us.

We will use the responses we receive to identify gaps in risk assessments and drive improvements. We will also use them to further our policy work to develop new measures for our codes of practice.  

Providers are required, by law, to respond to any statutory request for information by Ofcom in an accurate, complete and timely way. If any platform does not provide us with a satisfactory response by the deadline, we will not hesitate to open investigations into individual service providers.

We have strong enforcement powers at our disposal, including being able to issue fines of up to 10% of qualifying worldwide revenue or £18m – whichever is greater – or, in the most serious cases, to apply to a court to block a site in the UK.

Regardless of size or location, all services in scope of the Online Safety Act must carry out a proper illegal harms risk assessment – a vital first step in protecting their users and making their platforms safer by design.

We’ve identified a number of online services that may present particular risks of harm to UK users from illegal content – including large platforms as well as smaller sites – and are requiring them to provide their illegal harms risk assessment to us this month. We’re ready to take swift action against any provider who fails to comply.

- Suzanne Cater, Enforcement Director at Ofcom

In January, we also opened an enforcement programme into age assurance measures that providers of pornographic content are implementing, and we will be launching further action in other areas in the coming weeks and months.

END

Notes to editors:

    1. The Online Safety Act lists over 130 ‘priority offences’, and tech firms must assess and mitigate the risk of these occurring on their platforms. The priority offences can be split into 17 categories:
      • Terrorism
      • Harassment, stalking, threats and abuse offences
      • Coercive and controlling behaviour
      • Hate offences
      • Intimate image abuse
      • Extreme pornography
      • Child sexual exploitation and abuse
      • Sexual exploitation of adults
      • Unlawful immigration
      • Human trafficking
      • Fraud and financial offences
      • Proceeds of crime
      • Assisting or encouraging suicide
      • Drugs and psychoactive substances
      • Weapons offences (knives, firearms, and other weapons)
      • Foreign interference
      • Animal welfare
    2. Ofcom has published Record-Keeping and Review Guidance to assist providers in meeting their record-keeping and review duties.