Enforcing the Online Safety Act: Platforms must start tackling illegal material from today

Published: 17 March 2025
  • As deadline passes to carry out illegal harms risk assessments, sites and apps must now start tackling criminal content
  • Ofcom launches enforcement programme into child sexual abuse imagery on file-sharing services

From today, online platforms must start putting in place measures to protect people in the UK from criminal activity, while Ofcom has launched its latest enforcement programme to assess industry compliance. 

Providers of services in scope of the UK’s Online Safety Act had until yesterday (16 March) to carry out a suitable and sufficient illegal harms risk assessment – to understand how likely it is that users could encounter illegal content on their service, or, in the case of ‘user-to-user’ services, how they could be used to commit or facilitate certain criminal offences.[1]

From today, the next set of illegal harms duties come into force. That means platforms now have to start implementing appropriate measures to remove illegal material quickly when they become aware of it, and to reduce the risk of ‘priority’ criminal content appearing in the first place.[2]

In the coming weeks and months, we will be assessing platforms’ compliance with their new illegal harms obligations under the Act, and launching targeted enforcement action where we uncover concerns.[3]

Given the acute harm caused by the spread of online child sexual abuse material (CSAM), assessing providers’ compliance with their safety duties in this area has been identified as one of our early priorities for enforcement.

Tackling CSAM on file-sharing services

Our evidence shows that file-sharing and file-storage services are particularly susceptible to being used for the sharing of image-based CSAM.

Among the 40 safety measures set out in our illegal harms codes of practice, we recommend, for example, that certain services – including all file-sharing services at high risk of hosting CSAM, regardless of size – use automated moderation technology, including ‘perceptual hash-matching’, to assess whether content is CSAM and, if so, to swiftly take it down.[4]

Today, we have launched an enforcement programme to assess the safety measures being taken, or that will soon be taken, by file-sharing and file-storage providers to prevent offenders from disseminating CSAM on their services.

We have written to a number of these services to put them on notice that we will shortly be sending them formal information requests regarding the measures they have in place, or will soon have in place, to tackle CSAM, and requiring them to submit their illegal harms risk assessments to us.

If any platform does not engage with us or come into compliance, we will not hesitate to open investigations into individual services. We have strong enforcement powers at our disposal, including being able to issue fines of up to 10% of turnover or £18m – whichever is greater – or to apply to a court to block a site in the UK in the most serious cases.

Working with child protection experts

Our preliminary supervision activity has involved working closely with law enforcement agencies and other organisations – including the Internet Watch Foundation (IWF), the Canadian Centre for Child Protection (C3P) and the National Centre for Missing and Exploited Children (NCMEC) – to identify file-sharing and file-storage services at highest risk of hosting image-based CSAM.

In recent months, we have been engaging with the largest file-sharing and file-storage services about their obligations under the Act. Additionally, our taskforce dedicated to driving compliance among small but risky services has identified and engaged with providers of smaller file-sharing and file-storage services to assess whether they are already taking appropriate measures.

Other ongoing enforcement activity

Today’s CSAM enforcement programme represents the third opened by Ofcom as online safety regulator since the start of this year. In January, we opened an enforcement programme into age assurance measures in the adult sector.

Two weeks ago, we issued formal information requests to providers of a number of services setting them a deadline of 31 March by which to submit their illegal harms risk assessments to us.

We expect to make additional announcements on formal enforcement action over the coming weeks.

Child sexual abuse is utterly sickening and file storage and sharing services are too often used to share this horrific material. Ofcom’s first priority is to make sure that sites and apps take the necessary steps to stop it being hosted or shared.

Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that. But, make no mistake, any provider who fails to introduce the necessary protections can expect to face the full force of our enforcement action.

- Suzanne Cater, Enforcement Director at Ofcom

It is so important that the fight against child sexual abuse is being prioritised and we are greatly encouraged to see the determination to ensure the abuse we are seeing today does not continue to spiral.

We stand ready to work alongside Ofcom as it enforces the Online Safety Act, and to help companies to do everything they can to comply with the new duties. We have been at the forefront of the fight against online child sexual abuse for nearly three decades, and our tools, tech, and data are cutting edge.

The Online Safety Act has the potential to be transformational in protecting children from online exploitation. Now is the time for online platforms to join the fight and make sure they are doing everything they can to stop the spread of this dangerous and devastating material.

- Derek Ray-Hill, Interim CEO at the Internet Watch Foundation

END

Notes to editors:

1. Services in scope of the Online Safety Act include search engines and ‘user-to-user’ services with a significant number of UK users, or targeting the UK market. User-to-user services are where people may encounter content – including images, videos, messages or comments – that has been generated, uploaded or shared by other users. For example, social media or video-sharing platforms, messaging, gaming and dating apps, forums and file-sharing sites.

2. The Online Safety Act lists over 130 ‘priority offences’, and tech firms must assess and mitigate the risk of these occurring on their platforms. The priority offences can be split into 17 categories:

  • Terrorism
  • Harassment, stalking, threats and abuse offences
  • Coercive and controlling behaviour
  • Hate offences
  • Intimate image abuse
  • Extreme pornography
  • Child sexual exploitation and abuse
  • Sexual exploitation of adults
  • Unlawful immigration
  • Human trafficking
  • Fraud and financial offences
  • Proceeds of crime
  • Assisting or encouraging suicide
  • Drugs and psychoactive substances
  • Weapons offences (knives, firearms, and other weapons)
  • Foreign interference
  • Animal welfare

3. We are initially prioritising the compliance of sites and apps that may present particular risks of harm from illegal content due to their size or nature – for example because they have a large number of users in the UK, or because their users may risk encountering some of the most harmful forms of online content and conduct. When we previously set out our approach to implementing the Online Safety Act, we identified eight targets for immediate action, based on where harm is greatest and where we know there are clear steps services can take:

  • Services prioritise user safety, complete effective risk assessments, tackle risks to users, and name senior accountable people.
  • Children are protected from harmful content and activity, including suicide and self-harm material and pornography.
  • Offenders can’t share child sexual abuse content and children don’t face unsafe contact.
  • Illegal content is taken down quickly.
  • Women and girls face less gendered harm and abuse online.
  • Online fraud is reduced.
  • Users – especially children – are empowered to have control over their online experience.
  • More transparency in how platforms keep users safe, and what Ofcom is doing to deliver positive change.

4. Hash-matching is a process that detects content previously identified as illegal or violative. ‘Hashing’ is an umbrella term for techniques used to create fingerprints of content, which can then be stored in a database. Providers generate hashes of the content on their service and compare them against the hashes in the database to test whether any uploaded content is a ‘match’ for previously identified material. There are several types of hash-matching. Ofcom’s codes of practice recommend the use of ‘perceptual’ over ‘cryptographic’ hash-matching, to allow more harmful content to be identified and potentially moderated. Perceptual hash-matching aims to identify images that are similar to known CSAM, whereas cryptographic hash-matching identifies only identical images. In practice, perceptual hash-matching is therefore likely to detect more CSAM than other forms of hash-matching.
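As a rough illustration of that distinction (a toy sketch only, not a description of the hash-matching technology Ofcom recommends or that any provider uses), the Python snippet below compares a cryptographic hash, which only matches byte-for-byte identical files, with a simple ‘average hash’, one basic form of perceptual hashing that still matches a slightly altered copy of the same image. The images, pixel values and function names are invented for illustration.

```python
# Toy comparison of cryptographic vs perceptual hash-matching (illustration only;
# not the hash-matching technology recommended in Ofcom's codes of practice).
import hashlib


def cryptographic_hash(image_bytes: bytes) -> str:
    # Exact-match fingerprint: changing a single byte gives a completely
    # different hash, so only identical files match.
    return hashlib.sha256(image_bytes).hexdigest()


def average_hash(pixels):
    # Minimal 'average hash', one simple form of perceptual hashing: each bit
    # records whether a pixel is brighter than the image's mean brightness,
    # so visually similar images produce similar bit patterns.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    # Number of differing bits between two hashes; a small distance means
    # the images are near-duplicates.
    return bin(a ^ b).count("1")


# Two invented 4x4 greyscale 'images': the second is a slightly brightened copy
# of the first, as might result from re-encoding or minor editing.
original = [[10, 200, 30, 220], [15, 210, 25, 215],
            [12, 205, 28, 218], [11, 202, 27, 219]]
altered = [[12, 202, 32, 222], [17, 212, 27, 217],
           [14, 207, 30, 220], [13, 204, 29, 221]]

# The cryptographic hashes of the raw bytes no longer match...
print(cryptographic_hash(bytes(sum(original, []))) ==
      cryptographic_hash(bytes(sum(altered, []))))          # False

# ...but the perceptual hashes remain close, so the altered copy is still detected.
print(hamming_distance(average_hash(original), average_hash(altered)))  # 0
```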
