Under the Online Safety Act, providers of regulated user-to-user and search services have new duties to keep people safe from illegal harm. Subject to the Codes of Practice completing Parliamentary process, from 17 March 2025, providers will need to take the safety measures set out in the Codes of Practice or use other effective measures to protect users from illegal content and activity.
We've recently published our illegal content Codes of Practice for user-to-user services (PDF, 900.5 KB) and illegal content Codes of Practice for search services (PDF, 693.99 KB). This page gives a quick introduction to the measures we have recommended in our first codes.
Put measures in place to protect users from online harm
The illegal content safety duties, and those relating to reporting and complaints, focus on keeping people safe online. It’s about making sure you have the right measures in place to protect people from harm that could take place on your service.
If you are the provider of a user-to-user service, it means you will need to:
- take proportionate steps to prevent your users encountering illegal content
- mitigate and manage the risk of offences taking place through your service
- mitigate and manage the risks identified in your illegal content risk assessment
- swiftly remove illegal content when you become aware of it, and minimise the time it is present on your service
- explain how you’ll do this in your terms of service
- allow people to easily report illegal content and operate a complaints procedure
If you are the provider of a search service, it means you'll need to:
- take proportionate steps to minimise the risk of your users encountering illegal content via search results
- mitigate and manage the risks identified in your illegal content risk assessment
- explain how you’ll do this in a publicly available statement
- allow people to easily report illegal content and operate a complaints procedure
You can decide for yourself how to meet the specific legal duties: you can apply the measures set out in Ofcom’s Codes of Practice that apply to your service, or you can take alternative measures. If you take alternative measures to the ones we recommend, you must also keep a record of what you have done and how you consider it fulfils the relevant duties.
Our Codes of Practice set out a range of measures in areas including content moderation, complaints, user access, design features to support and protect users, and the governance and management of online safety risks.
Some measures are targeted at addressing the risk of certain kinds of illegal harms. For example, our Codes of Practice include measures to tackle online grooming. These will mean that, by default, children’s profiles and locations – as well as their friends and connections – will not be visible to other users, and non-connected accounts cannot send them direct messages. Children should also receive information to help them make informed decisions around the risks of sharing personal information, and they should not appear in lists of people users might wish to add to their network. This will make it harder for perpetrators of grooming activity to identify and contact vulnerable children.
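The default protections for children described above could be represented, purely as an illustration, as a set of default account settings. These setting names are hypothetical and are not taken from the Codes of Practice.

```python
# Purely illustrative sketch: these setting names are hypothetical, not from
# the Codes of Practice. Each default reflects a protection described above.
CHILD_ACCOUNT_DEFAULTS = {
    # Children's profiles and locations are not visible to other users
    "profile_visible_to_other_users": False,
    "location_visible_to_other_users": False,
    # Friends and connections are not visible to other users
    "connections_visible_to_other_users": False,
    # Non-connected accounts cannot send direct messages
    "accept_dms_from_non_connections": False,
    # Children do not appear in "people you may know" style suggestion lists
    "appear_in_connection_suggestions": False,
    # Children receive information about the risks of sharing personal data
    "show_info_on_sharing_personal_data": True,
}
```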
Other measures help to address a variety of illegal harms such as child sexual abuse material (CSAM) and fraud. Our Codes of Practice set an expectation that high-risk providers use an automated tool called hash matching to detect CSAM, and under the Codes of Practice providers will establish a dedicated reporting channel for organisations with fraud expertise.
What you can do now
Get more familiar with online harms and what makes them more likely by reading our Register of Risks (PDF, 4.65 MB).
Your safety measures will depend on your service
The Act is clear that the safety measures that providers put in place should be proportionate. Different measures in the Codes of Practice would apply to different services based on factors such as:
- the type of service you provide (user-to-user or search);
- the features and functionalities of your service;
- the number of users your service has; and
- the results of your illegal content risk assessment.
Some measures will apply to all services. For example, these include naming an individual accountable for online safety compliance and ensuring your terms of service (or publicly available statements) are clear and accessible.
What you can do now
Take a look at the summary of our decisions (PDF, 306 KB) to find out about safety measures and who they apply to.
Some measures apply to large services
Certain measures may apply to services of differing sizes or risk levels, such as the measures to apply specific automated tools to detect and remove child sexual abuse material from user-to-user services and to ensure that users do not encounter it in or via search services.
In our Codes of Practice, we have defined a large service as a service which has an average user base of 7 million or more per month in the UK. This is equivalent to approximately 10% of the UK population. To be considered a user of a user-to-user service for a month, a person doesn’t need to post anything. Just viewing content on a user-to-user service is enough to count as using that service.
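The large-service threshold above can be sketched as a simple check. This is an illustrative sketch only: the 7 million figure comes from the Codes of Practice, but the function and data shape here are hypothetical.

```python
# Illustrative sketch only: the 7 million threshold is from the Codes of
# Practice; this function and its inputs are hypothetical examples.
UK_LARGE_SERVICE_THRESHOLD = 7_000_000  # roughly 10% of the UK population


def is_large_service(monthly_uk_users: list[int]) -> bool:
    """Return True if the average monthly UK user base meets the threshold.

    A 'user' of a user-to-user service includes anyone who views content
    in a month, not only those who post.
    """
    if not monthly_uk_users:
        return False
    average = sum(monthly_uk_users) / len(monthly_uk_users)
    return average >= UK_LARGE_SERVICE_THRESHOLD


# Example: hypothetical monthly UK user counts over six months
counts = [6_800_000, 7_100_000, 7_400_000, 7_200_000, 6_900_000, 7_300_000]
print(is_large_service(counts))  # → True (average is about 7.12 million)
```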
These services may need to put in place more measures, such as providing training for staff working in content moderation.
This is because, generally, when providers of large services put these measures in place, they deliver the greatest benefit to the largest number of users – so it’s proportionate to ask them to do more.
What you can do now
Assess whether your service has more than 7 million monthly users on average.
Other measures apply to services that are medium or high risk
When you complete your illegal content risk assessment, we’ll ask you to assess if your service has a negligible, low, medium, or high risk of each kind of illegal content. This rating needs to be as accurate as possible, and we’ve provided guidance on how to assess it.
Once you’ve assessed each risk, some measures apply based on the specific risk of the service:
- If your service is low or negligible risk for all kinds of illegal harm, we propose to call it a ‘low risk service’ and the minimum number of measures will apply.
- If your service is medium or high risk for one kind of illegal harm, we call it a ‘single-risk’ service, and more measures may apply. If your service is medium or high risk for two or more kinds of illegal harm, we define it as a ‘multi-risk’ service, and further measures may apply.
- Some safety measures are focused on specific kinds of illegal harm (like child sexual exploitation and abuse and terrorism offences). These would only apply to services that are medium or high risk for those harms.
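The risk tiers above follow a simple rule: count how many kinds of illegal harm are rated medium or high. A minimal sketch, assuming per-harm ratings are recorded as strings (the harm names and function are hypothetical):

```python
# Illustrative sketch only: the tier names come from the text above, but
# this function and the example harm names are hypothetical.
def classify_service(risk_ratings: dict[str, str]) -> str:
    """Classify a service from its per-harm illegal content risk ratings.

    risk_ratings maps each kind of illegal harm to one of
    'negligible', 'low', 'medium', or 'high'.
    """
    flagged = [harm for harm, level in risk_ratings.items()
               if level in ("medium", "high")]
    if not flagged:
        return "low risk service"     # negligible/low for all harms
    if len(flagged) == 1:
        return "single-risk service"  # medium/high for one kind of harm
    return "multi-risk service"       # medium/high for two or more kinds


print(classify_service({"terrorism": "low", "fraud": "negligible"}))
# → low risk service
print(classify_service({"terrorism": "low", "fraud": "high"}))
# → single-risk service
print(classify_service({"terrorism": "medium", "fraud": "high"}))
# → multi-risk service
```

Harm-specific measures would then attach only where a particular harm (such as terrorism) is itself rated medium or high, independently of the overall tier.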
What you can do now
Read our quick guide to online safety risk assessments, which introduces our proposed guidance.
Subscribe for updates on online safety
Subscribe for updates on any changes to the regulations and what you need to do. You'll also be the first to know about our new publications and research.