Guide for services: complying with the Online Safety Act

Published: 27 February 2024
Last updated: 16 December 2024

The Online Safety Act makes businesses, and anyone else who operates one of a wide range of online services, legally responsible for keeping people (especially children) in the UK safe online.

All in-scope services with a significant number of UK users, or that target the UK market, are covered by the new rules, regardless of where they are based.

The rules apply to services that are made available over the internet (or ‘online services’). This might be a website, app or another type of platform. If you or your business provides an online service, then the rules might apply to you.

Specifically, the rules cover services where:

  • people may encounter content (like images, videos, messages or comments) that has been generated, uploaded or shared by other users. Among other things, this includes private messaging, and services that allow users to upload, generate or share pornographic content. The Act calls these ‘user-to-user services’;
  • people can search other websites or databases (‘search services’); or
  • you or your business publishes or displays pornographic content.

To give a few examples, a ‘user-to-user’ service could be:

  • a social media site or app;
  • a photo- or video-sharing service;
  • a chat or instant messaging service, like a dating app; or
  • an online or mobile gaming service.

The rules apply to organisations big and small, from large and well-resourced companies to very small ‘micro-businesses’. They also apply to individuals who run an online service.

It doesn’t matter where you or your business is based. The new rules will apply to you (or your business) if the service you provide has a significant number of users in the UK, or if the UK is a target market.

Check if the Online Safety Act applies to you

Use our tool to find out if the rules are likely to apply to you, and what to do next.

Start now

Check how to comply with the illegal content rules

If the Online Safety Act applies to you, you will need to complete an illegal content risk assessment by 16 March 2025.

You can use our quick guides to check how to comply.

Comply with the protection of children rules

Services likely to be accessed by children will be required to carry out children’s risk assessments from Spring 2025.

Once the new rules are in force, you'll be able to use our tool to help you complete your children’s risk assessment and comply with safety obligations.

Comply with rules about online pornography

If you or your business has an online service that hosts pornographic content, there are rules you will need to follow to prevent children from accessing it.

Subscribe for updates about online safety

Subscribe for updates on any changes to the regulations and what you need to do.

Other important things you should know

If the rules apply to your service, then we will expect you to make sure that the steps you take to keep people in the UK safe are good enough.

While the precise duties vary from service to service, most businesses will need to:

  • assess the risk of harm from illegal content;
  • assess the particular risk of harm to children from harmful content (if children are likely to use your service);
  • take effective steps to manage and mitigate the risks you identify – our illegal content codes of practice set out measures you can follow to address the risks identified in your illegal content risk assessment;
  • in your terms of service, clearly explain how you will protect users;
  • make it easy for your users to report illegal content, and content harmful to children;
  • make it easy for your users to complain, including when they think their post has been unfairly removed or their account unfairly blocked; and
  • consider the importance of protecting freedom of expression and the right to privacy when implementing safety measures.

It's up to you to assess the risks on your service, then decide which safety measures you need to take.

To help you, we have published a range of resources – including information about risk and harm online, guidance and codes of practice.

Some businesses will have other duties to meet, so that:

  • people have more choice and control over what they see online; and
  • companies are more transparent and can be held to account for their activities.

We have published codes of practice and guidance outlining the steps that companies can take to comply with these additional duties.

Ofcom is the regulator for online safety. We have a range of powers and duties to implement the new rules and ensure people are better protected online. We have published an updated version of our overall approach and the outcomes we want to achieve.

The Act expects us to help services follow the rules – including by providing guidance and codes of practice. These will help you understand how harm can take place online, what factors increase the risks, how you should assess these risks, and what measures you should take in response.

We want to work with you to keep adults and children safe. We’ll provide guidance and resources to help you meet your new duties. These will include particular support for small to medium-sized enterprises (SMEs).

But we will take enforcement action if we determine that a business is not meeting its duties – for example, if it isn’t doing enough to protect users from harm.

We have a range of enforcement powers to use in different situations: we will always use them in a proportionate, evidence-based and targeted way. We can direct businesses to take specific steps to come into compliance. We can also fine companies up to £18m, or 10% of their qualifying worldwide revenue (whichever is greater).

In the most severe cases, we can seek a court order imposing “business disruption measures”. This could mean asking a payment or advertising provider to withdraw its services from the business, or asking an internet service provider to limit access to its service.

We can also use our enforcement powers if you fail to respond to a request for information.

You can find more information about our enforcement powers, and how we plan to use them, in our draft enforcement guidance.

The new rules cover any kind of illegal content that can appear online, but the Act includes a list of specific offences that you should consider. These are:

  1. terrorism
  2. child sexual exploitation and abuse (CSEA) offences, including
    1. grooming
    2. image-based child sexual abuse material (CSAM)
    3. CSAM URLs
  3. hate
  4. harassment, stalking, threats and abuse
  5. controlling or coercive behaviour
  6. intimate image abuse
  7. extreme pornography
  8. sexual exploitation of adults
  9. human trafficking
  10. unlawful immigration
  11. fraud and financial offences
  12. proceeds of crime
  13. drugs and psychoactive substances
  14. firearms, knives and other weapons
  15. encouraging or assisting suicide
  16. foreign interference
  17. animal cruelty

Our Register of Risks (PDF, 4.65MB), organised by each kind of offence, looks at the causes and impact of illegal harm online.

All user-to-user services and search services will need to:

  • carry out an illegal content risk assessment – we have published guidance to help you do this;
  • meet your safety duties on illegal content – this includes removing illegal content, taking proportionate steps to prevent your users from encountering it, and managing the risks identified in your risk assessment – our codes of practice will help you do this;
  • record in writing how you are meeting these duties – we have published guidance to help you do this;
  • explain your approach in your terms of service (or publicly available statement); and
  • allow your users to report illegal harm and submit complaints.

One way to protect the public and meet your safety duties is to adopt the safety measures we set out in our codes of practice. The codes cover a range of measures in areas like content moderation, complaints, user access, design features to support users, and the governance and management of online safety risks.

In our codes of practice, we have carefully considered which services each measure should apply to, with some measures only applying to large and/or risky services. The measures in our codes are only recommendations, so you can choose alternatives. But if you do adopt all the recommended measures that are relevant to you, then you will be meeting your safety duties.

When implementing safety measures and policies – including on illegal harm and the protection of children – you will need to consider the importance of protecting users’ privacy and freedom of expression.

Ofcom will consider any risks to these rights when preparing our codes of practice and other guidance, and include appropriate safeguards.
