The online safety regime is now in force. Ofcom is focused on making sure regulated services take action to drive improvements online.
This is our online safety industry bulletin, highlighting the key things services need to know and do and linking to key publications and guidance. We also explain how we are taking action to drive improvements for users online.
Online safety industry bulletin #1 – 21 January 2025
The UK’s online safety regime is now in force. Sites and apps must act to better protect users online, especially children. As the UK’s online safety regulator, we will provide support and resources to online services to help them comply with their duties and secure their users’ safety. This bulletin sets out the following:
- Priorities for 2025: We want to see meaningful change in online safety in 2025. We set out the areas where we expect rapid improvements, including stronger governance and accountability, and greater safety protections for people online – especially children.
- What services must do: We outline the steps that different sites and apps must take this year to drive the required changes in online safety.
- Support and resources for services: We provide information on our online safety regulations toolkit, which offers help with compliance, particularly for smaller organisations. Today we are launching our online guide on how to comply with the illegal content rules – a step-by-step, interactive walkthrough of what services need to do.
- What we will do to drive improvements: We explain the types of action we plan to take to drive better online safety outcomes, including requesting information, engaging with service providers, and – where there is a serious risk to users – taking formal enforcement action where it’s appropriate and proportionate to do so.
We’ll be regulating a broad range of sites and apps, from the biggest tech firms to small voluntary community forums that have not been regulated before. We recognise that service providers – particularly small and medium-sized businesses – will have lots of questions about how to comply and what Ofcom expects of them. That’s why we are committed to providing resources and support to help them navigate this new regulatory regime, including new online tools.
As part of our outreach programme, we are hosting a 3-day virtual conference, The Online Safety Act Explained. This event will explain the new online safety obligations – and the tools and resources available – in more detail. Once registered, you can drop in and out of the sessions you find useful. These will include a mix of short practical information sessions and deep dives into topics to support providers on their path to creating a safer life online.
Priorities for 2025

We published our first codes of practice and guidance on preventing illegal content in December 2024, and our industry guidance on highly effective age checks earlier in January. More codes and guidance will follow swiftly this year, including our protection of children codes and guidance in April 2025, which will require service providers to do more to protect children online.
Our full implementation roadmap is on our website. While we accept it will take time for the full benefits to users to be realised, we expect services to demonstrate immediate and significant progress. We are closely monitoring industry’s response to the new rules, with a particular focus on the sites and apps that pose the greatest risk to people in the UK, whether because of their size or the nature of their service. We expect specific improvements in 2025 in five areas:
- Stronger governance and accountability: Every service provider must effectively assess the risks posed by their service and act quickly to mitigate those risks, including when they make significant changes. We will ask an initial set of firms to disclose their illegal content risk assessments to us by 31 March 2025 so we can check they are suitable and sufficient. This will cover both large services and smaller services that pose particular risks. We will do the same for children’s risk assessments once we’ve published final guidance on this in April. Companies will also need to appoint a senior named individual responsible for safety in their organisation.
- Highly effective age checks to protect children from harm: Service providers must stop children encountering pornography and other content that is harmful to them (for example, suicide, self-harm, and eating disorder content). Highly effective age checks are an important first step in achieving this. Sites and apps that publish their own pornography need to act now to implement highly effective age checks, and all services that provide access to pornographic content must have age checks in place by July 2025. After we’ve finalised our protection of children codes, we’ll ask more companies about their use of age assurance methods, how they have assessed that they are highly effective for protecting children, and how their impact is being monitored.
- Tackling child sexual abuse material and preventing grooming: Sites and apps must take steps to stop the online proliferation of child sexual abuse material and prevent online child grooming. Service providers must mitigate these risks, including through using hash databases and our recommended access controls (where relevant). We will focus first on making sure higher-risk sites and apps are taking action quickly.
- Effective and properly resourced content moderation to tackle illegal material: Service providers must have a content moderation function able to take down illegal content quickly. We also expect services to make sure they have the right resources and training in place to operate effective content moderation. This will help prevent all kinds of online harm, including hate, terror, illegal suicide material, fraud, and non-consensual intimate image content. We’ll be assessing the effectiveness of a selection of platforms’ reporting and takedown processes. We also expect firms – where requested – to work with relevant bodies to develop a trusted flagger scheme to report and reduce online fraud.
- A big change in what children see and experience online: Sites and apps likely to be accessed by children will need to demonstrate tangible improvements in children’s online experiences, including stopping algorithms amplifying harmful content. Once we’ve finalised our protection of children codes of practice, we’ll be looking into how service providers test algorithms, including how they check that children are not being recommended harmful content like suicide and self-harm material.
What services must do

Sites and apps are legally required to take the following steps in 2025 under the UK’s Online Safety Act to drive the above changes. More information on whether an online service is in scope is available in our ‘regulation checker’.
Protecting all online users from illegal content
- Carry out the illegal content risk assessment now: User-to-user services and search engines that fall within scope must complete their illegal content risk assessment by 16 March 2025 to assess the risks to users on their service.
- Implement safety measures to mitigate illegal harms: From 17 March 2025, these firms must start implementing safety measures to mitigate the risks they identify, and our codes set out measures they can take. Some of these apply to all services, others only to larger and riskier services.
Protecting children from harm online
- Publishers of pornographic material must implement highly effective age checks: Since 17 January, sites that publish their own pornography have been required to use highly effective age assurance to prevent children from encountering pornography.
- Carry out the children’s access assessment: User-to-user services and search engines that fall within scope must assess whether their service is likely to be accessed by children, by 16 April 2025.
- Carry out the children’s risk assessment: Organisations that consider that their service is likely to be accessed by children are then required to complete a children’s risk assessment. These services will have until July 2025 – three months from when we publish our final guidance in late April 2025 – to carry out their first children’s risk assessment.
- Implement safety measures to mitigate risks to child safety: From July 2025, these organisations will need to put safety measures in place to protect children from the risks they’ve identified, and our final codes – to be published in April 2025 – will set out measures that can be taken.
Responding promptly and constructively to Ofcom’s requests
Throughout 2025, all in-scope sites and apps will need to cooperate and comply with any requirements that Ofcom places on them, to help us run the online safety regime effectively. This could include:
- Responding to information requests: We can request information from a wide range of organisations to help us fulfil our functions as online safety regulator. We can do so for various reasons, including to monitor compliance and to support our ongoing policy development. Recipients of these requests must respond fully, accurately, and by the deadlines we set. Our requests will be proportionate and have a clearly stated purpose, and we will aim to give reasonable notice to help with planning. We will usually issue information requests in draft form first. For reference, the table below sets out a forward look at some of our currently planned requests (although some timings may change). We will contact firms individually where these requests are relevant:
| Planned timings | Topic of information request | Service providers affected |
| --- | --- | --- |
| March | Information to support Ofcom’s assessments of services against the categorisation threshold conditions | Service providers that may meet one or more of the categorisation threshold conditions proposed by the Secretary of State on 16 December 2024 |
| March | Record of illegal harms risk assessment | Selected service providers to be notified by Ofcom |
| April-May | Information to support Ofcom’s consultation on fraudulent advertising measures | Signatories of the Online Fraud Charter and selected service providers to be notified by Ofcom |
| July | Record of children’s risk assessment | Selected service providers to be notified by Ofcom |
| July-August | Information to support Ofcom’s consultations on additional duties for categorised services (timing subject to the legislative process for the categorisation SI and publication of the register of categorised services) | Selected categorised service providers to be notified by Ofcom |
Please note that Ofcom may also use its information-gathering powers to make additional requests, such as in response to compliance concerns, or under section 101 of the OSA to support a coroner’s investigation into the death of a child.
- Responding to transparency notices: Our register of categorised services will confirm which services must comply with duties to produce transparency reports. If a service is on the register and we issue them with a transparency notice, they must comply and produce their report in line with the requirements we set. We will begin issuing draft transparency notices once the register of categorised services is published. These notices will tell services when they must produce their transparency reports.
- Setting up the online safety fees regime: Providers with a qualifying worldwide revenue (QWR) above the threshold that will be set by the DSIT Secretary of State will also be required to notify us of their QWR from the end of 2025, subject to the timings of the relevant secondary legislation. More information is available in our consultation on fees and penalties.
Please note that Video-Sharing Platforms should refer to our repeal page for information about their timings for complying with the Online Safety Act 2023.
Support and resources for services

We regulate a broad and diverse range of sites and apps under the online safety regime, varying in size, resources, and business model and including many small or medium-sized enterprises (SMEs). We recognise that navigating these requirements can be challenging, which is why we are developing tools and resources to help services understand their duties and what we expect. For example:
- Service providers can already check if the Online Safety Act applies to them by using our ‘regulation checker’ tool. This asks six questions to determine whether a service is in scope.
- We’ve developed our online safety regulations toolkit, which was co-designed and tested with around 50 businesses (mostly small and medium-sized). The toolkit is a set of tools designed to be accessible to all services, to help them understand what they need to do to comply. Today we are launching a tool to guide services through the steps needed to comply with the illegal content risk assessment duties, and the linked safety duties and record-keeping and review duties. More tools will be launched in the future.
- We keep our online resources for services updated as duties come into force. This includes more information on actions service providers must take to comply and when.
- Our supervision team can support service providers with queries that relate to their duties and what they should do to comply with the Act. Services can submit enquiries to us online.
- We host regular industry events and webinars to explain our approach to the wide range of service providers we regulate. We are hosting a 3-day virtual conference on 3-5 February 2025, The Online Safety Act Explained, to further outline the duties and deadlines and to guide services through our digital toolkit. Once registered, you can drop in to any session relevant to you. Ofcom also attends conferences, runs communication campaigns, and works with partners to raise awareness and ensure services understand their obligations.
We will also continue to coordinate with domestic and international regulators where appropriate, to support firms who need to comply with multiple frameworks in the UK and across borders.
What we will do to drive improvements

The duties set out in the Act are mandatory – service providers must comply to create a safer life online for people in the UK. Our teams will be monitoring compliance across the range of services in scope and won’t hesitate to take action where necessary to protect UK users from harm.
The Act places duties on a wide range of sites and apps – from the largest social media platforms to the smallest online community. The Act is clear that all in-scope service providers need to understand and comply with their duties. But we’ve been clear that, as a proportionate, risk-based regulator, our focus will be on the largest services with the highest reach and on those smaller services that present specific and high risks to UK users.
The steps we will take to drive compliance include:
- Gathering information: We will use our powers to gather information as described above and in our consultation on information powers. Our notice on Ofcom’s approach to accessing services outlines that we’ll also sometimes gather information by directly accessing services to understand and monitor user experience and consider the measures service providers have in place.
- Supervisory engagement: Our supervision team is engaging with a range of service providers to ensure they understand their duties and act quickly to protect users online. This includes some of the largest sites and apps, as well as smaller ones that present a high risk to users. We intend to communicate our high-level plans to individual services by April 2025. We will review services’ risk assessments carefully to ensure they have adequately addressed the biggest risks, with a particular focus on the areas identified above. We will work with services to ensure they comply with their duties, but will not hesitate to take formal action where they don’t meet our expected standards.
- Opening enforcement programmes: We will use cross-sector enforcement programmes to monitor and assess compliance on specific issues across a range of service providers. These could result in formal enforcement action. On 16 January 2025, we opened an enforcement programme into compliance with duties to protect children from pornographic content through the use of highly effective age assurance, focusing initially on those providers that display or publish pornographic content. We expect to open more programmes in 2025.
- Taking formal enforcement action: Where there is a risk of serious harm to users – especially children – we will take enforcement action against service providers that fail to comply with their duties. This may include issuing significant financial penalties, requiring specific changes, and – in exceptionally serious cases – applying to the courts to block sites in the UK. We will generally engage with service providers first to explain our concerns and give them an opportunity to remedy those concerns before moving to formal action. Our Online Safety Enforcement Guidance sets out more information.
The spotlight is now on online services, which must take action to make sure users in the UK have a safer life online.