Under the Online Safety Act, services that are likely to be accessed by children have new duties to protect children online. One way they can comply is to adopt the safety measures in Ofcom’s codes of practice.
We’re currently consulting on our draft Children’s Safety Codes. This page gives a quick introduction to the safety measures we have proposed.
Before reading this quick guide to our draft Children’s Safety Codes, it would be helpful to read our:
We are consulting on our proposals, so this information could change
This page:
- Summarises proposals we are consulting on now – we will update this information when documents have been finalised;
- Is only meant to introduce you to the children’s safety duties – our draft Children’s Online Safety Codes set out our proposed safety measures in full.
What you can do now
We have created an easy tool to check if the Online Safety Act is likely to apply to your business. You can also register for email updates and we'll send you the latest information about how we regulate, including any important changes to what you need to do. You'll also be the first to know about our new publications, research and webinars on online safety.
All services need to manage risks and protect children from encountering harmful content
Services that are likely to be accessed by children are required to carry out a children’s risk assessment and use proportionate safety measures to keep children safe online. This means that all such services will need to effectively mitigate and manage the risk of harm to children in different age groups (as identified in a children’s risk assessment), as well as mitigate the impact of harm to children from harmful content.
Further, if you have a user-to-user service that is likely to be accessed by children, you will need to take proportionate measures to effectively:
- Prevent children of any age from encountering pornography and suicide, self-harm and eating disorder content (Primary Priority Content); and
- Protect children in age groups judged to be at risk of harm from encountering other harmful content, including but not limited to bullying content and content which depicts serious violence or challenges and stunts (Priority Content).
If you have a search service that is likely to be accessed by children, you will need to take proportionate measures to effectively:
- Minimise the risk of children of any age encountering the search content most harmful to children, namely pornography and suicide, self-harm and eating disorder content (Primary Priority Content); and
- Minimise the risk of children in age groups judged to be at risk of harm encountering other harmful content, including but not limited to bullying content and content which depicts serious violence or challenges and stunts (Priority Content).
Our draft Children’s Safety Codes provide a set of safety measures that you could use to meet your duties under the Act. There is no single fix-all measure that services can take to protect children online. Safety measures need to work together to help create an overall safer experience for children.
You can decide to adopt alternative measures to those recommended in the Codes. However, you will need to be able to demonstrate that they offer the appropriate level of safety for children in meeting the relevant duties, and keep a record of the alternative measures you use.
What you can do now
Get familiar with the risk factors that increase the likelihood of children encountering different kinds of content harmful to children by reading our draft Children’s risk profiles within our Children’s Risk Assessment Guidance.
Our Children’s Safety Codes set out a range of proposed measures that apply to different services
We have proposed a set of safety measures that we consider will operate collectively to achieve safer experiences for children online. These cover three broad areas:
- robust governance and accountability – ensuring service providers have appropriate senior oversight and accountability for children’s safety online;
- safer platform design choices – making sure services understand their users’ age and keep children safe, including ensuring recommender systems and content moderation operate effectively to prevent harm to children, alongside highly effective age assurance; and
- providing children with information, tools, and support – ensuring service providers give clear and accessible information to children and carers, making sure reporting and complaints functions are easy to use, and giving children tools and support to help them stay safe online.
The Act is clear that the safety measures you need to put in place should be proportionate. We have not taken a one-size-fits-all approach so each proposed measure is recommended for specified services based on relevant criteria, which include:
- The type of service you provide (user-to-user or search);
- The outcome of your latest risk assessment, and the risk of harm posed to children;
- The size of your service (UK user base); and
- The functionalities and other characteristics of your service.
Some measures apply to all services
These include having someone accountable for compliance with the children’s safety duties, making sure reporting and complaints functions are easy to use, having content moderation systems and processes in place, and ensuring your terms of service (or publicly available statements) are clear and accessible.
Additional measures apply to services that pose significant risks to children
When you complete your children’s risk assessment, we’ll ask you to decide whether you’re low, medium, or high risk for each kind of content harmful to children. This rating needs to be as accurate as possible, and we’ve provided draft guidance on how to do it.
If your service poses a high or medium risk to children, there are additional measures that may apply to you:
- Some measures target risks related to specific kinds of harmful content and related functionalities. These measures are usually recommended if you are medium or high risk for at least one relevant kind of content harmful to children. Examples include enabling children to block other users or disable comments, which can mitigate the risk of specific harms such as bullying.
- Some age assurance measures are also based on the principal purpose of your service. This refers to the main activity or objective of your service and whether this is to host or disseminate content that is harmful to children.
- Other measures – including governance and content moderation measures – target risks related to any kind of content harmful to children. These are recommended if you are medium or high risk for two or more kinds of content harmful to children (a ‘multi-risk’ service).
Further measures apply to large services irrespective of their risk
Certain measures would apply to large services only, such as having an internal monitoring and assurance function. Some of the measures mentioned above for ‘multi-risk’ services are also recommended for all large services, even those that are not ‘multi-risk’. This is because large services tend to be more complex and therefore need to take additional steps to manage risk. And as these services have a larger reach, any risks that do arise can affect many children.
In our draft Codes, we have proposed to define a service as ‘large’ where the number of monthly UK users exceeds 7 million, approximately equivalent to 10% of the UK population.
What you can do now
If you don’t know it already, calculate the number of monthly UK users for your service.
If you have views on our proposals, please share them with us
You can read our draft Children’s online safety codes in full and respond to our consultation (ODT, 107.9 KB). If you have views on our proposals, we’d love to hear from you.