The Online Safety Bill recently became the Online Safety Act, meaning it is now law.
As a result, Ofcom is now formally the regulator for online safety, with a responsibility to help make online services safer for the people who use them.
Services that fall under our remit will have to follow certain rules, including protecting users from illegal content and activity online, as well as protecting children from harmful content. Examples of illegal content include child sexual abuse material, terrorism, fraud, the sale of illegal drugs or weapons, and content encouraging self-harm or suicide.
These services include:
- services that host user-generated content (such as social media);
- search engines;
- services that host pornographic content;
- messaging services; and
- UK-established video-sharing platforms (VSPs).
The rules differ for each type of service. Some services will also have additional duties to meet, so that:
- people have more choice and control over what they see online;
- services are more transparent and can be held to account for their actions; and
- services protect freedom of expression.
What is Ofcom’s role in online safety?
Put simply, our role is to make sure regulated services take appropriate steps to protect their users.
We’re not responsible for removing online content, and we won’t require companies to remove particular content or accounts. Our job is to help build a safer life online by making sure firms have effective systems in place to prevent harm and protect the people using their services.
We will have a range of tools to make sure services follow the rules – including setting out codes of practice and guidance for companies falling under the scope of the new legislation. We’re now consulting on these, and the new rules will come into force once the codes and guidance are approved by Parliament.
Under these new rules, we will have powers to take enforcement action, including issuing fines to services if they fail to comply with their duties. Our powers are not limited to service providers based in the UK.
Will Ofcom be investigating content that people post online?
Unlike our work in regulating content broadcast on TV and radio, Ofcom’s role in online safety isn’t about deciding whether particular posts or other content should or shouldn’t be available, or whether it complies with specific standards. Instead, our role is to make sure social media sites and other regulated online services have appropriate systems and processes in place to protect their users.
Importantly, in our role as online safety regulator, we will always take into account people’s rights, including freedom of expression and privacy.
What to do if you’ve seen something harmful online
If you’ve seen something online that you feel is harmful, complain first to the online service on which you saw it, not to Ofcom. Do not share the content, as doing so will only increase the number of people who see it.
If you’ve reported content and are concerned action hasn’t been taken by the online service, you can complain to Ofcom through our online complaints portal.
Please note – you can only complain to us about online services that we regulate.
We are unable to respond to or investigate individual complaints. However, we do use complaints to help us assess whether regulated services are appropriately protecting their users and whether we should take any action.
There are resources and support services that can help you if you have seen illegal, harmful or upsetting content online.
Further information
For more detail on everything mentioned here, visit the online safety section of our website.