
Tackling online content that stirs up hatred, provokes violence and spreads disinformation

Published: 5 August 2024

Tackling illegal content online is a major priority for Ofcom. In recent days, we have seen appalling acts of violence in the UK, with questions raised about the role of social media in this context.

The UK’s Online Safety Act will put new duties on tech firms to protect their users from illegal content, which under the Act can include content involving hatred or disorder, content that provokes violence, and certain instances of disinformation.

What will tech firms have to do?

When the new duties come into force later this year, tech firms will have three months to assess the risk of illegal content on their platforms. They will then be required to take appropriate steps to stop it appearing, and to act quickly to remove it when they become aware of it. The largest tech firms will in due course need to go even further – by consistently applying their terms of service, which often include banning things like hate speech, inciting violence, and harmful disinformation.

If tech firms don’t comply, we will have a broad range of enforcement powers at our disposal. These include the power to impose significant financial penalties for breaches of the safety duties. The regime focuses on platforms’ systems and processes rather than on the content itself. So Ofcom’s role will not involve us making decisions about individual posts or accounts, or requiring specific pieces of content to be taken down.

As part of our wider engagement with tech platforms we are already working to understand what actions they are taking in preparation for these new rules. Platforms can act now – there is no need to wait for the new laws to come into force before sites and apps are made safer for users.

When will the Act come into force?

We’re working to implement the Act so we can enforce it as soon as possible. To do this, we are required to consult on codes of practice and guidance, after which the new safety duties on platforms will become enforceable. We expect the first set of duties – regarding illegal content – will come into force from around the end of this year.
