
How are TikTok, Snap and Twitch protecting children from harmful videos?

Published: 14 December 2023
Last updated: 21 March 2025

A new report from Ofcom, published today, takes stock of how popular video-sharing platforms are protecting children from accessing potentially harmful videos.

Under the video-sharing platform (VSP) regime, UK-based services must put in place measures to protect children from encountering videos that may impair their physical, mental, or moral development.

Using our formal information-gathering powers, we’ve looked at the steps being taken by TikTok, Snap and Twitch – three of the most popular regulated video-sharing services for children – to meet these requirements. We found that all three take steps to prevent children from encountering harmful videos, but children can still sometimes come to harm while using these platforms.

What we found

Our report finds that:

  • TikTok, Twitch and Snap all allow sign-ups from children aged 13 and over, relying on users to declare their true age when signing up. This means underage users can easily gain access by entering a false age.
  • All three enforce age restrictions using a range of methods to identify potential underage accounts, including artificial intelligence (AI) technologies and human moderators. However, the effectiveness of these measures is yet to be established. The report includes the number of underage accounts taken down by each platform.
  • Users need an account to access most of Snap’s or TikTok’s content. Twitch, however, is open access, which means anyone of any age can access most of its videos, regardless of whether they have an account. This includes videos to which a mature content label has been applied.
  • The three platforms adopt different approaches to classifying and labelling content as unsuitable for under-18s. TikTok classifies content based on certain mature themes, Snap ranks content on Discover and Spotlight to make sure it is age appropriate, and Twitch has introduced more detailed content labelling. Without robust corresponding access controls and safety measures, however, children still risk encountering harmful content. For example, all Twitch users – logged in or not – can view age-inappropriate content by simply dismissing the warning label.
  • TikTok and Snap both have parental controls designed to give parents and carers some oversight of their children’s online activity. By contrast, Twitch’s terms and conditions require parents to supervise children in real time while they are using the service.

The protection of children – including ensuring that under-18s have an age-appropriate online experience – is central to the Online Safety Act. In line with our implementation roadmap, we will be consulting on the broad child safety measures under the Act in spring 2024.

We expect all services regulated under the VSP regime to also fall within scope of the online safety regime. However, only once the VSP regime is fully repealed by the UK Government will these services have to comply with all of their broader Online Safety Act duties.

In the meantime, we will continue to work with regulated VSPs to drive safety improvements in the interests of their users. This will include dedicated supervisory engagement, further transparency reporting and – where appropriate – enforcement action.

New investigation into TikTok’s compliance with a statutory information request

It is crucial that Ofcom can gather accurate information about measures put in place by regulated VSPs to protect users. This includes understanding systems, such as parental controls, to help ensure that children are protected from restricted material.

We use this information to monitor the measures platforms take, assess compliance, and report publicly on our findings.

We asked TikTok for information about its parental control system, Family Pairing, and we have reason to believe that the information it provided was inaccurate.

Update 21 March 2025:

On 14 December 2023, we opened an investigation into whether TikTok had failed to comply with its duties to provide information in response to a formal request for information, in such a manner as specified by Ofcom.

A final decision has now been reached. Details of the investigation outcome can be found here.

As part of the investigation, and for the purposes of the protection of children report, TikTok provided an updated set of data about the parental control systems it has in place. TikTok told us that in February 2024, the number of UK teen users (users recorded on its system as being between 13 and 17 years old) with Family Pairing activated represented between 4% and 5% of TikTok’s total UK teen monthly active users.

The new dataset relates to February 2024 because it was drawn from a data source with different retention policies. Our transparency report was prepared using data from the period between 1 April 2023 and 30 June 2023, whereas the updated data TikTok provided during the investigation covers the period between 1 February 2024 and 29 February 2024.

As outlined in the report:

“Only platforms with the relevant connection to the UK are regulated under the UK VSP Regime. Many other VSPs – including YouTube and Instagram – do not fall under the UK regime because they are based elsewhere. So, our findings do not represent the entire VSP sector, nor do they indicate that TikTok, Twitch, and Snap are better or worse than the platforms not in scope of the UK VSP Regime. But our report does illustrate the approaches platforms have taken, and the challenges they’ve faced ahead of the broader online safety regime.” (see page 3).

What to do if we request information from you

To inform our work as a regulator, we sometimes use our formal powers to request information from individuals and businesses. When we do, we expect to receive clear, complete and accurate information.

If you get a request from us, here's what you need to do.
