
Just one in six young people flag harmful content online

Published: 27 June 2022
Last updated: 16 March 2023
  • Ofcom joins forces with influencer Lewis Leigh for new social media campaign to increase reporting among young people

Two thirds of teens and young adults have recently encountered at least one potentially harmful piece of content online, but only around one in six go on to report it, Ofcom has found.

The findings come as the Government’s Online Safety Bill continues to make its way through Parliament[1]. Ofcom will enforce these new laws, and has already started regulating video-sharing platforms established in the UK – such as TikTok, Snapchat and Twitch.

To help galvanise more young internet users to report potentially harmful content, Ofcom has joined forces with social media influencer Lewis Leigh and behavioural psychologist Jo Hemmings to launch a new campaign. The social media campaign aims to reach young people on the sites and apps they use regularly, to highlight the importance of reporting posts they may find harmful.

Ofcom research reveals reporting concerns

Ofcom’s Online Experiences Tracker reveals that most young people aged 13 to 24 (65%) believe the overall benefits of being online outweigh the risks. But a similar proportion (67%) have encountered potentially harmful content.

Younger people told us that the most common potential harms they came across online were: offensive or ‘bad’ language (28%); misinformation (23%); scams, fraud and phishing (22%); unwelcome friend or follow requests (21%); and trolling (17%).

A significant number of young people (14%) also encountered bullying, abusive behaviour and threats; violent content; and hateful, offensive or discriminatory content, targeted at a group or individual based on their specific characteristics.

But our research reveals a worrying gap between the 67% of young people who experience harm online and those who flag or report it to the services. Only around one in six young people (17%) take action to report potentially harmful content when they see it.

Younger participants say the main reason for not reporting is that they did not see the need to do anything (29%), while one in five (21%) do not think it will make a difference. More than one in ten (12%) say they don’t know what to do, or whom to inform.

User reporting is one important way to ensure more people are protected from harm online. For example, TikTok’s transparency report[2] shows that of the 85.8 million pieces of content removed in Q4 2021, nearly 5% were removed as a result of users reporting or flagging content. In the same period, Instagram reported 43.8 million content removals, of which about 6.6% were removed as a result of users reporting or flagging content.

With young people spending so much of their time online, the exposure to harmful content can unknowingly desensitise them to its hurtful impact. People react very differently when they see something harmful in real life – reporting it to the police or asking for help from a friend, parent or guardian – but often take very little action when they see the same thing in the virtual world.

What is clear from the research is that while a potential harm experienced just once may have little negative impact, when experienced time and time again, these experiences can cause significant damage. Worryingly, nearly a third of 13-to-17 year olds didn’t report potentially harmful content because they didn’t consider it bad enough to do something about. This risks a potentially serious issue going unchallenged.

That is why I’m working with Ofcom to help encourage people to think about the content they or their children are being exposed to online, and report it when they do, so that the online world can be a safer space for everyone.

Jo Hemmings, behavioural and media psychologist

Ofcom’s new campaign, which launches today, aims to help address this lack of reporting. It will feature TikTok influencer Lewis Leigh, who rose to fame during lockdown with his viral TikTok videos showing him teaching his nan, Phyllis, dance moves.

The campaign aims to show young people that, by taking a moment to stop, think and flag problematic content – rather than scrolling past – they can really make an important difference in helping to keep their online communities safer.

My generation has grown up using social media and it’s how I make a living. So although it’s mainly a positive experience and a place to bring people together and build communities, harmful content is something I come across all the time too.

That’s why it was important to team up with my lovely Nan for this campaign to raise awareness of what we can do to protect each other online. Let’s face it, our nans are pretty much the best judges out there, and always give the best advice. So next time you’re scrolling through your phone and come across something you’re not quite sure about, ask yourself, ‘What would my nan think?’  If it’s a ‘no’ from nan then perhaps think about reporting it – because that really can make a difference.

Lewis Leigh, TikTok sensation

As we prepare to take on our new role as online safety regulator, we’re already working with video sites and apps to make sure they’re taking steps to protect their users from harmful content. Our campaign is designed to empower young people to report harmful content when they see it, and we stand ready to hold tech firms to account on how effectively they respond.

Anna-Sophie Harling, Online Safety Principal at Ofcom

As part of its forthcoming role regulating online safety, Ofcom will have a range of powers to require change. We will be working with companies to help them understand their new obligations and what steps they need to take to protect their users from harm. This autumn we will publish our first annual report on how effectively video-sharing platforms are tackling harm on their services.

Find out more from Ofcom’s Online Nation 2022 report, and see the social campaign on Lewis’ TikTok and Instagram profiles.

For more information, please get in contact with the team at OfcomVSP@3monkeyszeno.com.

Notes to editors

Unless specified in the endnotes, all research results are from Ofcom’s Online Nation 2022 report: a UK-representative sample of 6,640 online users aged 13-85.

  1. https://www.gov.uk/government/publications/online-safety-bill-supporting-documents/online-safety-bill-factsheet
  2. TikTok, TikTok Community Guidelines Enforcement, Q3 2021