
Children in the UK will be able to live a safer life online under new Ofcom protections which set ambitious new safety standards for tech firms.
As the UK’s online safety regulator, protecting children is a priority for Ofcom. We’re determined to make the online world a better place for them – and to give parents peace of mind that their children will be better protected from the risks they might face.
Spending time online is a part of everyday life for everyone in the UK – including children. We appreciate the value that children and families derive from using online services, and we have spoken to many children and parents in coming up with the expectations we’re placing on firms.
We want children to be able to enjoy the benefits of being online, while reassuring parents that services like websites, social media platforms, games and apps will be more responsible than ever for making sure online spaces are safer for the children who use them.
That’s why we’ve set out new measures that online services must take to improve safety, especially for children. That means preventing children from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography. Sites and apps must also act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.
These are just the latest steps in our online safety work, and there’s more to come. We appreciate that these measures won’t solve all of the problems of the online world, but we’re confident that they will help to bring about a major step forward in online safety for children in the UK. That’s a promise we make to parents.
How will your child be kept safer?
Our rules include a number of measures that will help to improve safety for everybody under the age of 18 when they’re online.
- Effective age-checks. Platforms must put in place robust age checks to prevent children from accessing the most harmful content, including pornography and material relating to self-harm, eating disorders and suicide.
- Safer feeds. Algorithms mustn’t recommend harmful content to children in their feeds.
- Fast action. All sites and apps must have processes in place to review, assess and quickly tackle harmful content when they become aware of it.
- More choice and support. Sites and apps must give children more control over their online experience. This includes allowing them to indicate what content they don’t like, to accept or decline group chat invitations, to block and mute accounts, and to disable comments on their own posts. There must also be supportive information for children who may have encountered, or searched for, harmful content.
- Easier reporting and complaints. It must be straightforward for children to report content or complain to a platform, and providers should respond with appropriate action. Terms of service must also be clear so children can understand them.
- More responsibility. All services must have a named person who is responsible for children’s safety, and a senior body should annually review the safety measures they have in place.
These measures build on other rules in place to protect users, including children, from illegal online harms – such as grooming.
Crucially, it doesn’t matter where a company is based: if a site or app can be accessed by children in the UK, the laws apply – and we’re ready to enforce them.
What you can do as a parent
We appreciate that you might be concerned about your children’s online safety. And we also know that most children have encountered harmful content and activity online. It can have serious impacts on their physical and mental wellbeing.
While it is the responsibility of tech firms to keep children safe – and we’re holding them to account to make sure they do – here are some top tips to help parents manage the risks to their children online:
- Talk regularly to your children about what they do when they’re online. Encourage them to tell you if they’ve seen something online that they think is harmful.
- Make sure your children register with online services using their real age, to help prevent them from accessing content that is only suitable for people who are older.
- Make sure they know how to report inappropriate or harmful content and how to block accounts that share it, and encourage them not to share it themselves. If content is illegal, report it to the police. Be aware that Ofcom cannot deal with individual complaints about online content.
- Parental controls are a useful tool, helping you to monitor and limit the time your children spend online and what they do while they’re there.
There’s also a range of helpful resources available from different organisations:
- The Safer Internet Centre – the organisation behind Safer Internet Day, which Ofcom supports – has information aimed at helping you keep your child safe online.
- The NSPCC also has an online safety hub with advice for parents and children, including tips on how to talk to your children about online safety.
- Internet Matters offers information and advice to parents and carers to help their children navigate the digital world – including a digital toolkit you can tailor for your family’s needs.
What is the Online Safety Act?
Today’s measures are part of the wider online safety rules. The Online Safety Act is a new set of laws that protects everyone from illegal content online, and protects children from content that is harmful.
It makes tech firms legally responsible for their users’ safety. And it gives Ofcom powers as online safety regulator to enforce these laws – including taking action against firms that don’t comply with their online safety duties.
All sites and apps that children could use – regardless of whether they are accessed on a smartphone, tablet, computer or game platform – will have to follow our new rules, so that children have safer experiences online.
Does the Online Safety Act prevent children from using social media?
The act doesn’t ban children from social media, nor does it set a minimum age for using it. It does say that social media companies must consistently enforce their age limits and protect their child users – but parents should be aware that some popular sites and apps have no minimum age requirements for their users.
And while we appreciate there has been lots of debate around whether, or at what age, children should be able to own smartphones, the act doesn’t cover this, so Ofcom has no legal powers in this area.
But the changes sites and apps will have to make as a result of our rules will help ensure children of all ages are safer online.