Almudena Lara, Ofcom’s Online Safety Policy Director, looks at how some of the measures set out in the Online Safety Act will help to keep children safer when they’re online.
Ofcom is now the regulator for online safety, after the Online Safety Act became law.
Ensuring that children in the UK can live a safer life online is the main driver behind the UK’s new laws, and so naturally, it’s our primary focus too.
Being online is now part of everyday life for the vast majority of children. Our own data shows that nearly all children (99%) now spend time online, and that nine in 10 children own a mobile phone by the time they reach the age of 11. It’s also clear that there’s a blurred boundary between the lives children lead online and the ‘real world’.
Our research also shows three-quarters of social media users aged between eight and 17 have their own account or profile on at least one of the large platforms. And despite most platforms having a minimum age of 13, six in 10 children aged eight to 12 who use them are signed up with their own profile.
However, almost three-quarters of teenagers between 13 and 17 have encountered one or more potential harms online, and three in five secondary school-aged children have been contacted online in a way that potentially made them feel uncomfortable.
It’s clear that we need to work hard to address these risks and protect children when they’re online.
I joined Ofcom last year as Online Safety Policy Director, and children’s safety is my primary focus – drawing on my experience from roles at the Department for Education, Google and the NSPCC. I’m proud to be working alongside expert colleagues from across the worlds of technology, academia and the third sector, bringing a wealth of knowledge and experience to our shared mission.
How the Act will help
Under the Act, online services – such as social media apps, instant messaging platforms and gaming sites, as well as search engines and porn sites – are required to take a range of steps to better protect children online.
First of all, tech firms will need to establish whether children are likely to access their service, or part of their service. If this is the case, they must assess the risks of harm to children and take proportionate steps to manage and mitigate those risks.
For user-to-user services, for example, this includes taking specific action to prevent children from encountering pornographic content, and content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders. Some services will be required to use highly effective age-checks to restrict access to that type of content. Tech firms will also need to protect children from other types of harm, including bullying, violent content and certain forms of legal hate speech.
Companies must also use a range of measures to tackle illegal content online – including child sexual abuse material and grooming. Our first priority as online safety regulator was to release our draft Illegal Harms Codes of Practice for consultation, setting out the practical, targeted safety measures that services can take.
The Act also places specific requirements on sites and apps that display or publish pornographic content, so that children are not normally able to come across pornography on their service. They must do this by using highly effective age-checks, and we wasted no time in publishing draft guidance to help services to comply.
Our starting point is to work with the industry to help firms to understand their new duties to keep users, particularly children, safe. Crucially, if online services don’t comply, we’ll have powers to take enforcement action against them. This includes imposing fines of up to £18m or 10% of their worldwide revenue (whichever is greater).
Tips to help keep kids safe online
While we’ll be holding tech firms to account for making their services safer for children, there are also things parents and carers can do to help support their children’s online safety.