- Most UK adult sites do not have robust measures in place to prevent children accessing pornography
- The biggest – OnlyFans – has introduced new age verification in response to regulation
- Ofcom expects urgent action from other companies to protect children
Smaller adult video-sharing sites based in the UK do not have sufficiently robust access control measures in place to stop children accessing pornography, Ofcom has found in a new report published today.
Ahead of our future duties under the Online Safety Bill, Ofcom already has some powers to regulate video-sharing platforms (VSPs) established in the UK, which are required by law to take measures to protect people using their sites and apps from harmful videos.
Nineteen companies have notified us that they fall within our jurisdiction. They include TikTok, Snapchat, Twitch, Vimeo, OnlyFans and BitChute, as well as several smaller platforms, including adult sites.
We have used our powers to gather information from platforms on what they are doing to protect users from harm online. Ofcom is one of the first regulators in Europe to do this – today’s report is the first of its kind under these laws and reveals information previously unpublished by these companies.
What we have found
Ofcom is concerned that smaller UK-based adult sites do not have robust measures in place to prevent children accessing pornography. They all have age verification measures in place when users sign up to post content. However, users can generally access adult content just by self-declaring that they are over 18.[1]
One smaller adult platform told us that it had considered implementing age verification, but had decided not to as it would reduce the profitability of the business.
However, the largest UK-based site with adult content, OnlyFans, has responded to regulation by adopting age verification for all new UK subscribers, using third-party tools provided by Yoti and Ondato.
According to new research we have published today, most people (81%) do not mind proving their age online in general, with a majority (78%) expecting to have to do so for certain online activities. A similar proportion (80%) feel internet users should be required to verify their age when accessing pornography online, especially on dedicated adult sites.
Over the next year, adult sites that we already regulate must have in place a clear roadmap to implementing robust age verification measures. If they don’t, they could face enforcement action. Under future online safety laws, Ofcom will have broader powers to ensure that many more services are protecting children from adult content.
Some progress protecting users, but more to be done
We have seen some companies make positive changes more broadly to protect users from harmful content online, including as a direct result of being regulated under the existing laws. For example:
- TikTok now categorises content that may be unsuitable for younger users, to prevent them from viewing it. It has also established an Online Safety Oversight Committee, which provides executive oversight of content and safety compliance specifically within the UK and EU.
- Snapchat recently launched a parental control feature, Family Center, which allows parents and guardians to view a list of their child’s conversations without seeing the content of those messages.
- Vimeo now allows only material rated ‘all audiences’ to be visible to users without an account. Content rated ‘mature’ or ‘unrated’ is now automatically put behind the login screen.
- BitChute has updated its terms and conditions and increased the number of people overseeing and – if necessary – removing content.
However, it is clear that many platforms are not sufficiently equipped, prepared and resourced for regulation. We have recently opened a formal investigation into one firm, Tapnet Ltd – which operates adult site RevealMe – in relation to its response to our information request.
We also found that companies are not prioritising risk assessments of their platforms, which we consider fundamental to proactively identifying and mitigating risks to users. This will be a requirement on all regulated services under future online safety laws.
“Today’s report is a world first. We’ve used our powers to lift the lid on what UK video sites are doing to look after the people who use them. It shows that regulation can make a difference, as some companies have responded by introducing new safety measures, including age verification and parental controls.
“But we’ve also exposed the gaps across the industry, and we now know just how much they need to do. It’s deeply concerning to see yet more examples of platforms putting profits before child safety. We have put UK adult sites on notice to set out what they will do to prevent children accessing them.”
Dame Melanie Dawes, Ofcom’s Chief Executive
Unlike in our broadcasting work, our role is not to assess individual videos. Given the massive volume of online content, it is impossible to prevent every instance of harm, so Ofcom’s job is to make sure platforms are taking effective action to address harmful content.
Over the next twelve months, we expect companies to set and enforce effective terms and conditions for their users, and quickly remove or restrict harmful content when they become aware of it. We will review the tools provided by platforms to their users for controlling their experience, and expect them to set out clear plans for protecting children from the most harmful online content, including pornography.
The Online Safety Bill
The forthcoming online safety laws will give Ofcom wider duties and powers, and many more services will be required to protect their users.
We will move quickly once the Bill passes to put these laws into practice. Tech firms must be ready to meet our deadlines and comply with their new duties. That work should start now, and companies need not wait for the new laws to make their sites and apps safer for users.
In particular, we are encouraging all companies likely to be in scope to review how they assess risks to their users, explore opportunities for improvement, integrate trust and safety across their product and engineering teams, and staff up now so they are ready for UK online safety laws.
Notes to editors
- Subscriber sign-on process for smaller UK-established adult video-sharing platforms (see accompanying graphic).
- Snapshot: the six biggest notified UK-established video-sharing platforms (see accompanying graphic).