Open
Duties under the Online Safety Act 2023 (‘the Act’) to protect users from encountering, and prevent offenders sharing, child sexual abuse material (‘CSAM’) on file-sharing and file-storage services.
17 March 2025
Ofcom has opened a programme of work, or ‘enforcement programme’, to assess the measures that providers of file-sharing and file-storage services presenting particular risks of harm to UK users from image-based CSAM are taking to ensure users do not encounter, and offenders are not able to disseminate, such content on their services.
Relevant Legal Provision: Sections 10(2) and 10(3) of the Online Safety Act 2023.
Part 3 of the Act imposes duties on providers of regulated user-to-user services to use proportionate: (a) measures to prevent individuals from encountering priority illegal content – including CSAM – by means of the service; (b) measures to effectively mitigate and manage the risk of the service being used for priority offences, or of harm to individuals arising on the service, as identified in the most recent ‘Illegal Content Risk Assessment’; and (c) systems and processes designed to minimise the length of time for which any priority illegal content is present and to take it down swiftly when the provider is made aware of its presence. These ‘Illegal Content Duties’ came into effect on 17 March 2025. By then, providers of all services in scope of the Act were required to have conducted an Illegal Content Risk Assessment.
Regulated user-to-user service providers can comply with the Illegal Content Duties by implementing measures recommended in Ofcom’s illegal harms Codes of Practice for user-to-user services issued on 24 February 2025 (the ‘Codes of Practice’), or through alternative measures.
As explained in the Statement Ofcom published on 16 December 2024 accompanying the Codes of Practice, CSAM is so prevalent on many services that human content moderation alone cannot identify and remove it at sufficient speed and scale. Accordingly, the Codes of Practice recommend that certain services use automated moderation technology (including perceptual hash-matching – see the explainer below) to assess whether content communicated publicly is CSAM and, if so, swiftly take it down.
Evidence shows that file-sharing and file-storage services are particularly susceptible to being used for the sharing of CSAM. Given the level of harm that arises from this, our Codes of Practice recommend that all file-sharing and file-storage service providers – irrespective of size – implement perceptual hash-matching where their Illegal Content Risk Assessment accurately identifies them as being at high risk of hosting image-based CSAM.
Ofcom has opened an enforcement programme to assess the measures being taken by providers of file-sharing and file-storage services that present particular risks of harm to UK users to meet their Illegal Content Duties in respect of image-based CSAM on their services.
Action we are taking:
To tackle the dissemination of CSAM on the highest-risk services, we have already done the following:
- Over recent months, our taskforce dedicated to driving compliance among small but risky services has identified and engaged with providers of certain smaller file-sharing and file-storage services to discuss compliance with the Illegal Content Duties and assess which services may already be taking appropriate measures. This is in addition to existing and ongoing engagement with the largest file-sharing and file-storage services about their obligations under the Act.
- We have worked with law enforcement agencies and other organisations – including the Internet Watch Foundation (IWF), the Canadian Centre for Child Protection (C3P) and the National Centre for Missing and Exploited Children (NCMEC) – to identify file-sharing and file-storage services at highest risk of hosting image-based CSAM.
- Today, Ofcom has written to providers of services that present particular risks of harm to UK users from this content to advise them of their duties under the Act and put them on notice that we will shortly be sending them formal requests for information:
  - to assess whether they are in scope of the Act;
  - if so, to set out the measures they have in place, and/or will soon have in place, to identify, assess and remove known image-based CSAM; and
  - to provide a record of their Illegal Content Risk Assessments.
- Ofcom has also sent advisory letters to a number of other providers of file-sharing and file-storage services to inform them of their duties under the Act. This is the start of further engagement with these services.
As part of this enforcement programme, we will:
- assess the measures being taken by providers of file-sharing and file-storage services that present particular risks of harm to UK users to comply with the Illegal Content Duties in respect of image-based CSAM;
- where potential non-compliance is identified, determine whether formal enforcement action may be appropriate in respect of the Illegal Content Duties, Illegal Content Risk Assessment duties and/or formal information request duties;
- continue engagement with providers of other file-sharing and file-storage services to better understand their approaches to detecting image-based CSAM and compliance with the Illegal Content Duties; and
- continue to work with law enforcement agencies and other organisations to target compliance by the highest risk services.
Hash-matching is a process that detects content previously identified as illegal or otherwise violative. ‘Hashing’ is an umbrella term for techniques used to create digital fingerprints (‘hashes’) of content, which can then be stored in a database. Providers can generate hashes of the content on their service and compare them against the hashes in that database to test whether any uploaded content is a ‘match’ for known images.
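As a rough illustration of the basic matching step, the sketch below hashes an uploaded file and checks it against a small in-memory database of known fingerprints. This is a minimal Python sketch only: the hash value, file paths and function names are hypothetical, SHA-256 is used purely to show the lookup step, and real deployments rely on hash databases maintained by organisations such as the IWF, C3P and NCMEC rather than a hard-coded set.

```python
import hashlib

# Illustrative only: a toy 'database' of fingerprints for known illegal images.
# In practice these lists are compiled and maintained by bodies such as the
# IWF, C3P and NCMEC, and are far larger; the value below is a placeholder.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def sha256_of_file(path):
    """Compute a cryptographic (SHA-256) fingerprint of an uploaded file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_match(path):
    """Return True if the file's hash appears in the database of known content."""
    return sha256_of_file(path) in KNOWN_HASHES
```

A provider-side pipeline would typically run a check of this kind on each upload so that matching content can be taken down swiftly; note, however, that the Codes of Practice recommend perceptual rather than cryptographic hash-matching, as explained next.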
There are several types of hash-matching. Ofcom’s Codes of Practice recommend ‘perceptual’ rather than ‘cryptographic’ hash-matching, because it allows a greater volume of harmful content to be identified and potentially moderated (see Code Measure ICU C9). Perceptual hash-matching aims to identify images that are similar to known CSAM images, whereas cryptographic hash-matching identifies only identical copies. In practice, perceptual hash-matching is therefore likely to detect more CSAM than other forms of hash-matching.
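To illustrate the distinction, the sketch below implements a very simple perceptual ‘difference hash’ (dHash) and matches uploads by Hamming distance rather than exact equality. It is illustrative only: it assumes the Pillow imaging library is installed, the file names and threshold are hypothetical, and production systems use dedicated perceptual hashing algorithms (such as PhotoDNA or PDQ) with carefully tuned thresholds rather than this simplified scheme.

```python
from PIL import Image  # assumes the Pillow imaging library is installed


def dhash(path, size=8):
    """Compute a simple 64-bit 'difference hash': a perceptual fingerprint that
    changes little when an image is resized, recompressed or lightly edited,
    unlike a cryptographic hash, which changes completely."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming_distance(a, b):
    """Count the bits on which two perceptual hashes differ."""
    return bin(a ^ b).count("1")


def is_perceptual_match(upload_path, known_hashes, threshold=10):
    """Flag an upload whose hash is within `threshold` bits of any hash of
    known content. A cryptographic hash would only match byte-identical files;
    a perceptual hash also catches near-duplicates of known images."""
    h = dhash(upload_path)
    return any(hamming_distance(h, known) <= threshold for known in known_hashes)
```

The match threshold trades off missed detections against false matches; the value of 10 bits here is purely illustrative.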