We are consulting on draft guidance, which sets out nine areas where technology firms should do more to improve women and girls’ online safety by taking responsibility, designing their services to prevent harm and supporting their users.
The Online Safety Act 2023 makes platforms – including social media, gaming services, dating apps, discussion forums and search services – legally responsible for protecting people in the UK from illegal content and content harmful to children, including harms that disproportionately affect women and girls.
Ofcom has already published final Codes and risk assessment guidance on how we expect platforms to tackle illegal content, and we’ll shortly publish our final Codes and guidance on the protection of children. Once these duties come into force, Ofcom’s role will be to hold tech companies to account, using the full force of our enforcement powers where necessary.
But Ofcom is also required to produce guidance setting out how providers can take action against harmful content and activity that disproportionately affects women and girls, in recognition of the unique risks they face.
Our draft Guidance identifies nine areas where technology firms should do more to improve women and girls' online safety.
Responding to this consultation:
Please submit responses using the consultation response form (ODT, 98.85 KB) by 5pm on 23 May 2025.
If you have been affected by these harms, you can find support services through the Domestic Abuse Commissioner and Victim and Witness Information. If you're worried someone might share your intimate images online, or it has already happened to you, see StopNCII and the Revenge Porn Helpline.
How to respond
Responses can also be sent in writing to:
Ofcom Online Safety Group,
Ofcom,
Riverside House,
2A Southwark Bridge Road,
London SE1 9HA