
- Ofcom sets out practical steps for online services to tackle misogyny, pile-ons, online domestic abuse and other harms
- Clear expectation for sites and apps to take responsibility, prevent harm and support women and girls, above and beyond new legal duties
Ofcom has today proposed concrete measures that tech firms should take to tackle online harms against women and girls, setting a new and ambitious standard for their online safety.
With insights from victims, survivors, women’s advocacy groups and safety experts,[1] today’s draft guidance sets out practical, ambitious but achievable measures that providers can implement to improve women’s and girls’ safety. It focuses on four issues:
- Online misogyny – content that actively encourages or cements misogynistic ideas or behaviours, including through the normalisation of sexual violence.
- Pile-ons and online harassment – when a woman or groups of women are targeted with abuse and threats of violence. Women in public life, including journalists and politicians, are often affected.
- Online domestic abuse – the use of technology for coercive and controlling behaviour within an intimate relationship.
- Intimate image abuse – the non-consensual sharing of intimate images, including those created with AI, as well as cyberflashing – sending explicit images to someone without their consent.
Our guidance identifies a total of nine areas where technology firms should do more to improve women and girls’ online safety by taking responsibility, designing their services to prevent harm and supporting their users.[2]
It promotes a safety-by-design approach, demonstrating how providers can embed the concerns of women and girls throughout the operation and design of their services, as well as their features and functionalities. To illustrate the specific changes that providers can make to improve women and girls’ safety, we are including practical examples of good industry practice, such as:
- ‘Abusability’ testing to identify how a service or feature could be exploited by a malicious user;
- Technology to prevent intimate image abuse, such as identifying and removing non-consensual images based on databases;
- Prompts asking users to reconsider before posting harmful material – including detected misogyny, nudity or content depicting illegal gendered abuse and violence;
- Easier account controls, such as bundling default settings to make it easier for women experiencing pile-ons to protect their accounts;
- Visibility settings, allowing users to delete or change the visibility of their content, including material they uploaded in the past;
- Strengthening account security, for example using more authentication steps, making it harder for perpetrators to monitor accounts without the owner’s consent;
- Removing geolocation by default, since leaks of location data can lead to serious harm, including stalking and threats to life;
- Training moderation teams to deal with online domestic abuse;
- Reporting tools that are accessible and support users who experience harm;
- User surveys to better understand people’s preferences and experiences of risk, and how best to support them; and
- More transparency, including publishing information about the prevalence of different forms of harms, user reporting and outcomes.
Why this matters
The online world can be a hostile and dangerous place for women and girls. Online spaces can facilitate online domestic abuse, silence women who wish to express themselves, create communities where misogynistic views thrive, and sometimes affect women’s ability to do their jobs. Women report more harm and greater concerns about the internet than men.[3]
Under the UK’s online safety laws, services such as social media, gaming, dating apps, discussion forums and search engines have new responsibilities to protect people in the UK from illegal content, and children from harmful content – including harms that disproportionately affect women and girls.
This means companies must assess the risk of gender-based illegal harms, such as controlling or coercive behaviour, stalking and harassment, and intimate image abuse on their services. They must then take action to protect users from this material, including by taking it down once they become aware of it. Sites and apps must also protect children from harmful material, such as abusive, hateful, violent and pornographic content.
To help services meet these duties, Ofcom has already published final Codes and guidance on how we expect tech firms to tackle illegal content, and we’ll shortly publish our final Codes and guidance on the protection of children. Once these duties come into force, Ofcom’s role will be to hold tech companies to account, using the full force of our enforcement powers, whenever and wherever necessary.
But beyond enforcing these core legal duties, the Act also requires Ofcom to produce additional, dedicated industry guidance setting out how providers can take action against harmful content and activity that disproportionately affects women and girls, in recognition of the unique risks they face.
What happens now
We are now inviting feedback on our draft Guidance, as well as further evidence on any additional measures that could be included to address harms that disproportionately affect women and girls. Responses must be submitted by 23 May 2025. Once we have examined all responses, we will publish a statement with our decisions, along with final guidance, later this year.
We also expect technology firms regularly to assess new or emerging threats, and we will report on how well they have tackled harms to women and girls around 18 months after our final guidance comes into effect.
Dame Melanie Dawes, Ofcom Chief Executive said:
No woman should have to think twice before expressing herself online, worry about an abuser tracking her location, or face the trauma of a deepfake intimate image of herself being shared without her consent.
Yet these are some of the very real online risks that women and girls face today – and many tech companies are failing to act.
Our practical guidance is a call to action for online services – setting a new and ambitious standard for women and girls’ online safety. There’s not only a moral imperative for tech firms to protect the interests of female users, but it also makes sound commercial sense – fostering greater trust and engagement with a significant proportion of their customer base.
Cally Jane Beech, campaigner and influencer, who experienced deepfake intimate image abuse, said:
I want things to be better, for my daughter, and for women and girls all over the UK. We should all be in control of our own online experience so we can enjoy the good things about it. Tech companies need to be made more accountable for things being hosted on their sites.
Domestic Abuse Commissioner, Dame Nicole Jacobs, said:
Everyone should be free to live out their lives online without the fear that they will be abused, stalked or harassed. But far too often, victims and survivors are expected to keep themselves safe from online abuse, rather than tech companies taking steps to protect their users.
I’m pleased that Ofcom are stepping up to start the process of providing guidance to tech companies on how to tackle this. It’s now on these firms to implement these recommendations and ensure that perpetrators can no longer weaponise online platforms for harm. By taking meaningful practical action, not only will people be safer online, but it will demonstrate that tech companies are ready to play their part in tackling domestic abuse.
ENDS
Notes to editors
- In developing the draft Guidance, we ran stakeholder workshops to support our interpretation of the available evidence and to expand our evidence base further. These brought together experts on online gender-based harms from across the UK and beyond, ensuring that we could consider their perspectives early in the process of developing the draft Guidance. Over 30 organisations participated, including civil society organisations, academics, service providers and public bodies, with representatives from all four UK nations as well as international stakeholders.
- Nine areas where technology firms should do more to improve women and girls’ online safety are set out in the draft Guidance.
- Women are less likely than men to think that the benefits of the online world outweigh the risks (65% vs 70%) and are less likely to believe that the internet is a good thing for society (34% vs 47%). Women and girls are also more likely than men and boys to report being negatively impacted by the harms they experience online – teenage girls (24%) vs teenage boys (11%), and adult women (29%) vs adult men (19%). Source: Ofcom Online Experiences Tracker.