Published: 12 November 2024
Social media is a significant aspect of daily life for most of us. Almost all adult internet users use some form of online communication platform.
Published: 8 October 2024
The Online Safety Act (OSA) and the existing video-sharing platform (VSP) regulation place duties on relevant online services to protect their users from illegal content, and to protect children from certain harmful content.
Published: 18 September 2024
We wanted to find out whether behavioural techniques could increase the number of users accessing online terms and conditions (T&Cs), and whether accessing T&Cs actually changes how people behave online.
Published: 2 September 2024
Published: 30 July 2024
This discussion paper by Ofcom’s Behavioural Insight Hub presents research aimed at understanding users' relationships with social media T&Cs, specifically Community Guidelines.
Published: 8 May 2024
Last updated: 30 July 2024
This online trial tested different ways to encourage children to access age-appropriate user support materials (or help centre) during the sign-up process of a mock social media platform.
Published: 24 May 2024
Published: 21 May 2024
This discussion paper covers two online experiments run by Ofcom’s Behavioural Insight Hub that tested how platform choice architecture affects use of content controls among adult users.
This research was designed to better understand the experiences of and attitudes towards reporting among children who use social media and/or video-sharing platforms.
This review builds Ofcom's understanding of children and young people's developmental stages, and how these interact with their online decision-making and behaviour.
Published: 11 October 2022
Last updated: 8 May 2024
Research to understand the extent to which children are bypassing age assurance measures.
Published: 15 April 2024
Ofcom's areas of interest for future research in the online safety space.
Published: 15 March 2024
This research explores the pathways through which children encounter violent content online, the impact this can have, and perceptions and use of safety measures.
This research explores the pathways through which children encounter content related to suicide, self-harm and eating disorders, the impact this can have, and perceptions of safety measures.
This research explores the pathways through which children are exposed to cyberbullying, the impact this can have, and perceptions of what works to address cyberbullying.
Published: 12 March 2024
Published: 1 February 2024
Understanding people's experiences of encountering fraud and scams on landlines, mobile phones, apps and other online channels.
Published: 31 January 2024
A study on the prevalence of non-suicidal self-injury, suicide and eating disorder content accessible via search engines.
Published: 14 December 2023
How TikTok, Twitch and Snap try to prevent children from watching potentially harmful videos.
Published: 6 November 2023
Last updated: 10 November 2023
The British Board of Film Classification has carried out research into online pornography services, focusing on the way pornographic content is published and displayed and how easily it can be accessed.
How UK internet users aged 16+ experience accessing pornographic content online and passing any age checks if encountered.
Published: 9 November 2023
Research into how children communicate online and the scale and nature of any unwanted contact. Also looks at how children communicate while gaming and using apps designed to connect them to new people.
Published: 19 September 2023
We commissioned two reports from the Institute for Strategic Dialogue (ISD) to build our understanding of user experiences of online services as we prepare to take on new duties under the Online Safety Bill, with a particular focus on online terrorism, violence and hate.
Published: 18 September 2023
This research report provides an evidence base on the prevalence of, and ease of access to, content containing an apparent offer to sell or supply potentially prohibited items or articles across four categories: knives and bladed weapons, firearms, controlled drugs, and psychoactive substances.
This report summarises the findings from research conducted by Ofcom on the existence of content, accessible via general search services, relating to offers to supply articles (information or items) for use in the commission of fraud ('the offence').
Published: 14 September 2023
Content moderation: an overview of the processes and trade-offs involved in mitigating harm.