Video-sharing platform BitChute is improving its safety measures after Ofcom raised concerns about their effectiveness following a mass shooting that took place last year in Buffalo, New York.
After the attack was livestreamed online, versions of the footage spread across multiple online services, including BitChute – potentially exposing people to terrorist content.
Ahead of taking on new powers when the Online Safety Bill becomes law, Ofcom already regulates video-sharing platforms (VSPs) based in the UK.
What Ofcom did following the Buffalo attack
When it became apparent the Buffalo attack had been livestreamed, we immediately arranged to meet senior representatives of regulated VSPs to understand the measures they had in place to protect their users from these types of videos. In October 2022, we published a report setting out our findings.
We raised concerns with BitChute that its reporting, flagging and content moderation measures may not have been effectively protecting users from encountering videos related to terrorism. Although BitChute has terms and conditions covering hate and terror content, the Buffalo attack exposed deficiencies in its ability to enforce them effectively.
In particular, we were concerned about:
- BitChute’s on-platform reporting function only being available to users who had a registered account; and
- BitChute’s content moderation team being small and limited to certain working hours, which restricted its ability to respond quickly to reports that footage was on the platform following the attack.
What BitChute is doing in response to our concerns
Following our engagement, BitChute:
- is tripling the size of its moderation team by taking on more human moderators;
- is increasing the number of hours that moderators are available to review reports so that it has a safety team operational 24/7; and
- has changed the design of its platform to allow non-registered users to directly report potentially harmful content.
BitChute is collecting additional metrics to measure the impact of the changes it has made, including the number of content review reports raised each day and the average response time to them. These metrics will help Ofcom evaluate the effectiveness of the platform’s measures.
BitChute also became an official member of Tech Against Terrorism in October 2022, and a member of the Global Internet Forum to Counter Terrorism in June 2023.
Next steps
While we welcome these improvements, we are aware of reports alleging that content likely to incite violence and hatred continues to be uploaded to BitChute, can be accessed easily, and may pose significant risks to users.
The current VSP regime – and the future online safety regime – focus on the measures that platforms have in place to minimise the risks of users encountering harmful material online. That means the presence of harmful content on a service is not in itself a breach of the rules, and Ofcom’s role as regulator is not to determine the acceptability of individual pieces of content. However, such content can indicate an underlying issue with the user protections in place.
We will closely monitor the implementation of BitChute’s changes and the impact they have, to assess whether they result in tangible improvements to user safety. If we find that, despite BitChute’s improvements, users are not being adequately protected from relevant harmful material, we will not hesitate to take further action, including formal enforcement action if necessary.