Ofcom’s policy lead for illegal harms, hate and terrorism talks about his role in tackling online harm and recent work examining how platforms responded to the livestreamed attack in Buffalo, New York in May 2022.
With the Online Safety Bill on the horizon, Ofcom is gearing up to help create a safer life online for everyone. We spoke to Murtaza Shaikh, who leads our regulatory approach to several illegal harms, about why he’s passionate about protecting people from harmful content online.
“What motivates me is that I want whatever I do to have a societal impact,” he said. “I think if you can identify and deal with hate speech or hatred against other groups that are different, then you're in a very good place to stop much worse things happening.”
After completing a master's in international human rights law, Murtaza said he had the option to go into international criminal law but chose instead to focus on minority rights. He went on to work at international courts, think tanks and for the UN Special Rapporteur on minority issues before joining Ofcom’s online safety policy team in 2021.
“I'm actually more interested in preventing hate, and the online space is where it's happening. If we can limit or deal with hate in the online world then we can have an impact on the real world.”
Report following the Buffalo terrorist attack
Murtaza recently led on Ofcom’s report examining how online platforms responded to the livestreamed terrorist attack carried out in Buffalo, New York on 14 May 2022. As the regulator for video-sharing platforms (VSPs), we sought to find out what tech firms can learn from the tragic incident.
Murtaza’s team, which included Laura Nettleton, Ciaran Cartmell, Danny Morris and Meerah Nakaayi, identified a number of lessons platforms can learn to better protect users from terrorist and hateful content – and help prevent such content from spreading across multiple online platforms.
“There is scope for platforms to put more collective effort into ensuring their services are sufficiently robust against exploitation by terrorists, particularly by embedding user-safety considerations early in their product design and engineering processes,” he said.
Murtaza and his team carry out other important work, including building Ofcom’s regulatory approach to illegal harms, hate and terrorism on VSPs and preparing for online safety regulation. The insights from Ofcom’s review into the Buffalo incident will be taken forward as we prepare to take on these new duties – including the development of Codes of Practice relating to illegal content and associated risk assessment guidance.
Challenges in tackling online hate
Tackling harms in a changing online landscape comes with challenges. According to Murtaza, one of these is that hate spans a wide spectrum of severity and can emanate from so many different sources.
“The other challenge is that when it comes to defining and identifying hate, it's very much subject to context and can’t always be identified by AI systems. There are also a number of online harms, such as disinformation and hate, that happen on a large scale, whereas terrorist content online is intrinsically severe but doesn’t spread to the same extent on public forums. So that makes it very challenging.”
So how do we respond to that?
“We recruit specialists across different areas we want to keep abreast of, so we have a team of people with a background in these sorts of topics. For example, both Laura and Danny joined us with a background in counterterrorism. The Buffalo report also demonstrated how we work with networks of experts in the UK and globally.
“Our priority is to continue to engage with stakeholders that represent some of the communities most impacted by online harm, and to work with platforms to improve their policies and systems to protect people online.
“Ultimately, we can’t lose sight of the fact that online safety is meant to keep children safer, and it’s meant to protect people from all sorts of harm – including threats to people’s lives and terrorism. Online safety is everyone’s safety.”