One in three video-sharing users find hate speech

24 March 2021

A third of people who use online video-sharing services have come across hateful content in the past three months, according to new Ofcom research.

The news comes as we propose new guidance for sites and apps known as video-sharing platforms (VSPs), setting out practical steps to protect users from harmful material.

VSPs are a type of online video service where users can upload and share videos with other members of the public. They allow people to engage with a wide range of content and social features.

Under laws introduced by Parliament last year, VSPs established in the UK must take measures to protect under-18s from potentially harmful video content, and to protect all users from videos likely to incite violence or hatred, as well as certain types of criminal content. It is our job to enforce these rules and hold VSPs to account.

Today’s guidance is designed to help these companies understand what is expected of them under the new rules, and to explain how they might meet their obligations in protecting users from harm.

Harmful experiences uncovered

We researched how people in the UK use VSPs, and their exposure to potentially harmful content. Our major findings are:

  • Hateful content. A third of users say they have witnessed or experienced hateful content. Hateful content was most often directed towards a racial group, followed by religious groups, transgender people and those of a particular sexual orientation.
  • Bullying, abuse and violence. A quarter of users said they had been exposed to bullying, abusive behaviour and threats, and the same proportion had seen violent or disturbing content.
  • Racist content. One in five users say they witnessed or experienced racist content, with levels of exposure higher among users from minority ethnic backgrounds compared to users from a white background.
  • Barriers to reporting. 31% of users will not report potential harm because they don't think it will make a difference. 30% feel that they are not directly affected, while 25% say they did not know what to do or who to report it to.
  • Most users encounter potentially harmful videos of some sort. Seven in ten VSP users say they have been exposed to a potentially harmful experience in the past three months, rising to almost eight in ten among 13- to 17-year-olds.
  • Low awareness of safety measures. Six in ten VSP users are unaware of platforms’ safety and protection measures, while only a quarter have ever flagged or reported harmful content.

We have also published two research reports from leading academics: one from The Alan Turing Institute, covering online hate; and one from the Institute of Connected Communities at the University of East London, on protecting young people online.

Guidance for protecting users

As we begin our new role regulating video-sharing platforms, we recognise that the online world is different to other regulated sectors. Reflecting the nature of video-sharing platforms, the new laws in this area do not set standards for content. Instead, they focus on measures providers must consider taking to protect their users – and companies can be flexible in how they do this.

The massive volume of online content means it is impossible to prevent every instance of harm. Instead, we expect VSPs to take active measures against harmful material on their platforms. This new guidance is designed to help them in making judgements about how best to protect their users. In line with the legislation, our guidance proposes what all video-sharing platforms should provide:

  • Clear rules around uploading content. VSPs should have clear, visible terms and conditions which prohibit users from uploading the types of harmful content set out in law. These should be enforced effectively.
  • Easy flagging and complaints for users. Companies should use tools that allow users to quickly and effectively report or flag harmful videos. They should also be clear about how quickly they will respond, and be transparent about any action taken. Providers should offer a route for users to formally raise issues or concerns, and to challenge decisions through dispute resolution. This is vital to protect the rights and interests of users who upload and share content.
  • Restricting access to adult sites. VSPs with a high prevalence of pornographic material should put in place effective age-verification systems to restrict under-18s’ access to these sites and apps.

Enforcing the rules

Ofcom’s approach to enforcing the new rules will build on our track record of protecting audiences from harm, while upholding freedom of expression. We will consider the unique characteristics of user-generated video content, alongside the rights and interests of users and service providers, and the general public interest.

If we find a VSP provider has not taken appropriate measures to protect users, we have the power to investigate and take action against it. This could include fines, requiring the provider to take specific action, or – in the most serious cases – suspending or restricting the service. Consistent with our general approach to enforcement, we may, where appropriate, seek to resolve or investigate issues informally first, before taking any formal enforcement action.

Sharing videos has never been more popular, something we’ve seen among family and friends during the pandemic. But this type of online content is not without risk, and many people report coming across hateful and potentially harmful material.

Although video services are making progress in protecting users, there’s much further to go. We’re setting out how companies should work with us to get their houses in order – giving children and other users the protection they need, while maintaining freedom of expression.

Kevin Bakhurst, Ofcom’s Group Director for Broadcasting and Online Content