Regulating video-sharing platforms: what you need to know
The Government introduced legislation in autumn 2020 giving Ofcom powers to regulate UK-established video-sharing platforms (VSPs). These powers came into force on 1 November 2020.
VSP regulation sets out to protect users of VSP services from specific types of harmful material in videos. This means protecting under-18s from potentially harmful material, and protecting all users from material inciting violence or hatred and from content constituting criminal offences relating to terrorism, child sexual abuse material, and racism and xenophobia. VSPs are also required to meet certain standards around advertising.
For more information on the regulations, read our guide (PDF, 237.9 KB).
Video-sharing platforms are a type of online video service. They allow users to upload and share videos with other people and engage with a wide range of content and social features.
Our job is to make sure VSPs which fall within our jurisdiction take appropriate measures to protect under-18s from potentially harmful content, and to protect all users from incitement to hatred and violence and from content constituting criminal offences relating to terrorism, child sexual abuse material, and racism and xenophobia.
We’ll be developing regulatory guidance on the risks of harm to users and the measures VSPs should take to mitigate them. We recently published a draft of this guidance for consultation.
You can read more about our approach in our short guide (PDF, 237.9 KB).
Ofcom has published a list of notified VSPs that we currently regulate.
VSPs must assess whether they fall within scope of the regulations and under UK jurisdiction; if they do, they must notify Ofcom. We have published guidance to help them do this.
The VSP framework derives from the revised EU Audiovisual Media Services (AVMS) Directive. In addition to considering UK jurisdiction under Ofcom’s guidance, providers may wish to refer to the AVMS Directive to determine whether their service may fall under the jurisdiction of a European Member State.
If VSPs break the rules, we can impose a financial penalty of up to 5% of their qualifying revenue or £250,000, whichever is greater.
No, Ofcom will not be moderating content. Freedom of expression is central to our democracy, values and modern society, and we do not have powers or duties to moderate content, such as removing individual videos. Our role is to make sure regulated services take appropriate steps to protect their users from harmful content, such as incitement to hatred and violence.
We’ll be making sure the measures VSPs adopt to protect users are appropriate and proportionate, taking into account the legitimate interests at stake. If the measures taken by platforms are found not to be effective in protecting users from harmful content, we’ll take action, including formal enforcement action when appropriate.
You can make a complaint about VSPs in scope of our regulation. Our role is to make sure providers have appropriate measures in place to protect users. Complaints from the public will help us identify potential compliance issues, but we do not resolve individual complaints.
You should always complain directly to the video-sharing platform in question if you have concerns about harmful content on the platform.
Regulation of UK-established video-sharing platforms will remain in place until the Government's proposed new Online Safety Bill becomes law and comes into force. The Government's broader online safety legislation is expected to apply to a much wider range of online services, including services which are not based in the UK. Last year, the Government announced that it intends to appoint Ofcom as the online safety regulator.
You can read more in the Government's draft Online Safety Bill.