Online safety rules: what you need to know
Ofcom is the regulator for online safety in the UK, under the Online Safety Act. Our job is to make sure online services, like sites and apps, meet their duties to protect their users.
Regulated services will have to follow various rules, including protecting users from illegal content and activity, and protecting children from harmful content and activity online. Some services might have other duties to meet, so that:
- people have more choice and control over what they see online;
- services are more transparent and can be held to account for their actions; and
- services protect freedom of expression.
We'll make sure that regulated services follow the rules
Our role is to make sure that regulated services take appropriate steps to protect their users. We don’t require companies to remove particular posts, images or videos, or to remove particular accounts. Our job is to build a safer life online by improving the systems companies use to prevent harm.
We will have a range of tools to make sure services follow the rules. After consulting on them, we will set codes of practice and give guidance on how services can comply with their duties. The new rules will come into force once the codes and guidance are approved by Parliament.
Under the new rules, we will have powers to take enforcement action, including issuing fines to services if they fail to comply with their duties. Our powers are not limited to service providers based in the UK.
In carrying out our regulatory responsibilities, we always take account of users’ rights to privacy and freedom of expression.
The rules are different for different types of online service
The sections below explain what rules apply to each type of service:
User-to-user services allow people to generate and share content for other people to see. They include:
- social media services;
- video-sharing services;
- private messaging services;
- online marketplaces;
- dating services;
- review services;
- file- and audio-sharing services;
- discussion forums;
- information-sharing services; and
- gaming services.
The online safety rules
Under the Online Safety Act, regulated user-to-user services will need to identify risks of harm to their users, and take steps to protect them from illegal content.
Services that children in the UK are likely to use will also need to identify risks to children and take appropriate steps to protect them from some types of harmful content. Providers based outside the UK might still have to follow our rules if they have links to the UK – that is, if their service has a significant number of users in the UK or where the UK is one of its target markets.
A search service is an online service that has a search engine, allowing you to search more than one website or database for information, websites, or other content.
There are two main types of search service: general search services and vertical search services.
- General search services allow you to search content from across the web.
- Vertical search services allow you to search for specific products or services offered by different companies, such as flights, credit cards or insurance.
The online safety rules
Under the Online Safety Act, regulated search services will need to identify risks of harm to their users, and take steps to protect them from illegal content.
Search services that children are likely to use will also need to identify risks to children and take appropriate steps to protect them from some types of harmful content. Providers that are based outside the UK might still have to follow our rules if they have links to the UK – that is, if their service has a significant number of users in the UK or where the UK is one of its target markets.
Video-sharing platforms (VSPs) are online services that allow users to upload and share videos with other people. Most VSPs, like YouTube and Instagram, will have to follow the new online safety rules, and will have the same duties as other user-to-user services. But some VSPs (those established in the UK) are already bound by separate rules. These include Twitch, TikTok and Snapchat.
There are specific legal criteria to determine whether a service has the required links to the UK to be a UK-regulated VSP. If a service meets these criteria, it must notify Ofcom. We have a list of notified platforms to which these rules apply.
In the future, UK VSPs will have to follow the same rules as other online services under the Online Safety Act.
VSPs must protect their users from harmful content
Our job is to make sure UK-established VSPs have in place appropriate measures to protect users from videos which:
- might impair the physical, mental or moral development of under-18s;
- are likely to incite violence or hatred against particular groups; and/or
- directly or indirectly encourage acts of terrorism;
- show or involve conduct that amounts to child sexual abuse; or
- show or involve conduct that incites racism or xenophobia.
They must also meet certain standards around advertising.
Appropriate measures may include: terms and conditions; reporting and flagging functions; viewer rating systems; age verification; parental control functions; complaints procedures; or media literacy tools and information.
We have the power to request information from VSPs on how they deal with harms and we can take enforcement action if they break the rules.
You can find more information about the rules that UK-established VSPs must follow in our guidance for providers (PDF, 237.9 KB).
Services with pornographic content
Services with pornographic content include online services that publish or display certain pornographic content in the form of videos, images or audio. They also include services that allow users to upload and share pornographic content which can be viewed by other users of the service. These services could be user-to-user sites and apps or video-sharing platforms.
The online safety rules
Under the Online Safety Act, services that publish or display pornography must have highly effective age assurance measures so that children cannot normally access pornography on their service. Providers that are based outside the UK might still have to follow our rules if they have links to the UK – that is, if their service has a significant number of users in the UK or where the UK is one of its target markets.
Services that allow users to upload and share pornographic content with other users might also have to follow the rules for user-to-user services or video-sharing platforms.
Report something you’ve seen online
Contact the service first
Many online services already provide ways for you to report harmful content or behaviour, or complain about something you’ve seen. If you have a problem with something you have seen or experienced on an online service, reporting directly to the service should be your first step.
If you have done that and remain concerned, you can tell Ofcom.
We cannot respond to or investigate individual complaints
While we can’t respond to your complaint, it will help us to assess whether regulated services are doing enough to protect their users – and if we should take any action.
You can complain to us about regulated online services. Regulated online services include:
- user-to-user services (that is, sites and apps that host user-generated content, like social media);
- search engines;
- services with pornographic content; and
- video-sharing platforms.
Call 999 in life-threatening emergencies
You should call the police when a crime is in progress, or when someone is in immediate danger.
Other sources of support
There are helplines and support services that can help you if you have seen illegal, harmful or upsetting content online.