Quick guide to online safety codes of practice

09 November 2023

Under the Online Safety Act, providers of online services have new duties to keep people safe from harm. One way they can do that is to adopt the safety measures in Ofcom's codes of practice.

We're currently consulting on our draft codes for illegal harms. This page gives a quick introduction to the measures we have proposed for our first codes.

You don’t need to do anything right now, but we’ve suggested some steps to help you get ready.

We are consulting on our proposals, so this information could change

This page:

  • summarises proposals we are consulting on now – we will update this information when final documents are in place;
  • is only meant to introduce your online safety duties – our draft codes of practice set out our safety measures in full.

What you can do now

Subscribe to email updates from us. We'll send you the latest information about how we regulate. This includes any important changes to what you need to do. You'll also be the first to know about our new publications and research.

Remove illegal content, manage risks, update your terms

The safety duties for illegal content focus on keeping people safe online. They are about making sure you have the right measures in place to protect people from harm that could take place on your service.

If you have a user-to-user service, it means you will need to:

  • take proportionate steps to prevent your users encountering illegal content;
  • mitigate and manage the risk of offences taking place through your service;
  • mitigate and manage the risks identified in your illegal content risk assessment;
  • remove illegal content when you become aware of it, and minimise the time it is present on your service; and
  • explain how you’ll do this in your terms of service.

If you have a search service, it means you'll need to:

  • take proportionate steps to minimise the risk of your users encountering illegal content via search results;
  • mitigate and manage the risks identified in your illegal content risk assessment; and
  • explain how you’ll do this in a publicly available statement.

You can decide for yourself how to meet the specific legal duties, but one way to comply is to use the measures set out in Ofcom’s codes.

Our draft codes for illegal content set out a range of measures in areas including content moderation, complaints, user access, design features to support users, and the governance and management of online safety risks.

Some measures are targeted at addressing the risk of certain kinds of offences, such as child sexual abuse material (CSAM), grooming and fraud. Other measures help to address a variety of offences.

What you can do now

Get more familiar with online harms and what makes them more likely by reading our draft risk profiles (page 52) (PDF, 764.8 KB).

Our codes of practice set out a range of measures that apply to different services

The Act is clear that the safety measures you need to put in place should be proportionate. Different measures in the draft codes would apply to different services based on:

  • the type of service you provide (user-to-user or search);
  • the features of your service;
  • the number of users your service has; and
  • the results of your illegal content risk assessment.

Some measures apply to all services

These include having someone responsible for online safety compliance and ensuring your terms of service (or publicly available statements) are easy to find.

What you can do now

Take a look at our proposed safety measures (PDF, 463.4 KB) and who they apply to.

Some measures apply to large services

Certain measures would apply to larger and/or higher risk services, such as using specific automated tools to detect content which could lead to fraud or the sharing of child sexual abuse material.

In our draft codes, we have defined a large service as a service which has an average user base of 7 million or more per month in the UK. This is equivalent to approximately 10% of the UK population.

These services may need to put in place more measures, such as providing training for staff working in content moderation.

This is because, generally, large services putting in place these measures will have the most benefits for users – so it’s proportionate to ask them to do more.

What you can do now

If you don’t know it already, calculate the number of monthly UK users for your service.

Other measures apply to services that are medium or high risk

When you complete your illegal content risk assessment, we’ll ask you to decide if you’re low, medium, or high risk for each kind of illegal harm. This rating needs to be as accurate as possible, and we’ve provided draft guidance on how to do it.

Once you’ve assessed each risk, different measures apply to lower risk and higher risk services:

  • If your service is low risk for all harms, we propose to call it a ‘low risk service’ and the minimum number of measures will apply.
  • If your service is medium or high risk for one harm, we propose to call it a ‘single-risk’ service, and more measures may apply.
  • If your service is medium or high risk for two or more harms, we propose to define it as a ‘multi-risk’ service, and further measures may apply.

Some safety measures are focused on specific harms (like hate and harassment offences). These would only apply to services that are medium or high risk for those harms.

What you can do now

Read our quick guide to online safety risk assessments, which introduces our proposed guidance.

If you have views on our proposals, please share them

You can read our draft codes in full and respond to our consultation. If you have views on our proposals, we'd love to hear from you.

In future, we’ll also be consulting on our proposed approach to children’s safety codes.
