Check how to comply with the protection of children rules

Published: 21 May 2025

Before starting your children’s risk assessment, you must have completed a children’s access assessment.

If your children’s access assessment has established that your service is likely to be accessed by children, you will need to complete a children’s risk assessment before 24 July 2025.

Our interactive tool will help providers of user-to-user and search services to understand how to comply with the children’s risk assessment duties. 

Start now

Using the tool to help you comply

Based on your answers to the questions in the tool, we will provide compliance recommendations for your service. When you start using the tool, you'll be given a unique reference code so you can save your progress and return at any time.

The tool can be used alongside our protection of children duties record-keeping template (ODT, 50 KB) to help you record the necessary information.

Meeting the legal requirements remains your responsibility

Using the tool does not guarantee that you are compliant with the Online Safety Act. While Ofcom provides guidance and recommendations, we do not have access to the evidence about risks on your service to ensure you assess them correctly. You are responsible for implementing the safety measures required, meeting your legal duties and keeping the appropriate records. You should seek independent specialist advice if you need it.

It is up to you to decide how you meet your legal duties under the Online Safety Act, but this must include keeping the records required by the law. Most services do not need to send their records to Ofcom but should be aware that we can ask for them at any time. 

Relevance of the illegal content risk assessment

The children’s risk assessment duty is separate from the illegal content risk assessment duty.

If you are in scope of the Act, you must complete an illegal content risk assessment; if your service is also likely to be accessed by children, you must complete a children’s risk assessment as well. You must carry out the risk assessments for both duties and hold separate records for each.

Each risk assessment will lead you to identify different Codes measures to reduce the risk of illegal content or of content harmful to children.

When considering evidence and making assessments, you should note that some categories of content harmful to children may be related to or overlap with some categories of illegal content. For example, certain content that promotes, encourages or provides instructions for suicide may, in addition to being primary priority content, also amount to priority illegal content (encouraging or assisting suicide) for the purposes of the illegal harms risk assessment.

Similarly, some characteristics of the service that affect the level of risk of illegal harm (such as user base, functionalities, and the ways in which a service is used) are also likely to be relevant for your children’s risk assessment. Should you conduct risk assessments concurrently, you still need to ensure that the illegal content risk assessment and the children’s risk assessment are distinct and clearly identifiable.

Finally, the findings of your illegal content risk assessment may inform your children’s risk assessment – for example, evidence on illegal hate may also support your assessment of content that incites hatred for the children’s risk assessment.

Keeping your data secure

To give you the most relevant information, we will ask you questions about the features of your service, your users, and the risk levels you have assigned to content harmful to children.

We will store this information but it will not be attributed to you or your service. We will use it to understand more about services’ online safety practices in aggregate and to improve our tools in the future.

In future, we may collect identifiable information about online services to send notifications or allow you to access additional resources, but this would always be optional when using this service. 

The legal obligations this tool can help you with

These are:

  • the children’s risk assessment duties in sections 11 and 28 of the Online Safety Act 2023
  • the children’s safety duties in sections 12 and 29, and the content reporting and complaints duties in sections 20-21 and 31-32
  • the record-keeping and review duties in sections 23 and 34 of the Act

For more detailed information, you can refer to official documents setting out our policies.

The law requires that your children’s risk assessment is “suitable and sufficient”. Our guidance explains that this means:

  • your risk assessment must include all the elements of a children’s risk assessment specified in the Act (section 11 for user-to-user services and section 28 for search services)
  • it should be specific to your service and reflect the risks accurately

The specific elements of the risk assessment are set out in sections 11 and 28 of the Online Safety Act 2023. These include your obligation to:

  • Assess the risk of children encountering each kind of primary priority content, each kind of priority content, and non-designated content
  • Take into account Ofcom’s Children’s Risk Profiles
  • Consider the characteristics of your service: its user base (e.g., user numbers, different age groups of children, languages, groups at risk, and groups increasing risk), functionalities (including those that enable adults to search for or contact other users, including children), algorithmic systems (and how easily, quickly and widely they disseminate content) and the business model
  • Consider any other relevant aspects of your service’s design and operation, including any existing controls to mitigate harm such as governance, use of proactive technology, measures to promote users’ media literacy and safe use of your service, and other systems and processes which could affect the level of risk to children
  • Consider how your service is used – for example, both the intended and unintended ways that children may use the service, and functionalities and features that affect how much children use the service

If we suspect that you have failed to carry out a suitable and sufficient children’s risk assessment, then we are able to take enforcement action. Any decision we take regarding enforcement action would be made in line with our Online Safety Enforcement Guidance.

If we decide to open an investigation and find that your service has failed to comply with its duties, we may impose a penalty of up to 10% of qualifying worldwide revenue or £18 million (whichever is the greater) and require remedial action to be taken. 
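For example, a provider with qualifying worldwide revenue of £500 million could face a penalty of up to £50 million (10% of revenue), while a provider with qualifying worldwide revenue of £100 million could face a penalty of up to £18 million, because 10% of its revenue would be less than that figure.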

Based on the findings of your children’s risk assessment, you will need to put in place the appropriate safety measures for your service to comply with the children’s safety duties. 

If you provide a user-to-user service, the safety duties broadly require you to take proportionate measures to:

  • Prevent children of any age from encountering pornography, suicide, self-harm, and eating disorder content (primary priority content). If your service does not prohibit one or more kinds of primary priority content for all users, this involves using highly effective age assurance to prevent children from encountering such content where it is identified on your service.
  • Protect children in age groups judged to be at risk of harm from other harmful content from encountering it. This includes content that is abusive or incites hatred, bullying content, violent content, and content which encourages, promotes, or provides instructions for dangerous stunts and challenges, and self-administering harmful substances (priority content), as well as other types of content that present a material risk of significant harm to an appreciable number of children in the UK (non-designated content).
  • Mitigate and manage the risks of harm to children in different age groups identified in your children’s risk assessment.
  • Mitigate the impact of harm to children in different age groups presented by content that is harmful to children.
  • Explain how you’ll do this in your terms of service.
  • Allow people to easily report content that is harmful to children and operate a complaints procedure.

If you provide a search service, the safety duties broadly require you to:

  • Minimise the risk of children of any age encountering the search content most harmful to children, namely pornography, suicide, self-harm, and eating disorder content (primary priority content).
  • Minimise the risk of children in age groups judged to be at risk of harm from other harmful content encountering it. This includes content that is abusive or incites hatred, bullying content, violent content and content which encourages, promotes, or provides instructions for dangerous stunts and challenges, and self-administering harmful substances (priority content), as well as other types of content that present a material risk of significant harm to an appreciable number of children in the UK (non-designated content).
  • Mitigate and manage the risks of harm to children in different age groups identified in your children’s risk assessment.
  • Mitigate the impact of harm to children in different age groups presented by content that is harmful to children.
  • Explain how you’ll do this in a publicly available statement.
  • Allow people to easily report content that is harmful to children and operate a complaints procedure.

You can decide for yourself how to meet the specific legal duties, but one way to comply is to use the measures set out in Ofcom’s Codes. 

Our Codes set out a range of measures in areas including content moderation, complaints, user access, design features to support users, and the governance and management of online safety risks. 

This tool will help you decide which of these apply to your service. 
