Guide for services: complying with the Online Safety Act

Published: 27 February 2024
Last updated: 25 February 2025

The Online Safety Act makes businesses, and anyone else operating a wide range of online services, legally responsible for keeping people (especially children) safe online in the UK.

Who the new rules apply to

The new rules apply to all in-scope services that have a significant number of users in the UK, or that target the UK market, no matter where they are based.

The rules apply to services made available over the internet (or ‘online services’). This could be a website, an app or another type of platform. If you or your business provide an online service, the rules may apply to you.

Specifically, the rules cover services where:

  • people may encounter content (like images, videos, messages or comments) that has been generated, uploaded or shared by other users. Among other things, this includes private messaging, and services that allow users to upload, generate or share pornographic content. The Act calls these ‘user-to-user services’;
  • people can search other websites or databases (‘search services’); or
  • you or your business publish or display pornographic content.

To give a few examples, a ‘user-to-user’ service could be:

  • a social media site or app;
  • a photo- or video-sharing service;
  • a chat or instant messaging service, like a dating app; or
  • an online or mobile gaming service.

The rules apply to organisations big and small, from large and well-resourced companies to very small ‘micro-businesses’. They also apply to individuals who run an online service.

It doesn’t matter where you or your business is based. The new rules will apply to you (or your business) if the service you provide has a significant number of users in the UK, or if the UK is a target market.

Check if the Online Safety Act applies to you

Use our tool to find out if the rules are likely to apply to you, and what to do next.

Start now

Check how to comply with the illegal content rules

If the Online Safety Act applies to you, you will need to complete an illegal content risk assessment by 16 March 2025.

Use our tool to help you to complete a risk assessment and comply with your safety obligations.

Start now

Comply with the protection of children rules

Services likely to be accessed by children will be required to carry out children’s risk assessments from 24 April 2025. If you provide a service now, the deadline to complete your first children’s risk assessment is 24 July 2025.
 
We will be launching a tool to help you complete your children’s risk assessment and comply with safety obligations.

Comply with rules about online pornography

If you or your business has an online service that hosts pornographic content, there are rules you will need to follow to prevent children from accessing it.

Subscribe for updates about online safety

Subscribe for updates on any changes to the regulations and what you need to do.

Other important things you should know

Ofcom is the regulator for online safety. We have a range of powers and duties to implement the new rules and ensure people are better protected online. We have published our overall approach and the outcomes we want to achieve.

The Act expects us to help services follow the rules – including by providing guidance and codes of practice. These will help you understand how harm can take place online, what factors increase the risks, how you should assess these risks, and what measures you should take in response. We will consult on everything we’re required to produce before we publish the final version.

We want to work with you to keep adults and children safe. We’ll provide guidance and resources to help you meet your new duties. These will include particular support for small to medium-sized enterprises (SMEs).

But we will take enforcement action if we determine that a business is not meeting its duties – for example, if it isn’t doing enough to protect users from harm.

We have a range of enforcement powers to use in different situations: we will always use them in a proportionate, evidence-based and targeted way. We can direct businesses to take specific steps to come into compliance. We can also fine companies up to £18m, or 10% of their qualifying worldwide revenue (whichever is greater).
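
To illustrate the ‘whichever is greater’ calculation, here is a minimal sketch in Python. The £500m revenue figure is hypothetical, and the function name is ours for illustration, not Ofcom’s.

```python
def max_fine_gbp(qualifying_worldwide_revenue_gbp: float) -> float:
    """Upper bound on a fine: the greater of GBP 18m and 10% of
    qualifying worldwide revenue."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)

# Hypothetical example: a company with GBP 500m qualifying worldwide
# revenue. 10% of GBP 500m is GBP 50m, which exceeds GBP 18m, so the
# maximum fine is GBP 50m.
print(f"£{max_fine_gbp(500_000_000):,.0f}")  # £50,000,000
```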

In the most severe cases, we can seek a court order imposing ‘business disruption measures’. This could mean asking a payment or advertising provider to withdraw from the business’s service, or asking an internet service provider to limit access.

We can also use our enforcement powers if you fail to respond to a request for information.

You can find more information about our enforcement powers, and how we plan to use them, in our draft enforcement guidance.

The new rules cover any kind of illegal content that can appear online, but the Act includes a list of specific offences that you should consider. These are:

  1. terrorism offences;
  2. child sexual exploitation and abuse (CSEA) offences, including grooming and child sexual abuse material (CSAM);
  3. encouraging or assisting suicide (or attempted suicide) or serious self-harm offences;
  4. harassment, stalking, threats and abuse offences;
  5. hate offences;
  6. controlling or coercive behaviour (CCB) offence;
  7. drugs and psychoactive substances offences;
  8. firearms and other weapons offences;
  9. unlawful immigration and human trafficking offences;
  10. sexual exploitation of adults offence;
  11. extreme pornography offence;
  12. intimate image abuse offences;
  13. proceeds of crime offences;
  14. fraud and financial services offences; and
  15. foreign interference offence (FIO).

Our consultation includes our proposed guidance for these offences, while our draft register of risks (PDF, 3.2 MB), organised by each kind of offence, looks at the causes and impact of illegal harm online.

All user-to-user services and search services will need to:

  • carry out an illegal content risk assessment – we will provide guidance to help you do this;
  • meet your safety duties on illegal content – this includes removing illegal content, taking proportionate steps to prevent your users encountering it, and managing the risks identified in your risk assessment – our codes of practice will help you do this;
  • record in writing how you are meeting these duties – we will provide guidance to help you do this;
  • explain your approach in your terms of service (or publicly available statement); and
  • allow your users to report illegal harm and submit complaints (see the sketch after this list).
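
As a purely illustrative sketch of the last duty above, the Python below shows one possible shape for recording user reports of suspected illegal content. The categories and field names are our assumptions, not requirements of the Act or of Ofcom’s codes.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from uuid import uuid4

# Hypothetical reporting categories; the Act's list of priority
# offences (see the numbered list above) is longer and more precise.
REPORT_CATEGORIES = {"terrorism", "csea", "fraud", "harassment", "other_illegal"}

@dataclass
class UserReport:
    """One user report of suspected illegal content, retained for audit."""
    content_id: str
    category: str
    details: str
    report_id: str = field(default_factory=lambda: uuid4().hex)
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def submit_report(content_id: str, category: str, details: str) -> UserReport:
    """Accept a user report, validating the category before it is stored."""
    if category not in REPORT_CATEGORIES:
        raise ValueError(f"unknown report category: {category!r}")
    report = UserReport(content_id, category, details)
    # A real service would queue the report for moderation review and
    # keep it as part of the written record of how duties are met.
    return report
```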

One way to protect the public and meet your safety duties is to adopt the safety measures we set out in our codes of practice. We’re currently consulting on our draft codes for illegal harms. The draft codes cover a range of measures in areas like content moderation, complaints, user access, design features to support users, and the governance and management of online safety risks.

In our draft codes, we have carefully considered which services each measure should apply to, with some measures only applying to large and/or risky services. The measures in our codes are only recommendations, so you can choose alternatives. But if you do adopt all the recommended measures that are relevant to you, then you will be meeting your safety duties.

If the rules apply to your service, then you will need to protect children from harm online. Some of these responsibilities are covered under the illegal content duties – such as tackling the risk of child sexual exploitation and abuse offences, including grooming and child sexual abuse material.

Some types of harmful content (which aren’t illegal) are covered by the children’s safety duties. These only apply if your online service can be accessed by children.

The Act specifies many types of content that are harmful to children, including:

  • pornographic content;
  • content which encourages, promotes or provides instruction for:
    • suicide;
    • deliberate self-injury;
    • eating disorders;
    • an act of serious violence against a person;
    • a challenge or stunt highly likely to result in serious injury to the person who does it, or to someone else;
  • content that is abusive or incites hatred towards people based on characteristics of race, religion, sex, sexual orientation, disability or gender reassignment;
  • bullying content;
  • content that depicts real or realistic serious violence against a person, or serious injury of a person;
  • content that depicts real or realistic serious violence against an animal, or serious injury of an animal; and
  • content that encourages a person to ingest, inject, inhale or in any other way self-administer a physically harmful substance, or a substance in such a quantity as to be physically harmful.

The rules require you to treat different kinds of content in different ways, and we will explain this in our future guidance.

Carry out a children’s access assessment

If you provide a user-to-user service or search service, you’ll need to assess whether or not children can access it. If it’s likely they can access your service, then the children’s safety duties will apply.

This is a formal assessment, and Ofcom will provide guidance on how to carry it out. We expect to consult on our proposed approach in Spring 2024. You will need to carry out your first assessment once we have published our final guidance.

Assess the risks and meet your safety duties

If your assessment finds that the children’s safety duties apply, then you’ll need to:

  • carry out a children’s risk assessment which addresses how content that’s harmful to children could be encountered on your service – we will provide guidance to help you do this;
  • fulfil your children’s safety duties – these include preventing children from encountering certain kinds of harmful content, other measures to protect them from harm, and managing the risks identified in your risk assessment – our children's codes of practice will help you do this;
  • record how you are meeting these responsibilities in writing – we will provide guidance to help you do this;
  • explain your approach in your terms of service (or publicly available statement); and
  • allow your users to report harmful content and submit complaints.

One way to protect the public and meet your safety duties is to adopt the measures we set out in our children’s codes of practice. These will be specific measures for protecting children.

We expect to consult on our draft children’s codes of practice in Spring 2024, with the legal duties coming into force once we have finalised our approach.

Under the rules, a very small number of online services will be designated as 'categorised services'. These services will have additional duties to meet.

A service will be categorised according to the number of people who use it, and the features it has. Depending on whether your service is designated as Category 1, 2A or 2B, you might be expected to:

  • produce transparency reports about your online safety measures;
  • provide user empowerment tools – including giving adult users more control over certain types of content, and offering adult users the option to verify their identity;
  • operate in line with your terms of service;
  • protect certain types of journalistic content; and/or
  • prevent fraudulent advertising.

In July 2023, we invited evidence to inform our approach to categorisation. We will advise the Government on the thresholds for these categories, so it can then make laws on categorisation. We expect to publish a register of these categorised services in late 2024. For more information, see our roadmap.

When implementing safety measures and policies – including on illegal harm and the protection of children – you will need to consider the importance of protecting users’ privacy and freedom of expression.

Ofcom will consider any risks to these rights when preparing our codes of practice and other guidance, and include appropriate safeguards.
