2025 is the year of action for online services, as new online safety regulations come into force in the UK for the first time. This is the second edition of Ofcom’s online safety industry bulletin, but the first since two major parts of the Online Safety Act became legal obligations: the new rules on illegal content and on protecting children online.
As a result, we are now seeing change in industry gather pace. A wide range of online services have completed their first risk assessments, with Ofcom now analysing more than 60 from large and higher-risk services. Publishers of pornography have committed to introduce highly effective age assurance across over 1,300 sites. And tech firms have started to implement new safety measures to protect UK users – from introducing teen accounts, adding in-app warnings and changing default settings for under-18s, to joining the Internet Watch Foundation to scan for known child sexual abuse material.
However, this is only the beginning of the progress we need to see, especially where risks are greatest, and we have launched our first investigations and enforcement programmes to accelerate change.
To support regulated organisations in completing the next steps – the children’s risk assessment and implementing children’s safety measures – we have published a new digital toolkit providing step-by-step guidance, compliance checklists, and tailored recommendations.
Find out more and ask your questions at our second Online Safety Act explained online event on 4 June – register your place.
What's in this bulletin
Taking action on online safety
Online services complete first risk assessments
We welcome the positive responses we’ve seen from regulated organisations to the new illegal content rules. Online services have completed the first wave of online safety risk assessments, and Ofcom is now carrying out its first review of the completed assessments.
As a result of these rules, for the first time services have had to properly consider and record the risks they pose to UK users, identify the steps needed to mitigate those risks, and set up appropriate governance mechanisms to oversee them – underpinning a safety by design approach.
Deadline passed: illegal content risk assessments
16 March was the deadline for services to complete their first illegal content risk assessments. To date, Ofcom has requested and received over 60 risk assessments, including from a range of large services and smaller services posing particular risks. These records will enable us to monitor compliance and drive improvements across industry in how risks are identified and managed. We will share insights and trends to support understanding of good practice, and identify areas of necessary improvement, later in the year.
Online service providers should also have completed their first children’s access assessment by 16 April. Services likely to be accessed by children now have until 24 July to complete their children’s risk assessments and keep a record of them. Service providers will be required to share this record with Ofcom if we request it, and we will shortly be in touch with providers whose children’s risk assessments we wish to review.
Coming in future: risk assessment duties for categorised services
Once Ofcom has established and published the register of categorised services, those services categorised as Category 1 or 2A will have additional duties relating to their illegal content and children’s risk assessments. They will need to publish a summary of the findings of their most recent risk assessments – including the risk levels and the nature and severity of harm on their service – and provide Ofcom with the full record of their assessment.
Pornography sites start rolling out age assurance
Providers of online pornography are implementing highly effective age assurance across thousands of sites, in response to Ofcom’s supervisory engagement and enforcement programme in this area.
Following publication of our Protection of Children Statement, we have written to hundreds of dedicated online pornography services to inform them about the requirement to implement age checks. The deadline for services that allow user-generated content (such as tube, cam or fan sites) is 25 July 2025. Many services have written back to us outlining their plans to achieve compliance on or before the deadline, and we expect to see these plans implemented, and other pornography services taking action, over the coming weeks.
This follows hundreds of advisory letters sent out earlier in the year to publishers of pornography (such as studios or paysites) who should already be taking steps to introduce age checks. Around 40 publishers have provided details of age assurance methods they are planning on implementing on around 1,300 sites, following Ofcom’s request for their plans. Plans include (but are not limited to) facial age estimation, credit card payment walls and safe for work landing pages. Many services are using a combination of techniques to achieve reliable and robust protection.
However, there are some services that have failed to respond to Ofcom’s requests and do not appear to have taken any steps to implement highly effective age assurance. Therefore, we have decided to open two investigations (see below).
- Read our letter to these user-to-user services that allow pornography, sent in April.
- Read our advisory letter to publishers of pornography, which are typically porn studios and paysites, sent in January.
- Our Adults Only page provides more information for services that allow pornography. Adult businesses can also contact pornsupervision@ofcom.org.uk for more information.
Ofcom begins enforcing the rules
While many services are taking steps to protect users and comply with their new legal obligations, Ofcom is committed to acting where services fail to do so and consequently pose risks to UK users.
First investigation under illegal content rules
In April, we opened an investigation into the provider of an online suicide discussion forum. We are investigating whether the service provider has failed to comply with illegal content risk assessment duties and put appropriate safety measures in place to protect its UK users from illegal content and activity, among other potential infringements of the Act.
This marked our first investigation under the Online Safety Act and follows an information notice we sent to the service provider under our risk assessment enforcement programme (see below).
Three live enforcement programmes
We have opened a number of enforcement programmes to accelerate change in industry. Generally, an enforcement programme aims to examine a problem or concern that relates to a particular group of firms, or to a whole sector, and/or to monitor compliance with their legal duties. Enforcement programmes can lead to further investigations if we have concerns that a service provider may not be meeting its duties under the Act.
Pornographic content
- In January 2025, we opened an enforcement programme to protect children from encountering pornographic content through the use of highly effective age assurance.
- We have since opened investigations into Itai Tech Ltd – a service which runs the nudification site Undress.cc – and Score Internet Group LLC, which runs the site Scoreland.com. Both sites appear to have no highly effective age assurance in place and are therefore potentially in breach of their duties to prevent children from accessing pornography.
Child sexual abuse material
- In March 2025, we started an enforcement programme into measures being taken by file-sharing and file-storage services to prevent users from encountering or sharing child sexual abuse material (CSAM).
Illegal content risk assessments
- Also in March 2025, we launched an enforcement programme to monitor compliance with the illegal content risk assessment duties and record keeping duties, requesting over 60 risk assessments from a range of large services and smaller services posing particular risks.
- This month, we opened two investigations into Kick Online Entertainment SA in respect of the user-to-user service Motherless.com. We are investigating whether Kick has failed to comply with its statutory duties in relation to its illegal content risk assessment, and whether it has complied with a statutory information request.
Details of new investigations will be published on our website.
Action required by 24 July – children’s rules now in force
Online services likely to be accessed by children have three months, until 24 July, to assess the risk they pose to children and decide which steps they need to take to manage those risks. This follows the publication of our second set of Codes and guidance under the Online Safety Act on 24 April.
From 25 July, service providers will need to take the steps set down in the Protection of Children Codes of Practice or use other effective measures to protect UK children. As noted above, by this date all services which allow pornography must have highly effective age assurance in place to prevent children from accessing it.
These duties will have a big impact on the online lives of children in the UK, with significant changes required to protect them from serious harms that have become normalised and routine.
The Codes are made up of over 40 practical measures. Some apply to all services, while others depend on the type of service, its functionalities and other characteristics, its number of users, and the outcome of its children’s risk assessment. The measures cover areas including robust age checks, safer algorithms, effective content moderation, user reporting and complaints systems, clear terms of service, and tools and support for children. We will be looking for evidence that providers are putting managing the risk of harm to children at the heart of their decision-making and governance.
- The deadline for services’ children’s access assessment – to determine if these rules apply – was 16 April.
- Read our Protection of Children policy statement.
- Use the digital toolkit to find out how to comply.
What you need to know and do
Upcoming obligations and deadlines
Date | Action | Who? | Legal requirement? | Useful links
---|---|---|---|---
24 July 2025 | Complete children’s risk assessment | Services likely to be accessed by children | Yes | Quick guide to children's risk assessments
25 July 2025 | Comply with the children’s safety duties from this date – use measures set out in the Codes or effective alternative measures | Services likely to be accessed by children | Yes* | Quick guide to protection of children codes
25 July 2025 | Implement highly effective age assurance to prevent children from accessing pornography | Services that allow pornographic content | Yes* |
On or shortly after 25 July 2025** | Repeal of the UK’s VSP regulations; end of transition period | Services in scope of the UK’s VSP regulations, which will then be subject to the Online Safety Act in full | Yes |
* Subject to the draft Protection of Children Codes completing the Parliamentary process.
** Date to be set by Government.
Amending Illegal Content Codes of Practice – consultation closes 22 July 2025
We are consulting on amendments to our Illegal Content Codes of Practice to expand the application of measures relating to the blocking and muting of user accounts and disabling of comments. The proposals would bring providers of certain smaller user-to-user services that are likely to be accessed by children into scope of these measures where they have relevant risks and functionalities. You can review the proposals and respond to the consultation by 22 July 2025.
Resources and support
Protection of Children digital toolkit launched
Our interactive toolkit can help providers of user-to-user and search services understand how to comply with the Protection of Children rules. Based on your answers to questions, it provides tailored recommendations, as well as step-by-step guidance and compliance checklists.
Start using the toolkit today.
4 June Online Safety Explained industry conference – Protection of Children
On 4 June we will be holding our second OSA Explained conference – this time focusing on the Protection of Children duties. This event will guide you through the new legal duties to protect under-18s online, how to assess risk of harm, and the steps to take to keep them safe. This event will be virtual, with sessions running 10:00-17:25.
Find more details and register your place.
What’s coming up from Ofcom
Next steps on our regulatory roadmap
Additional Safety Measures Consultation – coming in June 2025
In June, we will be publishing our consultation on additional measures for Codes of Practice (illegal harms and protection of children). We encourage all regulated services to read the consultation and respond with your views.
Fees and penalties statement – coming in June 2025
Also in June, we will publish our statement setting out our final decisions to implement the fees regime, which will allow us to fulfil our statutory duty to fund our online safety regulatory work through fees collected from industry. This will also have consequences for the maximum penalty that we can levy on providers who breach their duties under the Act.
Further fees consultations – Q3 2025
In the coming months, we will publish two consultations to supplement the decisions in our statement and give service providers greater clarity on the fees regime. The Qualifying Worldwide Revenue (QWR) Guidance consultation will aid providers in the process of calculating their QWR to determine if they are eligible to pay fees. The Notification Guidance consultation will help providers to comply with their duty to notify us of their eligibility to pay fees. By the end of the year, we will also consult on our Statement of Charging Principles (SoCP), which will set out the principles that we propose to apply when determining the fees that providers will pay.
Super-complaints consultation – later in 2025
The Secretary of State for the Department of Science, Innovation and Technology (DSIT) has consulted on the criteria for an entity to be eligible to make a super-complaint, criteria for a super-complaint to be admissible, and procedural requirements for making a super-complaint. Once the necessary secondary legislation has been laid, we will consult on our guidance about super-complaints, which will include information on how an entity can verify its eligibility and the procedures for making a super-complaint.
Draft and final transparency notices to categorised services – later in 2025
Following publication of the register of categorised services, we will issue draft transparency notices to categorised services. Categorised services will need to respond to these transparency notices once finalised.
For more details on upcoming duties and consultations, see our important dates page.
Requests for information
As a reminder, we can request information from a wide range of organisations to help us fulfil our functions as online safety regulator. You can find a list of our planned information requests, the services affected and the timings in our previous industry bulletin.