6 July 2022

Ofcom calls on tech firms to start preparing for regulation now

  • Regulator to kick-start 100-day plan after Bill passes to get online safety regime up and running

Tech firms should start preparing now for new online safety rules, Ofcom says today, as we set out detailed plans for implementing the new laws.

The UK is preparing to become one of the first countries in the world to introduce comprehensive new laws aimed at making online users safer, while preserving freedom of expression. The Online Safety Bill will introduce rules for sites and apps such as social media, search engines and messaging platforms – as well as other services that people use to share content online.

Ofcom expects the Online Safety Bill to pass by early 2023 at the latest, with our powers coming into force two months later.[1]

Immediate action once powers kick in

Within the first 100 days of our powers taking effect, Ofcom will focus on getting the ‘first phase’ of the new regulation up and running - protecting users from illegal content harms, including child sexual exploitation and abuse, and terrorist content.[2] We will set out:

  • a draft Code of Practice on illegal content harms explaining how services can comply with their duties to tackle them; and
  • draft guidance on how we expect services to assess the risk of individuals coming across illegal content on their services and associated harms.

To help companies identify and understand the risks their users may face, we will also publish a sector-wide risk assessment. This will include risk profiles for different kinds of services that fall in scope of the regime. We will also consult on our draft enforcement guidelines, transparency reporting and record-keeping guidance.

We will consult publicly on all these documents and expect to finalise them in spring 2024. Within three months of that point, companies must have completed their risk assessments related to illegal content, and be ready to comply with their duties in this area from mid-2024 once the Code of Practice has been laid in Parliament.

We are ready and able to evolve our timelines and plans, should the timing or substance of the Bill change.

Early engagement with high-risk services

As well as expecting tech firms to engage as we consult, we will also identify high-risk services for closer supervision.[3] The companies that run these sites or apps must be ready – as soon as our first set of powers comes into force in early 2023 – to explain their existing safety systems to us and, importantly, how they plan to develop them.

Ofcom will expect companies to be open with us about the risks they face and the steps they are taking to address them. We will want to know how they have evaluated those measures, and what more they might consider doing to keep users safe. We will also seek to understand users’ attitudes to those services, and consider evidence from civil-society organisations, researchers and expert bodies.

Where we consider that a platform is not taking appropriate steps to protect users from significant harm, we will be able to use a range of investigation and enforcement powers.

Action following secondary legislation

Some elements of the online safety regime depend on secondary legislation – for example, the definition of priority content that is harmful to children, and priority content that is legal but harmful to adults.[4] So duties in these areas will come into effect later and timings will be subject to change, depending on when secondary legislation passes.

We will publish draft Codes of Practice and guidance on these areas shortly after secondary legislation passes. Once again, we will consult publicly on these before finalising them.[5]

We’ll move quickly once the Bill passes to put these ground-breaking laws into practice. Tech firms must be ready to meet our deadlines and comply with their new duties. That work should start now, and companies needn’t wait for the new laws to make their sites and apps safer for users.

Mark Bunting, Ofcom's Online Safety Policy Director

Maintaining momentum this year

Ofcom’s preparations to take on its new role are continuing apace. Today we are calling for evidence on the ‘first phase’ areas identified for consultation: the risk of harm from illegal content; the tools available to services to manage this risk; child access assessments; and transparency requirements. We would like to hear from companies that are likely to fall within the scope of the regime, as well as other groups and organisations with expertise in this area.

In the months immediately ahead, we will build on work already underway by:

  • ramping up our engagement with tech firms, large and small;
  • publishing our first report on how video-sharing platforms such as TikTok, Snapchat, Twitch and OnlyFans are working to tackle harm;
  • undertaking and publishing research on the drivers and prevalence of some of the most serious online harms in scope of the Bill, as well as technical research on how these might be mitigated;
  • further developing our skills and operational capabilities, building on the expertise we have already brought in from the technology industry, academia and the third sector; and
  • continuing to work with other regulators through the Digital Regulation Cooperation Forum to ensure a joined-up approach between online safety and other regimes.

What the new laws will mean

This is novel regulation and so it is also important to understand what the Online Safety Bill does – and does not – require.

The focus of the Bill is not on Ofcom moderating individual pieces of content, but on the tech companies assessing risks of harm to their users and putting in place systems and processes to keep them safer online.

As well as setting Codes of Practice and giving guidance on compliance, Ofcom will have powers to demand information from tech companies on how they deal with harms and to take enforcement action when they fail to comply with their duties. The Bill will also ensure the tech companies are more transparent and can be held to account for their actions.

It's also important to recognise that:

  1. Ofcom will not censor online content. The Bill does not give Ofcom powers to moderate or respond to individuals’ complaints about individual pieces of content. The Government recognises – and we agree – that the sheer volume of online content would make that impractical. Rather than focusing on the symptoms of online harm, we will tackle the causes by ensuring companies design their services with safety in mind from the start.
  2. Tech firms must minimise harm, within reason. We will examine whether companies are doing enough to protect their users from illegal content and content that is harmful to children, while recognising that no service in which users freely communicate and share content can be entirely risk-free. Under the draft laws, the duties placed on in-scope online services are limited by what is proportionate and technically feasible.
  3. Services can host content that is legal but harmful to adults, but must have clear service terms. Under the Bill, services with the highest reach – known as ‘Category 1 services’ – must assess risks associated with certain types of legal content that may be harmful to adults. They must have clear terms of service or community guidelines explaining how they handle it, and apply these consistently. They must also provide tools that empower users to reduce their likelihood of encountering this content. But they will not be required to block or remove legal content unless they choose to.

Notes to editors

  1. This plan is based on our current understanding of the Bill as it stands, and the likely timing for passage of legislation (including secondary legislation) under the Bill. At the time of publication, the Bill has passed Committee stage in the House of Commons and is subject to amendment as it passes through the rest of the Parliamentary process. Consequently, the timelines and requirements described are provisional and may change. We will continue to look for opportunities to bring forward implementation as the legislative timetable becomes clearer (including the likely timing of relevant secondary legislation), and will provide a further update on our implementation plans if they change significantly.
  2. All services in scope of the Bill have a duty to protect users from illegal content. They must assess, among other things, the risk of individuals coming across illegal content on their platforms, and how that risk is affected by the design of their service. Tech firms must also establish whether children, in significant numbers, can access any part of their service. Companies must put in place measures to mitigate and manage the risks of illegal content and, if they’re likely to be accessed by children, material which is harmful to children, as well as allowing their users to report content and complain. 

    Providers of pornographic material have a standalone duty in the Bill to ensure that children cannot normally access their services. This duty is separate from the requirement on user-to-user and search services to conduct a children’s access assessment and from the duties on those services which are likely to be accessed by children to take steps to protect children from harmful content – which would include user-generated pornographic content or pornographic content in search results. To ensure consistency in our approach to regulating pornographic content across the board – whether published by users or companies – we are currently expecting to consult on guidance and Codes covering the protection of children from pornographic material together in autumn 2023, after secondary legislation has been passed.
  3. We will identify services that will be the subject of this focused engagement through our sector risk assessment and other relevant information, and notify them in advance of the engagement beginning.
  4. The duties regarding content that is legal but harmful to adults only apply to so-called Category 1 services, the largest and highest-risk services. These services will also be required to produce transparency reports, as will the biggest search services ('Category 2a') and other smaller but potentially risky services ('Category 2b'). The Government anticipates around 30-40 services will be in one of these three special categories, so most services will not be subject to these duties.
  5. Our current expectation is that we will set out draft Codes of Practice and risks guidance on protecting children from legal harms, as well as a sector-wide risk assessment, in autumn 2023. We will consult publicly and expect to finalise them within a year, at which point firms should expect to be ready to comply with these duties. We expect to set out draft Codes of Practice and risks guidance on protecting adults from legal harms in early 2024. Once again, we will consult publicly and expect to finalise them within a year, at which point companies should expect to be ready to comply with these duties.
