6 July 2022

New online safety rules – what do they mean, and what is Ofcom’s role?

Today we’ve set out our plans for how we will implement new online safety rules, which we expect to come into force next year, giving Ofcom new powers in this area.

Here, Mark Bunting, Ofcom’s director of online policy and video sharing platforms, explains the new regime and Ofcom’s role within it.

The UK is set to introduce comprehensive new laws aimed at making online users safer, while preserving freedom of expression. The Online Safety Bill will introduce rules for sites and apps such as social media, search engines and messaging platforms – as well as other services that people use to share content online.

In the first 100 days after our new powers come into effect, we will focus on protecting users from illegal content harms, including child sexual exploitation and abuse, and terrorist content.

What the new laws will mean

This is a novel form of regulation, so it is important to understand what the Online Safety Bill does – and does not – require.

The focus of the Bill is not on Ofcom moderating individual pieces of content, but on tech companies assessing risks of harm to their users and putting in place systems and processes to keep them safer online.

As well as setting codes of practice and giving guidance on compliance, Ofcom will have powers to demand information from tech companies on how they deal with harms and to take enforcement action if they fail to comply with their duties. The Bill will also ensure that the biggest services, and those that pose heightened risks, are more transparent and can be held to account for their actions.

It is also important to recognise that:

  • Ofcom will not censor online content. The Bill does not give Ofcom powers to moderate or respond to individuals’ complaints about individual pieces of content. The Government recognises – and we agree – that the sheer volume of online content would make that impractical. Rather than focusing on the symptoms of online harm, we will tackle the causes by ensuring companies design their services with safety in mind from the start.
  • Tech firms must minimise harm, within reason. We will examine whether companies are doing enough to protect their users from illegal content and content that is harmful to children, while recognising that no service in which users freely communicate and share content can be entirely risk-free. Under the draft laws, the duties placed on in-scope online services are limited by what is proportionate and technically feasible.
  • Services can host content that is legal but harmful to adults, but must have clear service terms. Under the Bill, services with the highest reach – known as ‘Category 1 services’ – must assess risks associated with certain types of legal content that may be harmful to adults. They must have clear terms of service or community guidelines explaining how they handle it, and apply these consistently. They must also provide tools that empower users to reduce their likelihood of encountering this content. But they will not be required to block or remove legal content unless they choose to.
