24 November 2021

The time has come for strong, external oversight of social media companies

Our Chief Executive, Melanie Dawes, explains why we can't rely on tech companies responding to users' outrage when serious harm takes place. Instead, we need a clear set of rules.

Three years before she composed the first modern algorithm, Ada Lovelace pondered the potential for machines to master games like chess and solitaire. If a computer could achieve such a feat, where might this lead?

“I see nothing but vague and cloudy uncertainty”, she confessed, “yet I discern a very bright light a good way further on”.

For all her visionary brilliance in 1840, not even Lovelace could foresee algorithms becoming the hand that guides people’s travel, shopping and entertainment. More than that, they define our modern experience of being online, fuelling what we see in search results and on social media.

Algorithms have personalised the internet, created new business opportunities, and given ordinary people the power to speak to large audiences. Through their ability to target online advertising, they have also fuelled the rapid rise of trillion-dollar tech giants.

But too often, companies appear to have prioritised growth over the safety of their users. By designing their services to maximise reach, they may have inadvertently promoted harmful content: bullying or harassment, hate speech, self-harm. And they may be too slow to tackle terrorism or sexual abuse.

Today, when people spend a quarter of their waking day connected, safety matters as much online as it always has in the home, school or workplace. Six in ten connected adults – and eight in ten older children – have had at least one harmful experience online in the past year. Most people support tighter rules.

But at the moment, search and social media sites can choose whether to heed those concerns. If these companies are regulated at all, it is only by outrage. The time has come for strong, external oversight.

So the draft Online Safety Bill, currently being scrutinised by Parliament, is an important piece of law. It means tech firms will have new duties of care to their users, which Ofcom will enforce. We plan to build on our track record of upholding broadcast standards, supporting a range of views and promoting innovation.

Equally, everyone should understand what the new laws will not mean. Ofcom will not be regulating or moderating individual pieces of online content. The Government recognises – and we agree – that the sheer volume would make that impractical.

We won’t act as a censor, prevent robust debate or trample over users’ rights. Free speech is the lifeblood of the internet. It is a foundation of democratic society, at the heart of public life, and a value that Ofcom holds dear.

Instead, our job will be to lift the ‘vague and cloudy uncertainty’ that hovers over search and social media.

As a user, you have no idea how these platforms really work. Why do you see the content you see? What are they doing to protect your children from abuse or harassment? When they design their services, is safety their first priority or just a secondary thought?

When we regulate online safety, Ofcom will demand answers to these questions. We will require companies to assess risk with the user’s perspective in mind. They will need to explain what they are doing to minimise, and quickly remove, illegal content – and to protect children from harm.

We will hold companies to account on how they use algorithms, address complaints and ensure a safe experience for children. The biggest services must also explain how they protect journalistic and democratic content. Today, these decisions are made behind closed doors, with no visibility or accountability.

Ofcom will set codes of practice, and report publicly on platforms' performance. If we find that companies fail in their duties of care, we can audit their work and levy fines.

And as other countries follow with similar laws, we will work closely with our international counterparts. When I met tech leaders from around the world at Lisbon’s Web Summit this month, I saw a collective determination to find global solutions to these challenges.

Ofcom is gearing up for the job, acquiring new skills in areas like data and technology. And we have opened a new technology hub in Manchester to help us attract skills and expertise from across the country and beyond.

We will be ready, and I believe the new laws will make a genuine difference. By shining that very bright light on immensely powerful companies, we can ensure they take proper care of their users and create a safer life online for everyone.