14 December 2023

How are TikTok, Snap and Twitch protecting children from harmful videos?

A new report from Ofcom, published today, takes stock of how popular video-sharing platforms are protecting children from accessing potentially harmful videos.

Under the video-sharing platform (VSP) regime, UK-based services must put in place measures to protect children from encountering videos that may impair their physical, mental, or moral development.

Using our formal information-gathering powers, we’ve looked at the steps being taken by TikTok, Snap and Twitch – three of the most popular regulated video-sharing services for children – to meet these requirements. We found that all three take steps to prevent children encountering harmful videos; however, children can still sometimes face harm while using these platforms.

What we found

Our report finds that:

  • TikTok, Twitch and Snap all allow sign-ups from children aged 13 and over, relying on users to declare their true age when signing up. This means underage users can easily gain access by entering a false age.
  • All three enforce age restrictions using a range of methods to identify potential underage accounts, including artificial intelligence (AI) technologies and human moderators. However, the effectiveness of these measures is yet to be established. The report includes the number of underage accounts taken down by each platform.
  • Users need an account to access most of Snap’s or TikTok’s content. Twitch, however, is open access, which means anyone of any age can access most of its videos, regardless of whether they have an account. This includes videos to which a mature content label has been applied.
  • The three platforms adopt different approaches to classifying and labelling content as unsuitable for under-18s. TikTok classifies content based on certain mature themes, Snap ranks content on Discover and Spotlight to make sure it is age appropriate, and Twitch has introduced more detailed content labelling. Without robust corresponding access controls and safety measures, however, children still risk encountering harmful content. For example, all Twitch users – logged in or not – can view age-inappropriate content by simply dismissing the warning label.
  • TikTok and Snap both have parental controls designed to give parents and carers some oversight of their children’s online activity. By contrast, Twitch’s terms and conditions require parents to supervise children in real time while they are using the service.

The protection of children – including ensuring that under-18s have an age-appropriate online experience – is central to the Online Safety Act. In line with our implementation roadmap, we will be consulting on the broad child safety measures under the Act in spring 2024.

We expect all services regulated under the VSP regime to also be in scope of the online safety regime. However, only once the VSP regime is fully repealed by the UK Government will services have to comply with all of their broader Online Safety Act duties.

In the meantime, we will continue to work with regulated VSPs to drive safety improvements in the interests of their users. This will include dedicated supervisory engagement, further transparency reporting and – where appropriate – enforcement action.

New investigation into TikTok’s compliance with a statutory information request

It is crucial that Ofcom can gather accurate information about measures put in place by regulated VSPs to protect users. This includes understanding systems, such as parental controls, to help ensure that children are protected from restricted material.

We use such information to monitor the measures taken by platforms, assess compliance, and publish public reports.

We asked TikTok for information about its parental control system, Family Pairing, and we have reason to believe that the information it provided was inaccurate.

We are therefore also opening an investigation today into whether TikTok has failed to comply with its duty to provide information in response to a formal request, in the manner specified by Ofcom.

We expect to provide an update on this investigation early next year.

What to do if we request information from you

To inform our work as a regulator, we sometimes use our formal powers to request information from individuals and businesses. When we do, we expect to receive clear, complete and accurate information.

If you get a request from us, here's what you need to do.
