Implementing the Online Safety Act: Protecting children from online pornography
- Ofcom sets out guidance on highly effective age checks to stop children accessing online porn services
- Methods could include photo ID matching, facial age estimation and credit card checks
- Services must take care to safeguard users’ privacy and adults' rights to access legal pornography
Children are set to be protected from accessing online pornography under new age-check guidance proposed by Ofcom today to help services to comply with online safety laws.
Latest research shows that the average age at which children first see online pornography is 13 – although more than a quarter come across it by age 11 (27%), and one in ten as young as 9 (10%). Additionally, nearly 8 in 10 youngsters (79%) have encountered violent pornography depicting coercive, degrading or pain-inducing sex acts before turning 18.
Under the Online Safety Act, sites and apps that display or publish pornographic content must ensure that children are not normally able to encounter pornography on their service.
To do this, they must introduce 'age assurance' – through age verification, age estimation or a combination of both – which is ‘highly effective’ at correctly determining whether a user is a child or not. Effective access controls should prevent children from encountering pornographic content on that service.
Highly effective methods of age assurance
Ofcom’s job is to produce guidance to help online pornography services to meet their legal responsibilities, and to hold them to account if they don’t. Our draft guidance sets strict criteria which age checks must meet to be considered highly effective; they should be technically accurate, robust, reliable and fair.
We also expect services to consider the interests of all users when implementing age assurance. That means affording strong protection to children, and taking care that privacy rights are safeguarded and adults can still access legal pornography.
Given the technology underpinning age assurance is likely to develop and improve in future, our guidance includes a non-exhaustive list of methods that we currently consider could be highly effective. These include:
- Open banking. A user can consent to their bank sharing information confirming they are over 18 with the online pornography service. Their full date of birth is not shared.
- Photo identification matching. Users can upload a photo-ID document, such as a driving licence or passport, which is then compared to an image of the user at the point of uploading to verify that they are the same person.
- Facial age estimation. The features of a user’s face are analysed to estimate their age.
- Mobile network operator age checks. Some UK mobile providers automatically apply a default content restriction which prevents children from accessing age-restricted websites. Users can remove this restriction by proving to their mobile provider that they are an adult, and this confirmation is then shared with the online pornography service.
- Credit card checks. In the UK, credit card issuers are obliged to verify that applicants are over 18 before providing them with a credit card. A user can provide their credit card details to the online pornography service, after which a payment processor sends a request to the issuing bank to check that the card is valid. Approval by the bank can be taken as evidence that the user is over 18.
- Digital identity wallets. Using a variety of methods, including those listed above, users can securely store proof of their age in a digital format, which they can then share with the online pornography service.
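As an illustration, each of these methods ultimately feeds the service a signal about the user's age, which gates access to content. The sketch below is hypothetical – the method names, data shapes and the idea of combining signals this way are illustrative assumptions, not part of Ofcom's guidance:

```python
from dataclasses import dataclass
from typing import Optional

ADULT_AGE = 18  # UK age of majority for accessing pornographic content

@dataclass
class AgeSignal:
    """One age-assurance result (names are illustrative assumptions)."""
    method: str                   # e.g. "open_banking", "photo_id_match", "facial_estimation"
    is_over_18: Optional[bool]    # verification methods return a yes/no confirmation
    estimated_age: Optional[int] = None  # estimation methods return an age instead

def user_confirmed_adult(signals: list[AgeSignal]) -> bool:
    """Allow access only if at least one signal confirms the user is 18+."""
    for s in signals:
        if s.is_over_18 is True:
            return True
        if s.estimated_age is not None and s.estimated_age >= ADULT_AGE:
            return True
    return False  # no confirmation at all means no access

# A user confirmed via open banking is allowed; no signal means denial.
print(user_confirmed_adult([AgeSignal("open_banking", True)]))  # True
print(user_confirmed_adult([]))                                 # False
```

Note that a real deployment would also have to satisfy the accuracy, robustness, reliability and fairness criteria in the draft guidance, and the privacy obligations discussed below.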
Weaker age-checks won’t be enough
We are also clear that certain approaches to age assurance won’t meet the standards set under our draft guidance. These weaker methods include:
- self-declaration of age;
- online payment methods which don’t require a person to be 18 (Debit, Solo, or Electron cards, for example); and
- general terms, disclaimers or warnings.
In addition, we specify that pornographic content must not be visible to users before, or during, the process of completing an age check. Nor should services host or permit content that directs or encourages children to attempt to circumvent age and access controls.
“Pornography is too readily accessible to children online, and the new online safety laws are clear that must change.
“Our practical guidance sets out a range of methods for highly effective age checks. We’re clear that weaker methods – such as allowing users to self-declare their age – won’t meet this standard.
“Regardless of their approach, we expect all services to offer robust protection to children from stumbling across pornography, and also to take care that privacy rights and freedoms for adults to access legal content are safeguarded.”
Dame Melanie Dawes, Ofcom’s Chief Executive
Attitudes towards age assurance
The vast majority of people (80% on average – 87% of women and 77% of men) are broadly supportive of age assurance on online pornographic sites as a means of protecting children. Women with children are particularly supportive due to concerns about the potential impact of viewing online pornographic content at a young age.
Among adults who have previously viewed pornography online, their biggest concerns about proving their age to access the content are around data protection (52%) and sharing personal information (42%).
Protecting privacy rights and adults' access to legal content
All age assurance methods are subject to the UK’s privacy laws, including those concerning the processing of personal data. These are overseen and enforced by the Information Commissioner’s Office (ICO), which has assisted us in developing our guidance.
Under the Online Safety Act, online pornography services are required to keep written records explaining how they protect users from a breach of these laws. Our guidance sets out practical ways they might go about this – for example, by conducting a data protection impact assessment, and providing users with privacy information such as how their personal data will be processed, how long it will be retained, and whether it will be shared with anyone else.
We also recommend that services consult the ICO’s guidance to understand how to comply with the data protection regime, as well as its Opinion on Age Assurance for the Children's Code, which we expect to be revised in January 2024.
To ensure that adults are not unduly prevented from accessing legal content, our draft guidance also sets out important principles that age assurance should be easy to use and work for all users, regardless of their characteristics or whether they are members of a certain group.
Ofcom expects online pornography services to work with us, both as our draft guidance is finalised and beyond, so that they are fully prepared to comply when the time comes. Companies that ultimately fall short will face enforcement action, including possible fines.
We expect to publish our final guidance in early 2025, after which the Government will bring these duties into force.
Notes to editors:
1. ‘A lot of it is actually just abuse’ – Young people and pornography, Children's Commissioner for England (childrenscommissioner.gov.uk)
2. Part 5 of the Online Safety Act applies to providers who publish or display pornographic content on their online services. The following types of pornographic content are outside the scope of Part 5 of the Online Safety Act:
- user-generated content within the meaning of section 55(3) and (4) of the Act in relation to an internet service;
- text, including text accompanied by a GIF (provided the GIF is not pornographic), an emoji or other symbol;
- paid-for advertisements (as defined in section 236 of the Act);
- content appearing in the search results of a search engine or a combined service; and
- content that appears on an on-demand programme service (ODPS) which is regulated by Ofcom under Part 4A of the Communications Act 2003.
Pornographic content on user-to-user and search services will be covered in our Protection of Children consultation which we expect to publish in spring 2024.
3. Our draft guidance also suggests that a ‘challenge age’ could be set. This could mean that where the technology estimates a user’s age to be under 25, for example, that user would undergo a second age check via an alternative method.
We are aware that a wide range of age estimation methods exist. At present, we have only proposed including facial age estimation in our guidance, as we do not have evidence to suggest that other methods of age estimation are currently capable of being highly effective, are sufficiently mature technologies, or are being deployed at scale. We will continue to review this position over time as technologies evolve.
4. ICO, 2023, A guide to the data protection principles [accessed 17 November 2023]; ICO, A guide to lawful basis [accessed 17 November 2023]; and ICO, Individual rights – guidance and resources [accessed 17 November 2023].