Content moderation, technical feasibility and age assurance

Published: 21 May 2025

This page provides additional information to help providers of online services use our Protection of Children Codes of Practice and our digital toolkit to comply with their duties under the Online Safety Act. This guidance is relevant to you if:

  • you prohibit content harmful to children in your terms of service; and
  • the content moderation measure PCU C2 has been recommended for your service; but
  • it is not technically feasible for your service to take down specific pieces of content.

Measure PCU C2 sets out that services should have a content moderation function that allows swift action to be taken against content harmful to children. The appropriate action depends on the type of content and the specifics of your service. Evidence suggests that a very limited set of user-to-user services in scope of the Act are configured in such a way that it is not currently technically feasible for them to take down content identified as harmful to children. We have therefore created additional compliance routes to allow these services to meet their obligations to protect children.

If you prohibit all kinds of primary priority content on your service, but it is not currently technically feasible to take down the content, you may be in scope of measure PCU B4, and therefore may need to use highly effective age assurance to target content controls and access controls at children.

If you have assessed your service to be medium or high risk of one or more specific kinds of priority content and you prohibit each of those kinds of content, but it is not currently technically feasible to take down the content, you may be in scope of measure PCU B5, and therefore may need to use highly effective age assurance to target content controls and access controls at children.
