This glossary defines the terms we have used throughout the ‘check how to comply with the protection of children rules’ tool, which helps providers of user-to-user and search services to understand how to comply with the children’s risk assessment duties.
A
Content which is abusive and which targets any of the following characteristics:
- race
- religion
- sex
- sexual orientation
- disability
- gender reassignment
The Online Safety Act 2023.
Technical mechanism which prevents users from accessing a service (or part of it) or certain content if they have not been age assured or if, having been age assured, they did not meet the requirements of the age assurance process.
A collective term for age verification and age estimation.
A form of age assurance designed to estimate the age or age range of the user.
A form of age assurance designed to verify the exact age of the user.
User-to-user service functionality allowing users to create a user profile where their identity is unknown to an extent. This includes instances where a user's identity is unknown to other users; for example, through the use of aliases (pseudonymity). It also includes where a user’s identity may be unknown to a service, for example, services that do not require users to register by creating an account.
Feature that allows audiovisual content to continue playing without input from the user.
B
A user-to-user functionality where:
- blocked users cannot send direct messages to the blocking user, and vice versa
- the blocking user will not encounter any content posted by blocked users on the service, and vice versa
- the blocking user and blocked user, if they were connected, will no longer be connected
Any action that means the content cannot be clearly seen by users. For example, this may be done by overlaying an image with a greyscale filter, accompanied by a content warning.
Content targeted against a person which conveys a serious threat, is humiliating or degrading, or forms part of a campaign of mistreatment.
The way in which a business operates to achieve its goals. For the purposes of the Children’s Register of Risks, this includes a service’s revenue model and growth strategy.
C
The characteristics of a service include its functionalities, user base, business model, governance and other systems and processes. This is set out in section 98(11) of the Act.
The Children’s Register of Risks is a document setting out the findings of Ofcom’s own risk assessment for content harmful to children. It sets out detailed evidence on risk factors that we have used to inform the Children’s Risk Profiles.
The Children’s Risk Assessment Guidance is a guidance document to help providers of services likely to be accessed by children to comply with the children’s risk assessment duties, as set out in the Act.
Children’s Risk Profiles are the lists of different online safety risk factors published by Ofcom. They represent a selection of specific characteristics of online services such as user base, business models and functionalities that our Children’s Register of Risks indicates are most strongly linked to a risk of content harmful to children. They are not the same as the Risk Profiles for Illegal Harms.
The safety duties protecting children in sections 12 and 29 of the Act.
Under the Online Safety Act 2023 (the ‘Act’), Ofcom is required to prepare and issue Codes of Practice for providers of Part 3 services, describing measures recommended for the purpose of compliance with specified duties imposed on those providers by the Act.
User-to-user service functionality that allows users to reply to content, or post content in response to another piece of content posted on open channels of communication, visually accessible directly from the original content without navigating away from that content.
The size of the service in terms of capacity, the stage of service maturity and rate of growth in relation to users or revenue.
Anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description.
A means of restricting certain users’ access to a particular piece of content on a service.
Content that is harmful to children is primary priority content, priority content or content which is an identified kind of non-designated content.
An algorithmic system which determines the relative ranking of an identified pool of content (that includes regulated user generated content) from multiple users on content feeds. Content is recommended based on factors that it is programmed to account for, such as popularity of content, characteristics of a user, or predicted engagement. References to content recommender systems do not include a content recommender system employed exclusively in the operation of a search functionality which suggests content to users in direct response to a search query, product recommender systems or network recommender systems.
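As a rough sketch of the ranking step this definition describes, the TypeScript below scores a pool of content items on invented factors (popularity, predicted engagement, interest match) and orders the feed by score. The types, field names and weights are hypothetical illustrations, not drawn from the Act or any real service.

```typescript
// Minimal sketch of a content recommender's ranking step.
// All types, field names and weights are hypothetical.
interface ContentItem {
  id: string;
  popularity: number;          // e.g. normalised engagement count
  predictedEngagement: number; // e.g. model output in [0, 1]
  matchesUserInterests: boolean;
}

function rankFeed(pool: ContentItem[]): ContentItem[] {
  // Score each item on the factors the system is programmed to account for.
  const score = (item: ContentItem): number =>
    0.4 * item.popularity +
    0.5 * item.predictedEngagement +
    (item.matchesUserInterests ? 0.1 : 0);

  // Higher-scoring content is ranked earlier in the feed.
  return [...pool].sort((a, b) => score(b) - score(a));
}
```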
User-to-user service functionality allowing users to assign a keyword or term to content that is shared.
Content which incites hatred against people:
- of a particular race, religion, sex or sexual orientation
- who have a disability
- who have the characteristic of gender reassignment
Refers to information provided by a search service in search results that typically contains the contact details of helplines and/or links to supportive information provided by a reputable organisation, to assist users experiencing a mental health crisis.
A category of illegal CSEA content, including in particular indecent or prohibited images of children (including still and animated images, and videos, and including photographs, pseudo-photographs and non-photographic images such as drawings). CSAM also includes other material that includes advice about grooming or abusing a child sexually or which is an obscene article encouraging the commission of other child sexual exploitation and abuse offences; content which links or otherwise directs users to such material; or content which advertises the distribution or showing of CSAM.
Refers to offences specified in Schedule 6 of the Act, including offences related to CSAM and grooming. CSEA includes but is not limited to causing or enticing a child or young person to take part in sexual activities, sexual communication with a child and the possession or distribution of indecent images.
Harm that occurs when a child repeatedly encounters harmful content (primary priority content, priority content or non-designated content), and/or when a child encounters harmful combinations of content. These combinations include encountering different types of harmful content, or a type of harmful content alongside a kind of content that increases the risk of harm from it.
D
Content which encourages, promotes, or provides instructions for a challenge or stunt highly likely to result in serious injury to the person who does it or to someone else.
User-to-user service functionality allowing a user to send a message to one recipient at a time, where the message can only be immediately viewed by that specific recipient.
A user-to-user service type describing services that generally allow users to send or post messages that can be read by the public or an open group of people.
Action taken by a search service which involves altering the ranking algorithm such that a particular piece of search content appears lower in the search results and is therefore less discoverable to users.
Search service type describing a subsection of general search services. Downstream general search services provide access to content from across the web, but they are distinct in that they obtain or supplement their search index from other general search services.
E
Content which encourages, promotes, or provides instructions for an eating disorder or behaviours associated with an eating disorder.
User-to-user service functionality that allows users to send messages that are automatically deleted after they are viewed by the recipient, or after a prescribed period of time has elapsed.
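A minimal sketch of the expiry rule this definition describes, assuming a hypothetical message shape: a message becomes eligible for deletion once it has been viewed or once its prescribed lifetime has elapsed.

```typescript
// Sketch of ephemeral message expiry. The message shape is invented
// for illustration.
interface EphemeralMessage {
  body: string;
  sentAt: number; // epoch milliseconds
  ttlMs: number;  // prescribed lifetime
  viewed: boolean;
}

function shouldDelete(msg: EphemeralMessage, now: number): boolean {
  // Delete after viewing, or after the prescribed period has elapsed.
  return msg.viewed || now - msg.sentAt >= msg.ttlMs;
}
```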
F
For user-to-user services, functionalities include features that enable interaction between users. Functionalities for search services include features that enable users to search websites or databases, as well as features that make suggestions relating to users’ search requests. This is set out in sections 233(2) and (3) of the Act.
G
User-to-user service type describing services that allow users to interact within partially or fully simulated virtual environments.
Search service type describing services that enable users to search the internet and which derive search results from an underlying search index (developed by either the service or a third party).
AI models that can create text, images, audio and videos, typically in response to a user prompt.
This term refers to the structures that ensure the adequate oversight, accountability and transparency of decisions within a service which affect user safety. This relates to organisational structure as well as product and content governance.
User-to-user service functionality allowing users to send and receive messages through a closed channel of communication to more than one recipient at a time.
H
Harm means physical or psychological harm.
As set out in the Act, harm can occur from isolated incidents of exposure, or from cumulative exposure. Cumulative harm arises when:
- harmful content (primary priority content, priority content or non-designated content) is repeatedly encountered by a child
- a child encounters harmful combinations of content, including encountering different types of harmful content, or a type of harmful content alongside a kind of content that increases the risk of harm from it.
Harm can include circumstances of indirect harm, in which a group or individual is harmed, or the likelihood of harm is increased, as a consequence of another child seeing harmful content, which then affects their behaviours towards others.
Content which encourages a person to ingest, inject, inhale or in any other way self-administer a physically harmful substance, or a substance in such a quantity as to be physically harmful.
An age assurance process that is of such a kind and implemented in such a way that it is highly effective at correctly determining whether or not a particular user is a child.
Functionality providing direct access to another piece of data by clicking or tapping on specific content present on the service.
I
A design pattern in which a page loads content as a user scrolls down, allowing them to discover and view large amounts of content with no distinct end. This design pattern is typically associated with content recommender systems, where large volumes of personalised content are curated.
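As an illustration only, the browser-side sketch below implements this pattern with the standard IntersectionObserver API: when a sentinel element near the bottom of the feed becomes visible, the next batch of content is fetched and appended. The element IDs and the `/api/feed` endpoint are placeholders invented for the example.

```typescript
// Infinite scroll sketch: load the next page of content whenever the
// sentinel element scrolls into view. IDs and endpoint are placeholders.
const sentinel = document.getElementById("feed-sentinel")!;
const feed = document.getElementById("feed")!;
let nextPage = 0;

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting) return;
  const res = await fetch(`/api/feed?page=${nextPage++}`); // placeholder endpoint
  const items: string[] = await res.json();
  for (const html of items) {
    feed.insertAdjacentHTML("beforeend", html);
  }
  // No distinct end: the observer keeps firing as the user scrolls.
});
observer.observe(sentinel);
```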
More detailed versions of external content policies which set out rules, standards or guidelines, including around what content is allowed and what is not, as well as providing a framework for how policies should be operationalised and enforced.
L
A service with more than 7 million monthly active UK users.
User-to-user service functionality that allows users to simultaneously create and broadcast online streaming media in, or very close to, real time.
A service which the provider has not assessed as being medium or high risk in relation to any kind of content harmful to children in its risk assessment.
M
User-to-user service type describing services that are typically centred around the sending and receiving of messages that can only be viewed or read by a specific recipient or group of people.
When a service provider reviews and assesses content to determine whether it is harmful to children or not, or whether it is in breach of the terms of service or publicly available statement of the service, and takes appropriate action based on that determination. We use ‘content moderation’ when referring to user-to-user services, and ‘search moderation’ when referring to search services.
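As a schematic illustration of the review-and-act flow described above, here is a hypothetical sketch; the determination categories and actions are placeholders, not Ofcom terminology.

```typescript
// Sketch of a moderation decision flow: assess a piece of content, then
// take action based on the determination. Categories are placeholders.
type Determination = "harmful-to-children" | "breaches-terms" | "allowed";

function moderate(
  content: string,
  assess: (c: string) => Determination
): string {
  switch (assess(content)) {
    case "harmful-to-children":
      return "restrict access for child users";
    case "breaches-terms":
      return "remove content";
    default:
      return "no action";
  }
}
```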
A service is multi-risk if the provider has assessed the service as having medium or high risk of two or more specific kinds of content that is harmful to children.
Muting refers to a feature that enables a user to ‘mute’ another user. The muting user will not encounter any content posted by muted users on the service (unless the muting user visits the user profile of the muted user directly). The muted user is not aware that they have been muted and continues to encounter content posted by the muting user.
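To illustrate how the visibility rules for blocking (defined above under B) and muting differ, here is a hypothetical sketch: blocking applies in both directions, while muting affects only what the muting user sees, subject to the direct-profile-visit exception noted above. All types and field names are invented.

```typescript
// Sketch contrasting blocking (symmetric) and muting (one-directional).
interface UserPrefs {
  blocked: Set<string>; // user IDs this user has blocked
  muted: Set<string>;   // user IDs this user has muted
}

function canSee(
  viewer: UserPrefs, viewerId: string,
  author: UserPrefs, authorId: string,
  viewingAuthorProfileDirectly = false
): boolean {
  // Blocking applies in both directions.
  if (viewer.blocked.has(authorId) || author.blocked.has(viewerId)) return false;
  // Muting applies only to the muting user, unless they visit the
  // muted user's profile directly.
  if (viewer.muted.has(authorId) && !viewingAuthorProfileDirectly) return false;
  return true;
}
```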
N
Non-designated content is a category of content that is harmful to children. It consists of content which is not primary priority content or priority content, of a kind which presents a material risk of significant harm to an appreciable number of children in the UK.
P
Refers to a search service that falls within the definition of section 4 of the Act.
A user-to-user service, as defined in section 4 of the Act.
Content of such a nature that it is reasonable to assume that it was produced solely or principally for the purpose of sexual arousal.
For the purposes of the Act, pornographic content that is harmful to children specifically excludes content that consists only of text, or that consists only of text accompanied by identifying content (which may be text or another kind of content that is not itself pornographic), non-pornographic GIFs, emojis or other symbols, or any combination of these.
Services whose principal purpose is the hosting or dissemination of pornographic content and who host user-generated pornographic content. These services are subject to the risk assessment duties and the children’s safety duties. Pornography that is published or displayed by the provider of the service is subject to different duties set out in Part 5 of the Act and Ofcom has published separate guidance for providers subject to these duties.
User-to-user service functionality allowing users to upload and share content on open channels of communication.
An algorithmic functionality embedded in the search field of a search service. It operates by anticipating a user’s search query and suggesting possible related search requests (‘predictive search suggestions’), based on a variety of factors (including a user’s past queries and other user queries, locations, and trends) to help users make more relevant searches.
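As a simplified sketch of how predictive search suggestions might be generated, the example below matches a partial query against a log of past queries by prefix; real systems also weigh factors such as location and trends, as the definition notes. The query log is invented for the example.

```typescript
// Naive prefix-matching sketch of predictive search suggestions.
// The query log is an invented example.
const pastQueries = ["weather today", "weather tomorrow", "web safety tips"];

function suggest(partial: string, limit = 3): string[] {
  const p = partial.toLowerCase();
  return pastQueries.filter((q) => q.startsWith(p)).slice(0, limit);
}

console.log(suggest("wea")); // ["weather today", "weather tomorrow"]
```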
Primary priority content is a category of content that is harmful to children. It consists of:
- pornographic content
- suicide content
- self-harm content
- eating disorder content
Priority content is a category of content that is harmful to children. It consists of:
- abusive content
- content which incites hatred
- bullying content
- violent content (encourages, promotes or provides instructions for)
- violent content (humans)
- violent content (animals or fictional creatures)
- dangerous stunts and challenges content
- harmful substances content
Age; disability; gender reassignment; marriage and civil partnership; pregnancy and maternity; race; religion or belief; sex; and sexual orientation.
A statement that search services are required to make available to members of the public in the UK, often detailing various information on how the service operates.
R
User-to-user service functionality allowing users to express a reaction, such as approval or disapproval, of content that is shared by other users, through dedicated features that can be clicked or tapped by users.
The Record-Keeping and Review Guidance is designed to help service providers understand what is expected in relation to keeping written records of risk assessments and the measures taken to comply with the relevant duties and reviewing compliance with the relevant duties.
User-to-user service functionality which allows users to re-share content that has already been shared by a user.
How a service generates income or revenue.
Identifying and assessing the risk of harm to individuals from illegal content and content harmful to children, present on a Part 3 regulated service.
A risk factor is a characteristic associated with the risk of one or more kinds of harm.
The possibility of individuals encountering harm on a Part 3 service.
S
Content that may be encountered in or via search results of a search service. It does not include paid-for advertisements, news publisher content, or content that reproduces, links to, or is a recording of, news publisher content.
Includes a service or functionality which enables a person to search some websites or databases, but does not include a service which enables a person to search just one website or database.
A collection of URLs that are obtained by deploying crawlers to find content across the web, which is subsequently stored and organised.
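A toy sketch of how such an index might be built, assuming a runtime with a global fetch: a crawler visits pages starting from seed URLs and stores each URL under the terms found on the page. Link extraction is stubbed out, and the tokenisation is deliberately naive.

```typescript
// Toy search index builder: term -> set of URLs containing that term.
type SearchIndex = Map<string, Set<string>>;

async function crawl(seedUrls: string[], index: SearchIndex): Promise<void> {
  const queue = [...seedUrls];
  const seen = new Set<string>();
  while (queue.length > 0) {
    const url = queue.shift()!;
    if (seen.has(url)) continue;
    seen.add(url);
    const page = await fetch(url).then((r) => r.text());
    // Index each word on the page under this URL (very naive tokenisation).
    for (const term of page.toLowerCase().split(/\W+/)) {
      if (!index.has(term)) index.set(term, new Set());
      index.get(term)!.add(url);
    }
    // A real crawler would also extract links from `page` and enqueue them.
  }
}
```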
In relation to a search service, this means content presented to a user of the service by operation of the search engine, in response to a search query made by a user.
An internet service that is, or includes, a search engine.
A process where the user is asked to provide their own age. This could be in the form of providing a date of birth to gain entry to a service or by ticking a box to confirm a user is over a minimum age threshold.
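For illustration, a minimal sketch of the date-of-birth form of self-declaration: computing an age from a user-supplied date of birth and comparing it with a minimum age threshold. As the definition implies, this relies entirely on the user's own declaration; it is not age verification or age estimation.

```typescript
// Compute age from a self-declared date of birth.
function ageFromDateOfBirth(dob: Date, now: Date = new Date()): number {
  let age = now.getFullYear() - dob.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  if (!hadBirthdayThisYear) age -= 1;
  return age;
}

// Example: check a self-declared date of birth against a minimum age.
const meetsMinimumAge = ageFromDateOfBirth(new Date("2010-06-15")) >= 18;
```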
Content which encourages, promotes or provides instructions for an act of deliberate self-injury.
A regulated user-to-user or search service.
The design of all the components that shape a user’s end-to-end experience of a service. These components can include the business model or decision-making structures, back-end systems and processes, the user interface, and off-platform interventions.
A characteristic that in general refers to the nature of the service. For example, social media services and messaging services.
User-to-user service type describing services that connect users and enable them to build communities around common interests or connections.
Content which encourages, promotes or provides instructions for suicide.
These are actions taken by a service to mitigate the risk of harm arising from content harmful to children. This could include both human and automated moderation processes. This is set out in section 236 of the Act.
T
All documents comprising the contract for use of the service (or of part of it) by UK users.
U
Demographic make-up of the user base, including protected characteristics, intersectional dynamics and other relevant demographic factors.
Your user base is made up of the users of your service. You may refer to them in various ways, such as customers, clients, subscribers, visitors or similar terms. A user doesn’t need to be registered with a service to be considered a user of that service. This is set out in section 227 of the Act.
Functionality type that comprises user-to-user service functionalities which allow users to communicate with one another, either synchronously or asynchronously. Includes communication across open and closed channels.
User-to-user service functionality that allows users to follow or subscribe to other users. Users must sometimes be connected in order to view all or some of the content that each user shares.
User-to-user service functionality allowing users to create online spaces that are often devoted to sharing content on a particular topic. User groups are generally closed to the public and require an invitation or approval from existing members to gain access. However, in some cases they may be open to the public.
Functionality type that comprises user-to-user service functionalities which allow users to identify themselves to other users.
Functionality type that comprises user-to-user service functionalities which allow users to find or encounter each other and establish contact.
User-to-user service functionality that is associated with a user account, and that represents a collection of information shared by a user which may be viewed by other users of the service. This can include information such as username, biography and profile picture, as well as content generated, shared or uploaded by the user using the relevant account.
User reports are a specific type of complaint about content, submitted through a reporting tool.
Content (a) that is (i) generated directly on the service by a user of the service, or (ii) uploaded to or shared on the service by a user of the service; and (b) which may be encountered by another user, or other users, of the service by means of the service.
User-to-user service functionality allowing users to search for user-generated content by means of a user-to-user service.
An internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.
V
Search service type describing services that enable users to search for specific topics, or products or services (such as flights or hotels) offered by third-party operators. Unlike general search services, they do not return search results based on an underlying search index. Rather, they may use an API or equivalent technical means to directly query selected websites or databases with which they have a contract, and to return search results to users.
Content which depicts real or realistic serious violence against an animal or fictional creature or depicts the real or realistic serious injury of an animal or fictional creature in graphic detail.
Content which depicts real or realistic serious violence against a person or depicts the real or realistic serious injury of a person in graphic detail.
Content which encourages, promotes or provides instructions for an act of serious violence against a person.