
BitChute: compliance assurances to protect users from videos containing harmful material

3 October 2023

Closed

Investigation into BitChute Limited
Case opened 3 October 2023
Case closed 3 October 2023
Summary

Compliance assurances from BitChute regarding its obligations under Part 4B of the Communications Act 2003: Improvements to the measures BitChute has in place to protect users from videos containing harmful material.

Ofcom's role is to ensure video-sharing platforms (VSPs) based in the UK have appropriate systems and processes in place as required under Part 4B of the Act to effectively protect their users from harmful video content in scope of the VSP regime[1].

In May 2022, a far-right extremist carried out a racially motivated attack in Buffalo, New York. Ofcom conducted analysis of the measures in place to protect users from harmful material on several VSPs, including BitChute, in light of this incident.

Our analysis of BitChute's platform raised concerns that some of its measures were not effectively protecting users from encountering videos related to terrorism and other harmful material prohibited under the VSP regime.

Following a period of close engagement with BitChute to discuss its compliance with its obligations under Part 4B of the Communications Act 2003, it has made some important changes and also committed to further improvements to protect users from harmful material.

Relevant legal provision(s)

Part 4B Communications Act 2003

Background

On 14 May 2022, a far-right extremist carried out a racially motivated attack in Buffalo, New York, killing ten people and wounding three others. The attacker livestreamed the shooting online, and versions of the footage were distributed on multiple online services, including BitChute and other UK-based VSPs that we currently regulate. This resulted in UK users being potentially exposed to harmful material related to terrorism and material likely to incite violence and hatred.

Ofcom’s role is to ensure VSPs have appropriate systems and processes in place as required under Part 4B of the Act to effectively protect their users from harmful video content in scope of the VSP regime. Our approach to securing compliance focuses on oversight, accountability, and transparency, working with the industry where possible to drive improvements, as well as taking formal enforcement action where appropriate.

Our concerns

In the weeks following the Buffalo attack, we engaged with relevant VSPs and the wider industry to learn more about how platforms can set up internal systems and processes to prevent the livestreaming of such attacks and protect users from the sharing of associated video content. In October 2022, Ofcom published a report on the Buffalo attack that explored how footage of the attack, and related material, came to be disseminated online, and the implications of this for platforms’ efforts to keep people safe online.

Our analysis raised concerns that BitChute’s reporting and flagging measures were not effectively protecting users from encountering videos related to terrorism and other harmful material prohibited under the VSP regime. In particular, Ofcom was concerned that BitChute’s reporting function was not open to non-registered users, and that the capacity and coverage of BitChute’s content moderation team was insufficient to enable it to respond promptly to reports of harmful content.

BitChute's commitments

In response to our concerns, BitChute has made some important changes to its reporting and flagging mechanisms and to the content moderation processes which underpin these, as well as committing to further changes.

1. Coverage and capacity of content moderation

In our 2022 VSP Report, published in October, we found that all VSPs, including BitChute, have adequate terms and conditions that prohibit material that would come within the scope of laws relating to terrorism, racism, and xenophobia, as well as material likely to incite violence or hatred (content we refer to collectively as ‘hate and terror content’).

However, the Buffalo attack exposed key deficiencies in BitChute’s ability to effectively enforce its terms and conditions relating to hate and terror content: footage of the shooting was easily accessible on the platform in the days after the attack, and we learnt that the platform’s content moderation team was modest in size and limited to certain working hours. This restricted BitChute’s ability to respond quickly to reports that footage was on the platform following the attack.

BitChute has committed to tripling the capacity of its moderation team by taking on more human moderators. It is also extending the coverage of its moderation team by increasing the number of hours that moderators are available to review reports, and has committed to having a safety team operational 24/7 from autumn 2023.

2. User reporting and flagging mechanisms

Prior to the Buffalo attack, BitChute had reporting and flagging mechanisms in place to allow users to report potentially harmful content. However, on-platform flagging was only available to users with a registered BitChute account. While all users (registered or unregistered) were able to report content by sending an email to BitChute, we were concerned that requiring non-registered users to email the platform, rather than click a reporting button next to the video, introduced a layer of friction into the reporting process that could disincentivise users from making reports and increase the time taken to respond to them.

As a result of our remediation work, BitChute has changed the design of its platform to allow non-registered users to directly report potentially harmful content. It has also updated its user-facing guidelines to set out more clearly what registered and non-registered users can expect from the flagging and reporting process.
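By way of illustration only, the sketch below (Python with Flask) shows what an unauthenticated reporting endpoint of the kind described above can look like. The route, field names, and in-memory queue are all hypothetical assumptions; BitChute's actual implementation has not been published.

    # Minimal sketch of a report endpoint open to registered and
    # non-registered users alike. All names here are hypothetical.
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    # Stand-in for a real moderation backlog (queue, database table, etc.).
    moderation_queue: list[dict] = []

    @app.route("/api/report", methods=["POST"])
    def report_video():
        # Deliberately no login/session check: accepting reports from
        # non-registered users is the friction-reducing change described
        # in the text above.
        payload = request.get_json(force=True)
        moderation_queue.append({
            "video_id": payload["video_id"],
            "reason": payload.get("reason", "unspecified"),
            # Present only if the reporter happens to be logged in.
            "reporter": payload.get("user_id"),
        })
        return jsonify({"status": "received"}), 202

    if __name__ == "__main__":
        app.run()

The design point is simply that the report action is available directly next to the content, with no account or email step in between.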

3. Measuring effectiveness

BitChute has also committed to collecting additional metrics to measure the impact of changes made to its systems and processes, including the volume of content reports raised each day and the average response time, in minutes, to those reports. These metrics will help BitChute and Ofcom to evaluate the effectiveness of the platform’s measures more easily.

We have also encouraged BitChute to report on additional risk metrics, which measure the risk of users encountering harmful material on the platform, and process metrics, which measure the effectiveness of BitChute’s moderation systems.
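As a rough illustration, the sketch below (Python; the report schema and field names are assumptions, not BitChute's actual data model) shows how the two process metrics named above, daily report volume and average response time in minutes, might be computed from a log of reports:

    from dataclasses import dataclass
    from datetime import datetime
    from collections import Counter

    @dataclass
    class ContentReport:
        # Hypothetical schema: one user report of a potentially harmful video.
        reported_at: datetime  # when the report was submitted
        resolved_at: datetime  # when a moderator actioned it

    def daily_report_volume(reports: list[ContentReport]) -> Counter:
        # Number of content reports raised on each calendar day.
        return Counter(r.reported_at.date() for r in reports)

    def average_response_minutes(reports: list[ContentReport]) -> float:
        # Mean time, in minutes, from submission to moderator action.
        if not reports:
            return 0.0
        total_seconds = sum(
            (r.resolved_at - r.reported_at).total_seconds() for r in reports
        )
        return total_seconds / len(reports) / 60

    # Example: one report handled in 45 minutes, one overnight (9.5 hours).
    reports = [
        ContentReport(datetime(2023, 9, 1, 9, 0), datetime(2023, 9, 1, 9, 45)),
        ContentReport(datetime(2023, 9, 1, 22, 0), datetime(2023, 9, 2, 7, 30)),
    ]
    print(daily_report_volume(reports))       # Counter({datetime.date(2023, 9, 1): 2})
    print(average_response_minutes(reports))  # 307.5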

Our response

Taking into account BitChute’s willingness to make timely improvements to its systems and processes to directly address the concerns we identified following the Buffalo incident, and our desire to work with industry to secure changes that protect users[2], we have decided not to open an investigation against BitChute into its compliance with its duties under Part 4B of the Communications Act 2003 at this time[3]. We will, however, closely monitor the implementation of the proposed changes and the impact these changes have on user safety.

We also note that, on 14 June 2023, BitChute became a member of the Global Internet Forum to Counter Terrorism (GIFCT). GIFCT is a cross-industry initiative designed to prevent terrorists and violent extremists from exploiting digital platforms. Whilst we do not consider this an indicator of compliance, it is an encouraging step – GIFCT has rigorous standards for membership, including demonstrating "a desire to explore new technical solutions to counter terrorist and violent extremist activity online" and "support for expanding the capacity of civil society organisations to challenge terrorism and violent extremism".

While we welcome BitChute's commitments to further improvements and measuring their effectiveness, we are aware of reports – some of which have been communicated to us directly – alleging that content likely to incite violence and hatred continues to be uploaded to BitChute, can be accessed easily, and may pose significant risks to users.

It is important to note that the VSP regime is a systems and processes regime, meaning the presence of harmful video content on a service is not in itself a breach of the rules. Accordingly, Ofcom’s focus is to drive improvements to platforms’ systems and processes to minimise the risks of users encountering harmful videos online in the first place.

However, such content can be indicative of an underlying issue with the user protections in place, and we will therefore continue to monitor BitChute closely to assess whether the changes it has made to its user reporting and content moderation systems result in tangible improvements to user safety. If we find that, despite BitChute’s assurances and improvements, users are not being adequately protected from the categories of harmful material covered by the VSP regime, we will not hesitate to take further action, including formal enforcement action if necessary.

Footnotes:

  1. On 19 September 2023, the Online Safety Bill received Parliamentary approval and will soon achieve Royal Assent, meaning it will become law. However, the VSP regime will remain in force for a transitional period, meaning that all pre-existing, UK-established VSPs will remain subject to the obligations in Part 4B of the Communications Act 2003 until it is repealed through future secondary legislation. Given the two regimes’ shared objective to improve user safety by requiring services to protect users through the adoption of appropriate systems and processes, Ofcom considers that compliance with the VSP regime will assist services in preparing for compliance with the online safety regime as set out in the Online Safety Bill. However, our actions and decisions under the VSP regime do not preclude or set direction for any policy or enforcement decisions we may make under the Online Safety regime once it is in force.
  2. See our plan and approach to regulating VSPs.
  3. Our Enforcement Guidelines (PDF, 754.9 KB) explain why and how Ofcom opens enforcement investigations. We may choose to take steps to resolve the issue without formal enforcement action where, for example, we are satisfied that the relevant business has taken, or has offered assurances that it will take, appropriate steps to address any concerns we have identified.

Contact: Enforcement team (enforcement@ofcom.org.uk)
Case reference