29 November 2023

How the Online Safety Act will help to protect women and girls

Online interactions play a major role in our daily lives. While most people have positive experiences online, for many women and girls life online can be an extension of harmful gender dynamics that exist in wider society.

Our research shows women are more negatively affected by hateful and trolling content, and feel less able to have a voice and share their opinions online. Women and girls are also disproportionately affected by certain kinds of online harms, such as intimate image abuse, cyberflashing, and controlling or coercive behaviour.

These harms can include a range of behaviours that threaten, humiliate, monitor, or silence women and girls, and can extend beyond the online space. Importantly, different groups of women and girls are affected differently by online harm. Age, sexuality, gender identity, race and ethnicity, along with many other factors, influence women and girls’ experiences online. For example, women and girls from minority ethnic backgrounds are more at risk of experiencing online harm, and are twice as likely to believe that the risks of being online outweigh the benefits.

What we’re doing to address harms to women and girls online

Ofcom takes the protection of women and girls online very seriously. Under the Online Safety Act, online services such as social media and search services will have duties to protect users’ safety and their rights – understanding and addressing the experiences of women and girls online is central to this.

Earlier this month, we published our first consultation on our approach to online regulation. The consultation focuses on harm arising from illegal content. It includes our draft guidance on how service providers should assess the risks on their services for harms including intimate image abuse and coercive and controlling behaviour, and our draft codes of practice recommending safety measures they can put in place. We will also be publishing future consultations focusing on how we expect services to protect children online and on transparency. The measures set out in these consultations will contribute to tackling online gendered violence and abuse, but we recognise there is more work to do.

Once the major consultations have been completed, in the first half of 2025 we intend to publish draft guidance for online services focusing on content and activity which disproportionately affects women and girls. The draft guidance will contain advice and best practice for services on how they can tackle online gendered harms. In developing this guidance, we will work with experts across civil society, industry, government agencies and academia, and will bring in the perspectives of survivors and victims of online gendered violence and abuse.

In addition, we plan to build on our existing media literacy work to help make people more aware of harmful gendered content and activity online, and to help reduce it. We will do this by partnering with organisations to support the training of local youth workers and educators on online misogyny, and by working with boys and men to promote positive gender relations.

Our work across the consultations, dedicated guidance and media literacy is aimed at supporting regulated services in better protecting women and girls, so that women and girls can live safer lives online.

More information about our consultations

Our first consultation includes our draft register of risks and risk profiles on illegal harm, which provides information for services on how certain functionalities or features of their services might give rise to risk. This includes how user demographics (including gender) influence risk of different kinds of illegal harm.

The consultation also includes our draft codes of practice, which set out the measures we are proposing to recommend for services to protect users from illegal harm. We expect these measures – including those on enhanced user control and reporting and complaints – to play a part in protecting women and girls, as well as other vulnerable users. Finally, our draft Illegal Content Judgments Guidance helps services to recognise what illegal content might look like, including illegal harms which disproportionately affect women and girls.

Next year, we will be publishing our second consultation which will focus on the protection of children from features and content that might be harmful to them, such as violent and abusive material. We’re also looking ahead to our user empowerment consultation, which will set out our expectations for services on how users can control their experiences online.
