2 August 2022

Crossing the line: Seven in ten Premier League footballers face Twitter abuse

  • Ofcom discloses machine-learning analysis of 2.3 million tweets in the first half of last season
  • Nearly 60,000 abusive posts sent in the period, affecting seven in ten Premier League players
  • Just twelve players receive half of all abuse, each receiving an average of 15 abusive tweets daily
  • Ofcom event hears from Gary Lineker and others about this serious problem in the national game

As the new season warms up for kick-off, Ofcom reveals the scale of personal attacks suffered by Premier League footballers every day on Twitter, and sets out what must be done collectively to tackle the issue.

Ofcom, which is preparing to regulate tech giants under new online safety laws, teamed up with The Alan Turing Institute to analyse more than 2.3 million tweets directed at Premier League footballers over the first five months of the 2021/22 season.[1]

The study developed a new machine-learning tool that can automatically assess whether tweets are abusive.[2] A team of experts also manually reviewed a random sample of 3,000 tweets.[3]
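For illustration only, the sketch below shows the general shape of this kind of automated assessment: a classifier is trained on a small set of manually labelled tweets and then applied to unlabelled ones. The example data, the scikit-learn pipeline and the binary abusive/not-abusive labels are placeholder assumptions, not the model or training data used in the Ofcom and Turing Institute study.

    # Minimal sketch of an automated abuse classifier: train on hand-labelled
    # tweets, then flag unseen tweets. Placeholder data and model, for illustration.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Placeholder labelled examples (1 = abusive, 0 = not abusive).
    train_texts = [
        "You were brilliant today, what a goal!",
        "Great performance, proud of this team.",
        "Terrible decision by the manager, poor tactics.",
        "You are a disgrace, get out of our club you idiot.",
        "Absolutely useless, I hope you never play again.",
        "Rubbish all season, an embarrassment to the shirt.",
    ]
    train_labels = [0, 0, 0, 1, 1, 1]

    # Word and bigram features feeding a linear classifier: a simple baseline.
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(train_texts, train_labels)

    # Apply the trained model to new, unlabelled tweets.
    new_tweets = ["Brilliant goal today!", "You are useless, get out of our club."]
    for tweet, label in zip(new_tweets, model.predict(new_tweets)):
        print("ABUSIVE" if label == 1 else "ok", "-", tweet)

A production classifier would, of course, be trained and validated on far more labelled data than this toy example, which is one reason a manually reviewed sample sits alongside the automated analysis.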

What we found

  • The vast majority of fans use social media responsibly. Of the manually reviewed random sample of 3,000 tweets, 57% were positive towards players, 27% were neutral and 12.5% were critical. However, the remaining 3.5% were abusive. Similarly, of the 2.3 million tweets analysed with the machine-learning tool, 2.6% contained abuse.[4]
  • Hundreds of abusive tweets are sent to footballers every day. While the proportion of abusive tweets might be low, this still amounts to nearly 60,000 abusive tweets directed towards Premier League players in just the first half of the season – an average of 362 every day, equivalent to one every four minutes. Around one in twelve personal attacks (8.6%) targeted a victim’s protected characteristic, such as their race or gender.
  • Seven in every ten Premier League players are targeted. Over the period, 68% of players (418 out of 618) received at least one abusive tweet, and one in fourteen (7%) received abuse every day.
  • A handful of players face a barrage of abuse. We recorded which footballers were targeted and found that half of all abuse towards Premier League players was directed at just twelve individuals, who each received an average of 15 abusive tweets every day (a rough check of these figures follows below).
[Infographic: 2.3 million tweets analysed; 7 in 10 players targeted by abusive tweets; an abusive tweet sent every 4 minutes; 50% of all abuse directed at just 2% of players; 1 in 14 targeted by abuse every day]
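As a rough cross-check of the headline figures, the short calculation below reproduces the per-day and per-minute rates from the period dates and totals quoted above. The abusive-tweet total used here is an assumed approximation (362 a day over the period), not the report's exact count.

    # Rough arithmetic behind the headline figures above; the abusive-tweet
    # total is an approximation, not the report's exact count.
    from datetime import date

    abusive_tweets = 59_730      # "nearly 60,000" abusive tweets (approximation)
    players_total = 618          # Premier League players covered by the analysis
    players_abused = 418         # players receiving at least one abusive tweet
    heavily_targeted = 12        # players receiving half of all abuse

    # 13 August 2021 (season start) to 24 January 2022 (winter break), inclusive.
    days = (date(2022, 1, 24) - date(2021, 8, 13)).days + 1
    per_day = abusive_tweets / days
    minutes_between = 24 * 60 / per_day

    print(f"{per_day:.0f} abusive tweets a day, roughly one every {minutes_between:.0f} minutes")
    print(f"{players_abused / players_total:.0%} of players received at least one abusive tweet")
    print(f"half of all abuse aimed at just {heavily_targeted / players_total:.0%} of players")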

In a separate poll, we also asked the public about their experiences of players being targeted online. More than a quarter of teens and adults who go online (27%) saw online abuse directed at a footballer last season. This rises to more than a third of fans who follow football (37%), and is higher still among fans of the women’s game (42%).

Among those who came across abuse, more than half (51%) said they found the content extremely offensive, but a significant proportion (30%) took no action in response. Only around one in four (26%) used flagging and reporting tools to report the abusive content to the platform or marked it as junk.

Ofcom is holding an event today (2 August) to discuss these findings. Hosted by broadcast journalist and BT Sport presenter Jules Breach, the event will hear from:

  • presenter and former England player Gary Lineker;
  • Manchester United player Aoife Mannion;
  • Professional Footballers' Association Chief Executive Maheta Molango; and
  • Kick It Out Chair Sanjay Bhandari.

What needs to be done

These findings shed light on a dark side to the beautiful game. Online abuse has no place in sport, nor in wider society, and tackling it requires a team effort.

Social media firms needn’t wait for new laws to make their sites and apps safer for users. When we become the regulator for online safety, tech companies will have to be really open about the steps they’re taking to protect users. We will expect them to design their services with safety in mind.

Supporters can also play a positive role in protecting the game they love. Our research shows the vast majority of online fans behave responsibly, and as the new season kicks off we’re asking them to report unacceptable, abusive posts whenever they see them.

Kevin Bakhurst, Ofcom’s Group Director for Broadcasting and Online Content

These stark findings uncover the extent to which footballers are subjected to vile abuse across social media. Prominent players receive messages from thousands of accounts daily on some platforms, and it wouldn’t have been possible to find all the abuse without these innovative AI techniques.

While tackling online abuse is difficult, we can’t leave it unchallenged. More must be done to stop the worst forms of content to ensure that players can do their job without being subjected to abuse.

Dr Bertie Vidgen, lead author of the report and Head of Online Safety at The Alan Turing Institute

What will online safety laws mean?

The UK is set to introduce new laws aimed at making online users safer, while preserving freedom of expression. The Online Safety Bill will introduce rules for sites and apps such as social media, search engines and messaging platforms – as well as other services that people use to share content online.

The Bill does not give Ofcom a role in handling complaints about individual pieces of content. The Government recognises – and we agree – that the sheer volume of online content would make that impractical. Rather than focusing on the symptoms of online harm, we will tackle the causes – by ensuring companies design their services with safety in mind from the start. We will examine whether companies are doing enough to protect their users from illegal content, as well as content that is harmful to children.

Notes to editors

  1. From the start of the 2021/2022 season (13 August 2021) to the winter break (24 January 2022).
  2. The Alan Turing Institute is the UK’s national institute for data science and artificial intelligence. The Institute is named in honour of Alan Turing, whose pioneering work in theoretical and applied mathematics, engineering and computing is considered to have laid the foundations for modern-day data science and artificial intelligence. The Institute’s goals are to undertake world-class research in data science and artificial intelligence; apply its research to real-world problems, driving economic impact and societal good; lead the training of a new generation of scientists; and shape the public conversation around data and algorithms. Part of The Alan Turing Institute’s Public Policy Programme, the Online Safety Team provides objective, evidence-driven insight into online safety, supporting the work of policymakers and regulators, informing civic discourse and extending academic knowledge. The team works to tackle online hate, harassment, extremism and mis/disinformation. The AI model used to identify the abusive tweets was developed as part of The Alan Turing Institute’s Online Harms Observatory, led by its Online Safety Team.
  3. Online abuse is a problem across platforms, and this research is not intended as a reflection, or commentary, on Twitter’s trust and safety practices. We chose Twitter for this study because it is a widely used platform on which many Premier League football players are active; because several players have reported being abused on Twitter before, such as during the Euro 2020 finals; and because, unlike most platforms, Twitter makes data available for academic research (an illustrative data-collection sketch follows these notes).
  4. Definitions of positive, neutral, critical and abusive tweets:
    • Abusive: The tweet threatens, insults, derogates, dehumanises, mocks or belittles a player. This can be implicit or explicit, and includes attacks against their identity. We include use of slurs, negative stereotypes and excessive use of profanities.
    • Critical: The tweet makes a substantive criticism of a player’s actions, either on the pitch or off it. It includes critiquing their skills, their attitude and their values. Criticism is usually less aggressive and emotive than abuse.
    • Positive: The tweet supports, praises or encourages the player. It includes expressing admiration for a player and their performance, and wishing them well.
    • Neutral: The tweet does not fall into the other categories and does not express a clear stance. Neutral statements include unemotive factual statements and descriptions of events.
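To illustrate note 3’s point about research access to Twitter data, the sketch below shows how tweets mentioning a player could be collected through the Twitter API v2 full-archive search using the tweepy library. The bearer token, player handle, date range and field choices are placeholders, and this is not the collection pipeline used in the study.

    # Illustrative sketch of collecting tweets that mention a player via the
    # Twitter API v2 full-archive search (academic research access); not the
    # pipeline used in the Ofcom / Turing Institute study.
    import tweepy

    client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN", wait_on_rate_limit=True)

    # Hypothetical player handle; exclude retweets to count only original posts.
    query = "@PlayerHandle -is:retweet"

    tweets = []
    for page in tweepy.Paginator(
        client.search_all_tweets,              # full-archive search endpoint
        query=query,
        start_time="2021-08-13T00:00:00Z",     # start of the 2021/22 season
        end_time="2022-01-24T00:00:00Z",       # winter break
        tweet_fields=["created_at"],
        max_results=500,
    ):
        tweets.extend(page.data or [])

    print(f"Collected {len(tweets)} tweets mentioning the player")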

Contact the media team

If you are a journalist wishing to contact Ofcom's media team:

Call: +44 (0) 300 123 1795 (journalists only)
