While Arsenal's season has been marked by consistency on the pitch as they fell just short of glory for a third season running, one off-field matter in particular has united their players and staff.
Kai Havertz and his pregnant wife Sophia were targeted with horrific social media abuse after the Gunners’ penalty shootout defeat by Manchester United in the FA Cup third round in January.
Havertz had missed a number of presentable chances during the match, before missing a crucial spot-kick in the shootout. On the evening of the match, a distraught Sophia shared two posts on her Instagram story, including one in which an account threatened to ‘slaughter’ her unborn baby.
Arsenal’s response, bolstered by a furious Mikel Arteta, reflected a longstanding behind-the-scenes effort to tackle online vitriol. They are at the forefront of addressing this issue in football, aided by Signify, a British data science company.
Signify operate globally in sport, including at the FIFA World Cup and the Olympics, using their Threat Matrix AI service to identify and address online threats. They work with several Premier League teams, though Arsenal have been a partner for the longest. The link-up started in 2021 and is having a far-reaching impact.
Mail Sport understands that following the abuse targeted at referee Michael Oliver after his decision to send off Myles Lewis-Skelly against Wolves on January 25 - later rescinded by the FA on appeal - Arsenal offered to help the PGMOL with identifying the perpetrators through their partnership with Signify.
Since Arsenal started working with Signify in 2021, there has been a 90 per cent decrease in online abuse from club members. But how do they do it? We were given a unique look behind the scenes into Signify's inner workings. Here's what we found.
Signify was founded in 2017 to address the rise of online abuse targeting politicians in the wake of Brexit. As MPs faced threats that spilled into real-life violence, the company was hired to track down abusive accounts.
By 2019 the company, made up of around 20 full-time staff in London plus a network of analysts around the world, had pivoted to global sport. They are now a major player in this area.
‘Before we launched our Threat Matrix service, solutions were heavily focused on hiding abuse, and sweeping the issue under the carpet,’ a Signify spokesperson told Mail Sport.
Arsenal are said to be leading the way in this space through an orchestrated effort which involves multiple departments within the club; the Havertz case is a key example.
‘Inside the club, the player support networks are really strong so there is a clear holistic approach around player wellbeing,’ said a Signify source. ‘The club are very good in terms of how they use the information we provide and what then to do from the player’s perspective.
‘Arsenal has a joined-up approach to dealing with online abuse and threat. Sometimes we will work with clients and it might be one particular department we liaise with. Whereas with Arsenal, it's across different departments.
‘They are leaders in this space. The systems they have in place, their level of support for players, the action they have taken to ban members and season-ticket holders and, crucially, communicating these bans.’
In the aftermath of their defeat to United, Arteta emotionally addressed the vitriol targeted at Havertz and his wife.
‘It’s incredible, honestly,’ he said, visibly seething. ‘We really have to do something about it, because accepting that and hiding this has terrible consequences.
‘We are all responsible. We cannot look somewhere else. That’s a really serious matter. It affects me. It affects him (Havertz) and everybody that is in the industry.
‘We can accept it and say that’s our job, but there are certain limits and the line has to be drawn. What is next in football is that this should be prohibited. It cannot happen. That’s it.’
This public response was a small part of their wider strategy. Mail Sport understands that Signify send the club a weekly report about accounts posting serious abuse and details about the perpetrators. But Arsenal also monitor social media themselves, and can flag accounts for Signify to find the person responsible.
Internally, the club liaise with the stadium management team to identify the individual in question through CCTV and to check whether they hold a membership. If the Gunners ban a season-ticket holder for five years, for example, that is in effect a 25-year ban: once the ban has elapsed, the person goes to the back of the season-ticket waiting list, which runs to more than 20 years and 100,000 people.
While that happens, the player welfare team goes through a process with the affected player - it applies to both the men’s and women’s teams. Players can pursue criminal prosecution if they wish to, and have access to therapy if they need it.
The Gunners take the online sphere seriously. It’s incorporated into their football operation - they see how it can affect player performance. For example, internal research by Signify around NBA/WNBA players found that social media vitriol can have a serious effect on player mental health and, in turn, performance.
Mail Sport is aware of another Big Six club who are clamping down on the matter by installing blocking systems on phones so players can only see comments from accounts they choose to follow. The club also offer increased support to players during periods where there is increased abuse, such as when the team is performing poorly.
At the Euros in 2021, Arsenal star Bukayo Saka and his England team-mates Marcus Rashford and Jadon Sancho were all targeted by racist vitriol online for missing penalties in the loss to Italy in the final.
Some trolls used emojis in place of racial slurs to try to go undetected, which at the time was quite a new trend. It eventually failed because the likes of Signify use machine learning, a type of AI that automatically identifies patterns, makes decisions and improves through experience and data.
‘Context and nuance are crucial for accurate analysis,’ said a Signify spokesperson. ‘Somebody saying, “X player should have shot more” and “X player should be shot” require entirely different analysis.
'We use machine learning and AI to analyse the huge scale of comment aimed at players, officials, and clubs and sift what’s relevant from the noise.
‘However, to ensure accuracy we will always deploy a human element for assessing the context, gravity of abuse or discrimination, and crucially the reality of threat.’
In essence, it’s a two-step verification process before the company actually acts on anything. The human check asks: is it abusive in context, is it discriminatory, and is it genuinely threatening?
The focus is on targeted abuse and threats aimed at a player, a fan or a member of club staff; they are not seeking to police free speech or to change football fan culture.
Another important check: is it physically possible to carry out the threat? In other words, assessing the perpetrator’s location in proximity to the individual in question.
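To make that two-step idea concrete, here is a minimal, hypothetical sketch in Python. It is not Signify’s Threat Matrix: a simple keyword match stands in for the machine-learning sift, and the human stage is reduced to a handful of yes/no checks for context, discrimination and physical feasibility. All names, patterns and posts below are invented for illustration.

```python
from dataclasses import dataclass, field
from typing import List

# Placeholder phrases standing in for a trained classifier's output.
THREAT_PATTERNS = ("should be shot", "slaughter", "i will find you")

@dataclass
class Post:
    author: str
    text: str
    tags_monitored_handle: bool  # does the post tag a monitored player or official?

@dataclass
class ReviewItem:
    post: Post
    reasons: List[str] = field(default_factory=list)

def automated_sift(posts):
    """Stage one: sift the noise, keeping only posts aimed at a monitored
    person that match a threat pattern (a stand-in for the ML model)."""
    flagged = []
    for post in posts:
        text = post.text.lower()
        hits = [p for p in THREAT_PATTERNS if p in text]
        if post.tags_monitored_handle and hits:
            flagged.append(ReviewItem(post, reasons=hits))
    return flagged

def human_review(item, in_context, discriminatory, physically_feasible):
    """Stage two: an analyst confirms context, gravity and the reality
    of the threat before anything is escalated to the club."""
    return in_context and (discriminatory or physically_feasible)

if __name__ == "__main__":
    posts = [
        Post("fan_a", "@player should have shot more in the second half", True),
        Post("troll_b", "@player should be shot for that miss", True),
    ]
    for item in automated_sift(posts):
        decision = human_review(item, in_context=True,
                                discriminatory=False, physically_feasible=True)
        print(item.post.author, item.reasons,
              "escalate" if decision else "dismiss")
```

Note how the first post ("should have shot more") never reaches the human stage, while the second ("should be shot") does - the distinction the Signify spokesperson describes.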
The company then uses open-source intelligence (OSINT) techniques to track down and identify perpetrators. It’s a cat-and-mouse game in which the culprits are ever-evolving.
OSINT means working only with information that is publicly available. Examples include social media monitoring, web scraping and public-record checks.
They don’t breach privacy settings (such as sending a follow request to a private account), use surveillance technology or act in clandestine ways.
What helps them is that even if a perpetrator is hiding behind a private account, their comments surface in the public domain when they tag the athlete’s handle.
That exposes their username and gives Signify a starting point to find out more about them.
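As a purely illustrative example of that starting point (not Signify’s actual tooling), the snippet below scans a handful of invented public replies for mentions of a monitored handle and collects the authors’ usernames - the kind of lead that further open-source checks would then build on.

```python
import re

# Illustrative only: invented handle and invented replies.
MONITORED_HANDLES = {"@player7"}

public_replies = [
    {"author": "angry_punter_99", "text": "@player7 you cost me my bet, watch your back"},
    {"author": "casual_fan", "text": "Unlucky today @player7, head up"},
]

def starting_points(replies, handles):
    """Return authors whose public replies tag a monitored handle,
    mapped to the replies that exposed them."""
    found = {}
    for reply in replies:
        mentions = set(re.findall(r"@\w+", reply["text"].lower()))
        if mentions & handles:
            found.setdefault(reply["author"], []).append(reply["text"])
    return found

print(starting_points(public_replies, MONITORED_HANDLES))
```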
One pattern that has been picked up is frustrated gamblers targeting an athlete with abuse before a match to disturb their psyche before a performance.
X is the most common platform this happens on. It is widely seen as ‘the platform of live sport’ because of the endorphin rush fans get from being first to information and getting highlights immediately.
TikTok, Instagram and Facebook are popular, too, but they have not cracked this area in the same way as X.
Overall, the football landscape in the online sphere has become increasingly toxic.
If every club across the football league were to fully engage with tracking down and tackling serious vitriol, fanbases on the whole would likely be less toxic.
It does appear, though, that such change in the top flight would require external help from the league. The Premier League have their own external investigations team to deal with social media abuse, but Mail Sport understands that some clubs don’t have faith in it, which is why they turn to the likes of Signify.
There is some way to go on the pitch for Arteta’s men, but off it, Arsenal are well set to continue leading the fight in tackling online abuse in football.