After the Labour Party’s victory in the British general election on July 4, 2024, and Keir Starmer’s appointment as Prime Minister, Britain witnessed a rising wave of far-right violence spreading across the United Kingdom. The surge followed a stabbing attack in Southport on July 29, 2024, that killed three girls. Unverified misinformation circulated afterward alleging that a Muslim immigrant had carried out the attack. Social media has played a significant role in this turbulent scene, acting as a catalyst for violence and hate speech against others.
Inciting Violence
An analysis of the role of social media in the recent violence in the UK reveals several key features:
Lifting Bans on Far-Right Figures: Many analyses indicate that one of the main reasons for the far-right’s prominence is that Elon Musk, CEO and owner of X (formerly Twitter), lifted bans on numerous far-right figures previously suspended for violating hate speech rules. Prominent among them are English Defence League founder Stephen Yaxley-Lennon, known as Tommy Robinson, and Katie Hopkins. Since Robinson’s account was reactivated on X in 2023, his presence on social media has grown. His recent marches in London attracted tens of thousands, and his followers on X now exceed 800,000, giving him a vast reach across social media. Following the recent events, Robinson has regularly posted hate-inciting messages against Islam and immigrants, shared videos of the unrest, and encouraged people to join future protests.
Spread of Misinformation: Protests initially erupted over the stabbing deaths of three girls aged six, seven, and nine, and the stabbing of others, in Southport on July 29, 2024. Social media was exploited to spread misleading, hate-inciting claims about the 17-year-old suspect’s identity, helping to mobilize people widely. Shortly after the Southport attack, a false name began circulating on X alongside posts alleging the attacker had recently arrived in the UK by boat and was Muslim. The day after the attack, the fake name received over 30,000 mentions on X from more than 18,000 different accounts, including verified accounts, according to the Institute for Strategic Dialogue. Infographics promoting protests in Southport and Whitehall were shared on TikTok and Telegram, while protest details were organized on X. Content recommendation algorithms, which typically promote posts with high engagement, amplified the misinformation about the attacker. In fact, the suspect, Axel Rudakubana, charged with three counts of murder and 10 counts of attempted murder at Liverpool Crown Court on August 1, was born in Cardiff to parents who immigrated from Rwanda; he was neither Muslim nor an immigrant.
Amplifying Far-Right Voices: Posts inciting hatred against Muslims and immigrants spread across social media, fueled by early interventions from numerous far-right influencers and conspiracy theorists, including Robinson, his ally Danny Tommo, and Reclaim Party leader Laurence Fox. These posts circulated widely on X for several hours despite violating the platform’s stated policies. Organizations such as Hope Not Hate tracked more than 30 far-right protests promoted on social media, which amplified extremist voices and ideas and made it easier to target hotels housing asylum seekers.
Facilitating Hidden Communication Among Far-Right Groups: Social media played a key role in strengthening networks and links among far-right groups and individuals, providing an easy, fast, and inexpensive means of voicing and inflaming shared grievances and of organizing protests at short notice with little prior arrangement, which poses an additional challenge for the police. These platforms are also exploited to recruit supporters through viral posts, rapidly spread by recommendation algorithms on platforms like TikTok and X, and through Telegram channels created to promote their views.
Interlinking Alternative and Mainstream Platforms: Experts noted that far-right adherents created links between alternative platforms like Bitchute, Parler, and Gab, which offer more freedom for expressing their extremist ideas, and mainstream social media platforms, allowing them to expand their target audience. This interlinking raises questions about the limits of content regulation on alternative platforms.
Alignment with the Decentralized Nature of Far-Right Networks: Prime Minister Starmer faces the challenging responsibility of curbing the unrest, especially given the complexity of far-right networks, which have evolved from formally organized methods like the defunct British National Party to fragmented, personally driven groups. Instead of a single entity that can be banned or punished, the police now face an “unclear opponent” using social media to manage operations and organize protests. This decentralized nature of far-right activity complicates monitoring and tracking efforts.
London’s Response
Social media’s role in enabling far-right activity has pressed the British government to seek policies for managing these platforms. The British response can be summarized as follows:
Starmer Accusing Social Media Platforms of Violence:
Prime Minister Keir Starmer accused social media companies of bearing responsibility for the violent protests during a press conference at Downing Street. He stated, “Violent unrest clearly incited online is also a crime. It happens on your premises, and the law must be upheld everywhere.” In his televised address, the Prime Minister emphasized the importance of government and tech companies “working together” to keep the country safe, stating that “blaming everyone and pointing fingers” is ineffective and that “a discussion needs to be had” about companies striking “the right balance.” Home Secretary Yvette Cooper echoed these accusations on BBC Radio 5 Live, declaring that social media platforms had put “rocket boosters” under content promoting far-right behavior, and that “these companies will bear some responsibility for this.” The criticism extended beyond government members: prominent politician Rory Stewart challenged Elon Musk’s claims on Musk’s own platform, asking, “Since when did you claim to understand British society or politics? How many days have you spent with these communities?” He added, “Did it ever occur to you that this might be the wrong time to discuss a topic you know nothing about?”
Attempts to Enforce Legal Framework for Social Media Platforms:
However, it is evident that this approach will not work with some platforms. Elon Musk ignored the British Prime Minister’s warnings, responding with exclamation marks to a post by far-right activist Tommy Robinson on X. Musk also weighed in on a post blaming mass migration and open borders for the recent unrest in Britain, warning that “civil war is inevitable” in the country. Starmer’s spokesperson said there was “no justification” for Musk’s comments. Musk later criticized Starmer over a post on X identifying mosques as needing special protection, indicating Musk’s own role in spreading misinformation and a lack of genuine intent to combat it. This may necessitate stricter controls by the British government in the future, including enforcing the Online Safety Act 2023, which places specific responsibilities on social media companies to ensure user safety on their platforms. Although the UK passed the Online Safety Act nearly a year ago, Ofcom, the regulator responsible for enforcing it, announced that it could not yet take action against social media companies because the law’s provisions had not fully come into effect. Ofcom also stated it would need to consult on “codes of practice and guidelines” before imposing sanctions on social media companies for hosting harmful content, and that such sanctions would not extend to individual posts or accounts. As a last resort, the British government may rely on offences defined under the Public Order Act 1986, the primary legislation penalizing violence and intimidation by individuals or groups. Applying these offences would treat such crimes the same whether committed online or offline, since the Online Safety Act adds little to existing criminal law covering incitement to violence.
Direct Meetings with Tech Company Representatives:
On the other hand, UK Technology Minister Peter Kyle met with representatives from social media platforms, including TikTok, Meta, Google, and X, to remind them of their responsibility to stop the spread of hate, racism, and incitement to violence, emphasizing the need for prompt action in handling the substantial amount of circulating content.
Increasing Calls for Fines on Social Media Platforms:
BCS, The Chartered Institute for IT (the British Computer Society), the UK’s professional body for computing, called on Ofcom to impose fines on X, citing the platform’s disregard for public safety in hosting hate- and violence-inciting content. Such calls may increase in the coming period given Elon Musk’s policies.
In conclusion, the events in Britain may be part of a global resurgence of populism and far-right politics aimed at bringing profound changes to state structures. The brutal killing of children in Southport was exploited to incite societal hatred toward Islam and immigrants, fueled by misinformation spread by far-right influencers and by social media platform owners themselves, who seek to inflame tensions. This scenario could be repeated in other countries wherever a conducive environment exists.