Published On: January 19, 2024

This article by Henry Parker, Head of Government Affairs, and Dr. Joe Ondrak, Research, Technology and Policy Lead for Logically, highlights the growing threat of online misinformation, particularly the blurring line between digital activity and real-world impact. It points to social media platforms as facilitators of harmful conspiracy theories, with hostile states leveraging these narratives to polarise society and extremist groups shaping a new digital path to radicalisation. The article advocates a public policy response, emphasising the Online Safety Act 2023, and stresses the need for a proactive strategy addressing the link between conspiracy theories and real-world harm in the digital space.

Can we ever know where the online world stops and the offline real world begins? This question is at the centre of the debate on whether we should be paying more attention to the huge volumes of misinformation and disinformation circulating online. Understanding the relationship between the potentially harmful content a person sees and how they choose to act in response to it is a critical capability if we are to build preparedness against such threats. Misinformation (erroneous ‘facts’) and disinformation (the deliberate dissemination of ‘alternative truths’) are an increasing threat. Whilst there is work to be done to better understand the behavioural mechanisms at play, what we can say now, with some certainty, is that some people do act and react to what they see on social media. This has the potential to cause harm both online and when it spills into the offline world.

Whether by accident or by design, social media platforms effectively encourage widespread exposure to conspiracy theories. Though often dismissed as harmless, belief in conspiracy theories frequently results in discrimination towards one or more groups. This provides fertile shared ground between conspiracists and those looking for a focus for their political or ideological action. Because they thrive on a narrativised and connected worldview, conspiracy theories are themselves vulnerable to exploitation as a form of mis- or disinformation, or to the insertion of additional narratives by different actors.

Social media accelerates this process. Researchers at the University of California, Berkeley trained an algorithm to detect and classify conspiratorial content on YouTube. Words such as ‘deep state’, ‘hoax’, and ‘Illuminati’ were classified as conspiratorial. The researchers then looked at the first 20 recommendations from a set of over a thousand popular YouTube channels. The results were clear: over three months, if you watched a conspiracy theory video, the likelihood of receiving a recommendation to watch another one was about 70%. This is how rabbit holes are formed, and they are quite easy to fall into.
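
To make the mechanics concrete, here is a minimal sketch of this style of analysis: a naive keyword classifier applied to video metadata, followed by a count of how often a flagged video's recommendations are themselves flagged. The keyword list, data shapes and sample data are illustrative assumptions, not the Berkeley team's actual method.

```python
# A toy version of the analysis described above. Not the researchers'
# actual pipeline: keyword list, data structures and sample data are
# assumptions made for illustration only.

CONSPIRACY_KEYWORDS = {"deep state", "hoax", "illuminati"}

def is_conspiratorial(title: str, description: str) -> bool:
    """Flag a video if any keyword appears in its title or description."""
    text = f"{title} {description}".lower()
    return any(keyword in text for keyword in CONSPIRACY_KEYWORDS)

def follow_on_rate(videos: list[dict]) -> float:
    """Fraction of recommendations from flagged videos that are also flagged."""
    flagged_recs = total_recs = 0
    for video in videos:
        if not is_conspiratorial(video["title"], video["description"]):
            continue
        for rec in video["recommendations"]:  # e.g. the first 20 recommendations
            total_recs += 1
            if is_conspiratorial(rec["title"], rec["description"]):
                flagged_recs += 1
    return flagged_recs / total_recs if total_recs else 0.0

# Toy example: one flagged video whose two recommendations are both flagged.
sample = [
    {
        "title": "The deep state exposed",
        "description": "",
        "recommendations": [
            {"title": "Moon landing hoax?", "description": ""},
            {"title": "Illuminati symbols explained", "description": ""},
        ],
    },
]
print(f"conspiratorial follow-on rate: {follow_on_rate(sample):.0%}")
```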

If that were not enough, we also see increasingly sophisticated disinformation efforts – or ‘influence operations’ – from hostile states. Such efforts are rarely the major interventions in democratic processes that create so many headlines, or form the plots of highly entertaining TV mini-series. Instead, they are constant and ongoing campaigns of activity, and they too are designed to appeal to a conspiracy mindset and to slowly erode the boundary between reality and alternative versions of the truth. Hostile state actors, Russia in particular, are aware of, and directly appeal to, a variety of domestic online groups that are receptive to their messaging. These campaigns fill the ‘trust gap’ that domestic fringe groups often have in relation to more mainstream explanations of the way the world around them operates.

In a June 2023 study, Logically spent two months tracking 15 conspiracy narratives that had been seeded by Russian sources and had found purchase in fringe online groups within the UK. The narratives can be linked to 50 distinct items, or groups of content, that have circulated between Russian state or pro-Kremlin sources and domestic fringe channels. These campaigns specifically aim to exploit conspiracist mindsets, with the overall goal of promoting societal polarisation.

Why we should take conspiracy theorists seriously

It is the interaction between conspiracy communities and extremist groups that gives us most cause for concern. Logically has been working with Coventry University and the Commission for Countering Extremism to explore the question of whether certain content types can fuel radicalisation. The partnership has designed a suite of training for Prevent practitioners and Counter-Terrorism officers, exploring the impact of misinformation and disinformation, how they can feed radicalisation, and the scope digital channels have to change the entire landscape of adherence to extremist ideologies.

Logically has also comprehensively mapped the types of conspiracist content and narratives that are cross-posted between conspiracy communities and extremist groups, and the common discourse that these two seemingly distinct communities often share. This work is helping to build an understanding of the pathways to radicalisation that form through the easy spread of content, exposing users to increasingly hardline ideological stances.
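
As an illustration of what measuring cross-posting could look like, the sketch below scores the overlap between the sets of content items observed in two channels. The channel names and item identifiers are hypothetical; the article does not describe Logically's actual mapping methodology.

```python
# Hypothetical illustration of quantifying cross-posting: the Jaccard
# similarity between the sets of content items seen in each channel.
# Identifiers here stand in for URLs, content hashes or narrative labels.

def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap between two sets of content identifiers (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

conspiracy_channel = {"great-reset-video", "wef-meme", "anti-msm-article"}
extremist_channel = {"wef-meme", "anti-msm-article", "recruitment-post"}

print(f"cross-posting overlap: {jaccard(conspiracy_channel, extremist_channel):.2f}")
```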

What has become very clear is that the route to radicalisation – and the real-world harm it causes – is changing because of the dynamic of online channels cross-fertilising ideas, ideologies and narratives. The route to radicalisation is increasingly digital in nature. The lines between what many people would say are harmless fringe interests and radicalisation are blurring. The emergence of the digital space means that the way that radical groups form and organise themselves has fundamentally changed.

In the first instance, this space offers a far broader set of routes for people to access content, with social media being just one of these. As a result, it is much more difficult to trace a clear path to radicalisation than it has historically been. Individuals used to join groups and become indoctrinated through interaction with those groups. Now they join social media channels, many of which have international reach and a complex web of interactions. Radicalisation is a much less linear process than it once was.

Increasingly, it is extreme right-wing ideology that is driving real-world harm, and evidence of this is clear to see. There have been six extreme right-wing-inspired terror attacks in the UK since 2015, most recently Andrew Leak’s attack on a migrant centre in Dover. In December 2022, MI5’s annual report noted that security services had disrupted 37 late-stage plots, a third of which were inspired by right-wing ideologies. Over the period 2015-2022, referrals to Prevent of those aged under 20 because of extreme right-wing beliefs rose from 5% of the total to 12%. Referrals for Islamist radicalism, on the other hand, fell from just under 40% in 2015 to just over 5% today. There have been some high-profile convictions, such as those of Harry Vaughan and the Oaken Hearth Group, that were driven in part by extreme right-wing content accessed through digital platforms.

Conspiracy theories in practice

One distinct genre of online conspiracy theory that is creeping insidiously into the wider popular consciousness – and being leveraged by extreme right-wing groups to attract people towards their ideas and beliefs – is often referred to as the ‘Great Reset’. Its central idea is that global elites planned and stage-managed the Covid-19 pandemic to bring about a world government. The sudden growth of the theory offers a stark warning – from a baseline of almost zero in January 2020, total online mentions of conspiracy theories associated with the Great Reset rose to over 33 million by December 2022.

To many of us, the ideas behind the Great Reset are instantly dismissible, but its adherents believe they are facing an existential threat that must be opposed. Because of the fear driven by the underlying belief, its supporting narratives can be, and are being, skewed towards far-right rhetoric. A ‘Boiled Frog’ scenario is emerging before our eyes, in the form of gradual exposure of believers to increasingly extreme beliefs couched in language, vocabularies and ideals that resonate with them. Conspiracy theories like the ‘Great Reset’ act as a gateway into more extreme right-wing views. They generate what is essentially ‘grievance’ content that exposes people to extreme ‘bridging’ narratives offering apparently plausible explanations for societal grievances. Once someone demonstrates an affinity for this kind of content, the dynamics of social media platforms mean they can be fed it over and over again.

Examples of these ‘bridging narratives’ in play are well-documented. As the BBC recently reported, newspapers such as ‘The Light’ now have a circulation of 100,000 copies a month, and certain Telegram channels have 18,000 followers. On Telegram, users mix content on local politics, health and wellness with content from groups like Patriotic Alternative. Matt Jukes, the UK’s Head of Counter Terrorism, was right to say in the same BBC piece that there is “clear evidence of conspiracy theories being interwoven with extremism”. Logically’s work has identified the same narratives crossing from conspiracy sites into extreme right-wing ones and vice versa. Anti-trans or ‘groomer’ narratives, as well as wider suspicion of the mainstream media, are good examples of infiltration from, and alignment with, anti-minority elements that may result in more agitated demonstrations. Such protests can easily turn violent, supplemented by people who would not previously have been drawn in were it not for the cross-posting and viral spread of content. A recent BBC survey also found that 61.5% of people who said they would have attended rallies linked to common conspiracy theories think violence can be justified at protests.

Misinformation and its effects on democracy

The other key crossover appears to be with what are often called ‘Sovereign Citizen’ groups. This ideology is driven by the idea that the state has no legal jurisdiction over its citizens, and that adherents therefore have the right to take direct action against the state to reclaim their rights. Clearly this has potentially significant implications for the democracy in which we live. ‘The Light’ again offers a window into this world, with prominent adverts for groups like ‘Alpha Men Assemble’, which offer military-style training with the tagline “it’s time we show them who rules the country”.

The clear crossover between online misinformation and real-world harm is an area that requires some form of public policy response. The new Online Safety Act 2023 does offer a path towards curbing problematic content. It places a duty of care on social media platforms to limit the spread of content that is clearly illegal, although it is notable that the law is geared towards mis- and disinformation presenting harm to the individual, as opposed to the kind of societal harm that is starting to emerge from the blurring of ideological divides that we are seeing. It is not at all clear in what circumstances non-state-backed disinformation would or would not be illegal under the Act. This needs to be addressed.

In addition, we need to accept the relationship between conspiracy theories and real-world harm. It is clear the government should not treat conspiracy theories simply as ‘misinformation’ or purely digital phenomena. There is a crossover here between what happens online and what happens offline. More work needs to be done, but having a proactive strategy for managing radicalisation means properly addressing the digital space. This means understanding how conspiracy narratives form and develop, systematically tracking how they circulate, and examining how they are leveraged by or influence those who commit real-world crimes.

Any national resilience strategy needs to take seriously the possibility of something as seemingly esoteric as the ‘Great Reset’ ideology crossing over into the real world and taking people down a radicalised rabbit hole. The 4 Ps of Prepare, Prevent, Protect and Pursue (as set out in the Serious and Organised Crime Strategy) offer a robust framework for developing a joined-up strategy to deal with this problem, and to prepare for the possibility that online radicalisation may be fuelling real-world incidents.

Henry Parker is Head of Government Affairs, and Dr. Joe Ondrak is Research, Technology and Policy Lead for Logically, a technology company that combines AI with expert intelligence to tackle the impact of harmful online content, including mis- and disinformation, at scale. The company provides citizens, governments, and digital platforms with the ability to access accurate information, identify threats, and reduce the harm this kind of content can cause.
