A number of recent terrorist attacks perpetrated by far-right terrorists have seen the exploitation of technology and the modern entertainment industry it underpins. Terrorists are using video games and platforms associated with gaming to plan attacks and post manifestos, or draw inspiration from video game aesthetics while conducting attacks. Reports also indicate that extremists are using games and gaming platforms to recruit and radicalise (i.e., ‘groom’) vulnerable individuals.
In response to these developments, attention has increasingly shifted to the role of video games vis-à-vis radicalisation within the field of preventing and countering violent extremism (P/CVE). At the European level, entities like the Radicalisation Awareness Network (RAN) and the European Union Internet Forum (EUIF) are generating knowledge and creating awareness of this exploitation in both practitioner and policymaking circles. Meanwhile, initiatives like the Extremism and Gaming Research Network (EGRN) study the phenomenon of the gaming-terror nexus and develop possible solutions to counter it. Their efforts are reinforced by organisations such as the Global Internet Forum to Counter Terrorism (GIFCT), which brings together industry and government to prevent and combat extremists’ exploitation of technology, and by a number of policies established by gaming-adjacent platforms.
Identifying, drawing attention to, and researching the gaming-terror nexus will pave the way for possible P/CVE interventions. Such interventions, however, are particularly challenging. For instance, video games and chat rooms are relatively opaque spaces; it is easy to host private game servers or chat rooms that are consequently difficult to monitor or moderate. Moreover, the target audience for video games comprises – much like other online spaces, such as social media – young people who might be particularly vulnerable to radicalisation. Without adequate audience segmentation and an eye for popular trends in contemporary youth discourse, P/CVE interventions and counter-extremist messaging efforts may fail to reach their target audiences. Perhaps most importantly, understanding radicalisation in gaming spaces requires a great deal of knowledge of the social dynamics, cultures, and discourses that one encounters in such spaces. To be effective, interventions should be particularly mindful of these complexities.
This Analysis explores the nexus between gaming and violent extremism and offers a number of design principles that effective P/CVE policies and interventions should adhere to. Specifically, we argue that such policies and interventions should empower gamers and focus on ‘meta-gaming’ issues, that is, particularly problematic aspects of gaming culture, rather than on video games themselves.
Why we should talk about video games and violent extremism
Video games are immensely popular. The games industry is estimated to be worth roughly USD 200 billion and a total of nearly 3 billion people worldwide play video games. A substantial portion – 77 percent according to a recent estimate – of gamers play multiplayer games. The popularity of video games makes them attractive targets for extremists to exploit. Games with an online component may facilitate an exchange of extremist viewpoints and content, and indeed, the misuse of video games by malignant actors has been well-documented. In 2014-15, ISIS released custom-made, modified content (‘mods’) for games like Grand Theft Auto V and military simulators (‘MILSIMs’) such as Arma 3. These mods allowed players to re-enact acts of terrorism or take up arms as virtual ISIS insurgents. Within the far-right, developers like the Polish studio Destructive Creations or the German Kvltgames have created games with ostensible far-right messaging, whether it be defending Europe’s beaches against hordes of ‘ISIS-invaders’, or taking up the fight against the ‘Globohomo’ (Globalist homosexual) enemy whilst playing as far-right figureheads.
While these examples provide only limited evidence of the encroachment of extremism in video games, larger data-driven studies do suggest a growing trend. The Anti-Defamation League recently published the results of its fourth annual survey on online hate and harassment, in which 20 percent of adults reported exposure to white supremacist content in online video games. In their study on extremism and the identity fusion of players within gaming culture, Rachel Kowert, Alexi Martel, and William B. Swann found that specific gaming communities in games such as Call of Duty “may serve as catalysts that encourage strongly fused gamers to embrace antisocial attitudes and behaviors.” Among the gamers surveyed in the study, fusion with gaming identity among Call of Duty players “uniquely predicted more antisocial and extreme outcomes […] including willingness to fight/die for gaming culture, all three Dark Triad measures, extrinsic racism, and recent aggressive behaviors.” By contrast, gamers playing Minecraft – a pixelated open-world building game – were much less likely to exhibit such tendencies.
Anecdotal cases similarly suggest that players of real-time strategy (RTS) games may be more vulnerable to the adoption of extremist beliefs. The in-game universe of Warhammer 40k, which has already endured controversy over players openly displaying fascist and Nazi symbols, appeals to some for its fascist imagery. Other games, such as Hearts of Iron IV, even allow players to play as Nazi Germany. It should be noted that no evidence suggests that these games – or any games for that matter – are radicalising per se, but questions should be raised as to how fascist or Nazi aesthetics may inadvertently draw in sympathisers who can project their ideology onto the games they play. In short, more research into specific gaming communities could determine which communities are more vulnerable to the adoption of extremist beliefs.
One such study currently being conducted by ICCT pertains to the exposure to extremist content in military simulators (MILSIMs), a genre of hyper-militaristic and immersive video games in which players emulate real-life military tactics, techniques, and procedures (TTPs). Studying MILSIM communities may further our understanding of the link between particular genres of video games and potential vulnerability to extremist content. The presupposition that such a link exists is not entirely without reason. For one, MILSIMs have already been used by extremist groups to re-enact terrorist attacks. We should also consider how such environments may become virtual training grounds where players become familiarised with military TTPs, as well as military kit and equipment. Second, MILSIM spaces are, owing to their militarised nature, typically hierarchical and male-dominated spaces. This makes them potentially attractive for far-right actors who integrate misogyny, hierarchies of race and gender, and hypermasculinity into their ideological talking points, and who may thus find some resonance in these spaces. Lastly, radicalisation dynamics, particularly related to anti-governmentalism and white supremacy, have already been identified in offline MILSIM communities (e.g., airsoft or civil war re-enactment groups). This raises the question of the degree to which similar extremist sentiments may be found in online MILSIM communities. Overall, research into the role of specific gaming communities and their impact on radicalisation can drive effective P/CVE interventions. Identifying which communities may be particularly vulnerable can lead to more targeted responses and strategies to tackle extremism in gaming spaces.
“Keep your politics out of my game”: How gaming culture may drive radicalisation
As alluded to earlier, the relative opaqueness of video game spaces provides extremists with an attractive opportunity to meet online, outside the watchful eye of law enforcement. Moreover, the presence of many young people who may be vulnerable to extremist messaging efforts creates ideal circumstances for exposure to extremist viewpoints. However, we argue that particular aspects of gaming culture may also have a hand in the proliferation of extremist beliefs. In the study by Kowert, Martel, and Swann, “[identity] fusion with gaming culture is uniquely predictive of a host of socially pernicious outcomes, including racism, sexism, and endorsement of extreme behaviors.” Examples of such tendencies surfacing are numerous.
Famously, the Gamergate controversy, which began as a discussion about ethics in video game journalism, slowly devolved into a harassment campaign targeting women in the industry, thus evidencing that misogyny was (and still is) rampant in certain segments of the gaming community, at times even intersecting with the ‘alt-right’. Even when not explicitly linked to extremist ideologies, we see that the sexist and racist viewpoints upon which extremist ideologies are often predicated surface in more mainstream discourses on gaming platforms and fora. Video games that feature people of colour, LGBTQ+ people, or women as protagonists are frequently met with cries to “keep politics out of games”. Further studies would need to be conducted to determine the exact scope and impact of such mainstreamed racism and sexism on the proliferation of online (violent) extremism, but an argument can be made that it is certainly conducive to the further normalisation of white supremacist or otherwise racist and sexist ideological viewpoints. After all, gaming environments are not divorced from everyday realities, including the role of gender norms, violence as entertainment, and racist stereotypes. Online networking spaces, including more radically inclined spaces, build upon these mainstream tropes and can take them to the extreme.
For example, games are not only referenced in manifestos, such as the term ‘techno-barbarism’ in the Halle shooter’s manifesto (a reference to the popular games franchise Warhammer 40k), but these manifestos are also uploaded to forums rooted in gaming culture, like the chan sites 4chan and 8kun (formerly 8chan). Moreover, memes created in the wake of several terror attacks feature video game aesthetics. For example, following the Christchurch attack, memes featuring stills of the shooter’s livestream were overlaid with graphics from games like Fallout 3, Doom, Deus Ex, or Hotline Miami. The attack was thereby given a distinct aesthetic that is explicitly connected to video games. In this regard, P/CVE interventions should be particularly mindful of how gaming culture is exploited by extremists and, importantly, how resilience against extremism can be strengthened – for example, by addressing the sexist and misogynist narratives that often connect these extremist ideologies, or by understanding how video game aesthetics can also be utilised to appeal to gamers who are at risk of radicalising online.
The way forward: policy and praxis considerations in developing effective P/CVE responses
Despite the encroachment of extremism in video games and gaming culture, comprehensive responses to address extremism in video games are generally lacking. This can, in part, be explained by the sheer number of people playing video games, which renders effective monitoring and moderation all but impossible. Law enforcement agencies simply do not have the resources to police the entire gaming sphere. Yet, at the same time, as the evidence above suggests, the need to address extremism within gaming is pressing.
One way to circumvent the law enforcement bottleneck would be to empower gamers themselves. Where police may lack the capacity to patrol online spaces, gamers are in a much better position to identify and address extremism, for example, by using in-game reporting functions. This, however, hinges on the creation of such functions and, more importantly, on gaming companies investigating reported extremist behaviours and enforcing consequences for users found to engage in them. As a report commissioned by UNOCT highlights, gamers have expressed that reporting needs to be easier and more transparent. Reporting is currently not a reliable means for the removal of violating players, nor can players always see whether their reports are being picked up by game admins. Consequently, gamers who encounter extremist behaviour do not always find it meaningful or worthwhile to report it. Empowering gamers to participate in self-regulating communities thus requires an expansion of reporting features and for reports of hateful behaviour to be acted upon. However, while establishing mechanisms that empower gamers and gaming communities to deal with extremist behaviour in these spaces may be a good first step, depending solely on active bystanders to flag problematic activity raises questions of effectiveness, given that the burden of responsibility is then placed on the individual user rather than at the structural level.
Beyond user empowerment, gamers should have their voices heard in the creation of policy frameworks and targeted interventions through the involvement of gaming-related advocacy groups and civil society. Gaming spaces are challenging to navigate for the unversed and cultural cues are difficult to pick up on. Gathering insider perspectives is therefore instrumental in the creation of effective policy responses. Understanding the difference between ‘borderline’ content (i.e., content that blurs the line between edgy and hateful) and outright extremist material is important to reliably detect extremist behaviour online while preventing false positives on content that might be construed as problematic but is not explicitly harmful, or even illegal.
Another related gap is the need for better reporting functions, and more effective moderation by gaming communities and game companies. Gamer advocacy groups must therefore be included if (inter)national authorities wish to create proactive policies and integrated approaches to tackle the gaming-extremism nexus. Best practices should involve holistic approaches that look at both empowering and incorporating perspectives of individual gamers.
Further, to meaningfully change the negative aspects of gaming culture, such as the racism and misogyny that may drive the adoption of extremist beliefs, interventions should promote greater diversity at all stages of game development and design, as well as among game testers, reviewers, and critics. Gaming live-streamers also contribute to setting cultural and social standards and, like other types of content creators with a large audience reach, can serve as potential role models in evaluating gameplay experiences and user mods. Supporting gaming streamers who promote inclusive messaging could be one avenue, as long as they are able to retain creative control over their content. P/CVE policies need to adapt to the gaming-extremism nexus by incorporating these elements into their approaches.
A more nuanced understanding of the relation between (violent) extremism and terrorism and the role that identity and representation play in video games can lead to targeted, more effective P/CVE policy responses towards gamers. Multi-stakeholder approaches drawn from collaboration between government, law enforcement, and civil society, along with the gaming industry and gaming-adjacent platforms, must consider that creating games that simply incorporate (counter-)narratives or mechanics to counter extremism is not the answer. Such games centre disproportionately on the power of the medium, with the working assumption that the medium itself is both the problem and the solution. Rather, problematic aspects of gaming culture need to be considered when designing P/CVE strategies, which may otherwise fail to resonate with their target audiences.
Better reporting mechanisms and inclusive representation can be achieved by listening to, empowering, and involving gamers in implementing and enforcing community guidelines. At the same time, we should not overlook the systemic, meta-gaming issues that mark gaming spaces as a culture characterised by elements of racism, sexism, and misogyny. Practitioners, who often have to operate with few resources, may develop more effective interventions by combining a greater understanding of the complex landscape of gaming communities with knowledge of which specific genres may attract more individuals with extremist attitudes and behaviours. This may benefit new and existing P/CVE initiatives that are active in gaming spaces by enabling them to tailor their efforts to specific target audiences.
The set of recommendations that we offer is, by and large, contingent on close cooperation with the gaming industry. Incorporating better reporting mechanisms or collaborating with positive role model streamers relies on the willingness of game developers and tech platforms to do so. The list that we offer is therefore not exhaustive and requires constant revision. Where cooperation is difficult to attain, legislators are responsible for creating the necessary policy frameworks that align private with public interests. Legislation like the EU’s Digital Services Act or the UK’s Online Safety Bill is a step in the right direction. These laws will require tech platforms, including gaming-adjacent platforms, to remove extremist (or otherwise harmful) content if flagged by relevant national authorities. However, these new laws are reactive, not proactive: they respond to instances of (violent) extremism once they have already occurred. In this regard, we hope that the solutions that we outline – in particular, empowering gamers and effectuating cultural shifts through positive role models – offer more sustainable approaches.