Do extremists use Discord? In online gaming chatrooms, hate isn't hard to find.

In April, the intelligence establishment appeared confounded by news that a 21-year-old Air National Guardsman had shared top-secret military documents with fellow chatroom members on a social networking platform called Discord, which is popular with online gamers. But I was not surprised.

For months, I have roamed Discord’s private “servers” – chatrooms under the authority of a volunteer administrator who unilaterally sets rules and decides who can join. One server’s rules stated: “No disrespecting any of the Fascist champions. Saying (expletive) like ‘Hitler was a coward who shot himself’ will get you insta banned.”

The members interrogated me: “What brings u here? do u have firearms? It’s better if you have them, for the coming war.”

Once I knew where to look, servers that explicitly advocated for extremist measures, including violence, were easy to find among the vast collection of Discord’s private chatrooms.

Russian operatives detected spreading propaganda in chatrooms

It also did not come as a surprise to me when Microsoft President Brad Smith revealed a few days after the news of the military-document leak that his company’s threat-analysis team had detected efforts by Russian operatives to disseminate anti-Ukraine propaganda via Discord, as well as on Steam, another site where gamers go to purchase and discuss video games.

These two recent episodes may seem unrelated, but they are in fact part of an undercurrent of subversive sentiment and extremist activity that has been allowed to fester for too long on gaming and gaming-related sites. I have written about the broader phenomenon in a new report published by the Center for Business and Human Rights at New York University’s Stern School of Business.

Online gaming is a legitimate and potentially valuable form of entertainment for billions of people worldwide. Many games involve simulated violence, but there is no conclusive evidence that simply playing video games makes someone more likely to commit violence.

Yet there is mounting evidence that bad actors are exploiting gaming platforms to further their objectives and, in some cases, cause serious harm. The gaming industry has generally been slow to counteract such manipulation of its sites. The good news is that there are ways for the industry to curb such exploitation.

Online gaming generated nearly $200 billion in revenue last year, more than the music and movie industries combined. More than 3 billion people worldwide play video games, and a majority of them play online multiplayer games, in which they can interact and communicate with strangers.

In the United States, over 215 million people play video games; this includes 71% of children under 18, according to the Entertainment Software Association.

Gaming sites are appealing to extremists because of their principal demographic: highly engaged young people. Today, games and gaming-adjacent sites such as Discord and Steam are among the primary online places where young people socialize and form communities.

Online games provide extremists with easy access to large numbers of adolescents and teenagers, many of whom are unsupervised, highly impressionable and yearning for connection. Adjacent platforms popular among gamers, Discord chief among them, provide the infrastructure for longer-term indoctrination and mobilization.

Extremist groups use online games to identify possible supporters

Researchers have for years documented examples of extremist organizations – including the Islamic State group, white supremacy movements and radical misogynist groups – using online games to disseminate their ideologies and identify potential sympathizers.

Police investigations following recent mass shootings – such as those in Christchurch, New Zealand; El Paso, Texas; Buffalo, New York; and Highland Park, Illinois – have also revealed the shooters’ deep involvement in gaming communities where affirmation of extremist beliefs is widespread.

While the exact prevalence and nature of radicalization in online games remain understudied, largely because gaming companies are reluctant to allow independent researchers to see their data, there is sufficient reason to consider it a pressing issue. The problem is not the trash-talk endemic in gaming culture but the extreme forms of harassment and incitement that can lead to real-world harm.

The NYU Stern Center commissioned a representative survey in five of the top video game markets globally. It found that 51% of gamers had come across an extremist statement or narrative while playing online multiplayer games in the past year. These included statements such as, “A particular race or ethnicity should be expelled or eliminated,” and expressions of support for the idea that “using violence is justified or necessary to achieve a political aim.”

The survey also revealed that in the past year, 36% of gamers had experienced some form of acute harassment, such as threats of violence, doxxing, hate-raiding or sexual harassment.

The gaming industry has a responsibility to protect users from the most serious harms enabled by its products. But for too long, most game publishers and platforms have offered features that enable extremist exploitation without establishing adequate safeguards against the worst abuses.

For starters, gaming companies should ensure that they have enough in-house staff to promptly review user reports of extremist activity, remove offending content to limit its reach and suspend the users responsible. Most companies allow users to flag problematic conduct, but surveys reveal that gamers seldom do so, partly because they have little faith in companies’ capacity to resolve such issues.

Companies should also develop AI systems to proactively detect and escalate for human review any extremist content relayed via voice chat and user-generated imagery. Many companies stop at text-chat filtering, while their games may involve players communicating via real-time voice chat or creating their own virtual environments.

Gaming-adjacent platforms, like Discord, that offer “private” networking spaces should deploy automated systems that can detect illegal activity without violating users’ reasonable privacy expectations.

Finally, companies should allow external researchers to study their sites so that policymakers have a basis for suggesting effective interventions and reforms.

Online gaming provides diversion and even a form of companionship for an enormous international following. It is past time that the companies profiting from this activity take more responsibility for ensuring that it does not facilitate serious harm.

Mariana Olaizola Rosenblat is a policy adviser on technology and law at the Center for Business and Human Rights at New York University’s Stern School of Business. 

This article originally appeared on USA TODAY: Extremism in gamer chatrooms like Discord? It's easy to find