At this fall's inaugural Eradicate Hate conference, held just 10 months after the Jan. 6 insurrection showed how deeply hateful ideologies pervade U.S. society, experts in violent domestic extremism discovered something surprising: hope.
After years of frustration and alarm, several experts agreed they could be on the cusp of establishing ways to deter people from extremism and pull individuals out of hate groups.
The new approaches come as President Joe Biden has prioritized tackling homegrown extremism after years of denial under former President Donald Trump. Meanwhile, far-right extremists have been cowed by criminal prosecutions stemming from the insurrection, as well as pioneering civil lawsuits that hit them where it hurts: their bank accounts.
“Doors are open, in ways they haven't been before, to try to more directly confront these threats,” said Jared Holt, a fellow at the Atlantic Council's Digital Forensic Research Lab who researches extremism.
“There's all kinds of tools at our disposal that we can use to fight this stuff, and there's a political will to do so right now,” he said. “So, as long as people can maintain a clear vision and momentum against it, I’m hopeful of what we can accomplish in the next year.”
Two new efforts stuck out to experts.
Moonshot, a company based in London, has figured out how to leverage the much-maligned algorithms that govern social media and online ad platforms, using them not to sell products, but to redirect users who are headed down hateful paths.
And the Polarization and Extremism Research Innovation Lab, or PERIL, in Washington, D.C., working with the Southern Poverty Law Center, has developed interventions that it says turn would-be extremists away from the movement and help parents and caregivers stop young people from embracing hateful ideas.
The U.S. still faces significant challenges in battling violent, homegrown extremism, experts say. The movement is in constant flux, with threats from white-supremacist-friendly groups such as the Proud Boys, unauthorized militia groups such as the Oath Keepers, and people radicalized by conspiracy-laden movements like QAnon.
But the people tasked with understanding and countering those groups believe there is light on the horizon.
“We finally have resources: skilled hands like PERIL and Moonshot who are getting at efforts to de-radicalize people and draw them out of groups,” said Heidi Beirich, co-founder of the Global Project Against Hate and Extremism. The Biden administration “has said that they've got to tackle anti-government white supremacy, and we've never seen that before. We've seen agencies say this is a threat, but we've never seen a whole-of-government effort.”
Moonshot: Going where government can’t
The U.S. government has always faced a dilemma in tackling domestic extremism: It isn’t supposed to monitor people based on their ideology, nor should it track movements simply because their views are offensive to most people.
Law enforcement agencies are not allowed to target extremist groups unless they have broken the law or are about to.
But there’s nothing to stop private companies, organizations or individuals from monitoring hateful and extremist content. Organizations such as the Anti-Defamation League and the Southern Poverty Law Center have teams of researchers to monitor and expose hate groups and individual extremists.
Now, one company is taking things a step further.
Moonshot, a tech startup founded in 2015, has pioneered an effort to “redirect” individuals who search for extremist content online, by presenting them with alternatives meant to slow or stop them from developing extremist ideologies.
To do this, Moonshot employs the same targeted marketing that sells products.
Working with the Anti-Defamation League, Moonshot compiled keywords and phrases associated with extremism. The company bought advertising on Google and other platforms that deliver targeted ads to people who search for particular terms.
If, for example, someone Googles “great replacement theory,” Moonshot’s advertising would provide that person with videos, news stories or academic papers about how the racist theory has long been discredited.
“We're just using the same commercial advertising tools that are available to any big brand – to Coca-Cola, to Adidas, any big brand – that's trying to reach their customers,” said Vidhya Ramalingam, Moonshot’s founder and CEO. “It’s just that for us, the customer base, so to say, is people who are at risk.”
The company also has been working with Facebook to evaluate the effectiveness of the social media giant's "Search Redirect" effort, which is a similar initiative on the platform. When users search for hateful content on Facebook, the platform redirects them to alternative content that counters extremist narratives.
Those methods alone "will not solve all the problems with social media algorithms," Alex Amend, a company spokesman, wrote in an email. "We’ve consistently called on companies to do more in terms of demonetizing and deplatforming harmful accounts."
Don't tell people they're wrong; just provide help
A January report from Moonshot and the Anti-Defamation League showed promising results from a three-month, nationwide pilot program in the run-up to the 2020 election. The company recorded about 56,300 “high-risk” searches and showed alternative ads about 34,000 times, the report states.
People engaged with the ads more than 1,300 times and watched more than 2,000 hours of video created by credible third parties, the report says.
Besides redirecting people to content that counteracts extremist messaging, Ramalingam said Moonshot is working on ads offering help like mental health counseling.
Based on research, “we know that these audiences are more open to receiving offers of help than they are open to being challenged on their ideology,” Ramalingam said. “If you offer services, it offers the chance to take it from an online interaction into something that's long-term and sustained.”
Holding Moonshot accountable
Though extremism experts think Moonshot's work is exciting, several said they’re worried about the data being gathered by the for-profit company.
Courtney Radsch, an author and expert on the intersection of technology and civil liberties, said Moonshot should be open to scrutiny given that the company is receiving federal government funding while doing work the federal government can’t do.
While the development of Moonshot's "Redirect Method" wasn't itself funded by the government, the current version tackling would-be extremists is funded through a violence and terrorism prevention grant from the Department of Homeland Security.
“I think the biggest issue is their linkages with governments," Radsch said. “Does that data get used for anything else? Is what they are learning being used to better sell products? The data that they collect, do governments have access to it?"
Amend said the company does not collect personal information about the people it aims to help.
He pointed to the company's annual external human rights audit as evidence that Moonshot operates as transparently as possible, and he said the company adheres to European privacy guidelines.
“We’re targeting searches, not individuals. We don’t know who these people are,” Amend said. “We don’t want Big Brother. We don’t want people getting spied on.”
Seven minutes to gain immunity against propaganda
While Moonshot has been working to reach potential extremists before they descend into online rabbit holes, researchers at PERIL and the Southern Poverty Law Center have taken a low-tech approach.
“We're asking a different question, which is not can you interrupt radicalization, but more, can you teach people to recognize propaganda for what it is?” said Cynthia Miller-Idriss, PERIL's director. “Can you inoculate people against propaganda in a kind of digital literacy way?”
Over the last 18 months, Miller-Idriss’ team has been working on 10 projects to create materials that provide a counternarrative to the propaganda spread by extremist groups.
Researchers have, for example, created a guide to help parents and other caregivers understand and act on early signs of radicalization. PERIL says its research shows this simple tactic has proven extraordinarily effective.
Miller-Idriss’ group surveyed more than 750 parents and caregivers who read their 13-page guide to extremism. After studying the guide for just seven minutes, "participants significantly improved their knowledge and understanding of extremism and youth radicalization in ways that make it more likely they would act appropriately," Miller-Idriss wrote in a USA TODAY opinion column in May.
More than 80% of the study participants told researchers they were definitely or probably prepared to talk to young people about online extremism and to intervene with people interested in extremist ideas, the researchers concluded.
'Nobody likes to find out they're being manipulated'
The PERIL team created a series of short videos meant to turn people away from extremist ideas.
The approach builds on research from post-World War II Germany. Rather than attack the messages spread by extremists, the videos and other materials seek to teach people the insidious methods employed by extremist recruiters and propagandists. The theory is that young people respond more readily to learning they are being manipulated and lied to than to criticism of extremist messaging itself, Miller-Idriss said.
Miller-Idriss equates it to talking to teenagers about tobacco or fast food.
“Seventy years of public health research has shown that it's more effective to teach people about how they're being manipulated by disinformation or false advertising or propaganda than it is to try to come at them with facts,” Miller-Idriss said.
“Nobody likes to find out they're being manipulated, especially not teenage boys, right?” she said. “We're not going to tell you you have to act differently or think differently. But we want you to have the full picture of information that there are people out there trying to manipulate you, and when they use these kinds of tactics, just think twice about it.”
The changing fight against extremism
While several experts expressed optimism about the climate for battling extremism in the United States, none believe the threat has waned.
Organized hate groups may be battered and bruised by lawsuits and indictments, but their supporters haven’t suddenly had a change of heart, said Oren Segal, vice president of the Anti-Defamation League’s Center on Extremism.
“I think we're in an interesting moment where there's a lot of hope – a lot of thinking of solutions, a lot more mechanisms than in the past four or five years. But you also have the other side regrouping, and with a similar level of support,” Segal said. “I feel like the challenges have gotten harder in some ways.”
Segal and other experts expressed concern about the mainstreaming of hateful narratives, misinformation and disinformation. He cited a recent video by Fox News anchor Tucker Carlson that misrepresents the events of Jan. 6 as an example of how fringe ideas are being pushed by media figures and politicians with massive audiences.
“That’s a recognition of how deep this is in our culture, and that the responses are going to need a whole different set of skills and even ambitions,” Segal said. “This is not the same fight against extremism as it was 20 years ago.”
This article originally appeared on USA TODAY: White supremacist propaganda is difficult to combat, but there's hope