Engadget

Facebook has banned QAnon

Facebook previously only targeted pages and groups that discussed violence.

The QAnon conspiracy theory is no longer welcome on any of Facebook’s apps. The social network is banning QAnon entirely even if the accounts in question aren’t explicitly discussing violence, Facebook now says.

“Starting today, we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content,” the company writes in an update. “Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports.”

The new policy is Facebook’s most drastic step yet to beat back the conspiracy theory, which the FBI warned last year could be a domestic terror threat. The company has previously taken smaller steps to limit the conspiracy theory’s spread, but up until now had stopped well short of banning it entirely. In August, the company said it would ban QAnon pages and groups when they discuss violence, and last week said it would ban QAnon ads.

Those earlier crackdowns had resulted in the removal of thousands of pages and groups, but didn’t address all the ways that the conspiracy theory causes “real world harm,” according to Facebook. “While we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public,” the company says. “Additionally, QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another.”

While once considered a fringe movement, belief in QAnon has surged during the coronavirus pandemic. Much of that growth has been helped by Facebook’s recommendation algorithms.

The company notes that it will take some time to ramp up its enforcement of the new policy. Another challenge is that supporters of QAnon have been adept at evolving their message in order to lure more followers. For example, QAnon believers have also latched onto anti-vaccine and COVID-19 conspiracy theories. More recently, the group has latched onto the issue of child trafficking as a recruitment tactic.

With the ban, QAnon followers will have even fewer large platforms available to use openly to expand their reach and grow a following. Reddit banned the conspiracy theory in 2018, and Twitter banned thousands of QAnon accounts earlier this summer. TikTok has also taken steps to limit its spread by banning hashtags associated with the movement.