Facebook begins banning accounts that openly support xenophobic QAnon group

Designated by the FBI as a domestic terror threat in 2019, the group was applauded by US President Donald Trump in August, who called its followers 'people that love our country'
A Trump supporter wears a QAnon sweatshirt during one of the president's rallies on 3 October in Staten Island, New York (AFP/File photo)

Facebook has begun removing pages that openly support the QAnon movement, a baseless and xenophobic conspiracy theory that claims a network of "deep state" actors is working behind the scenes against President Donald Trump to create a new world order. 

The company, which also owns Instagram and WhatsApp, said on Tuesday that it will remove pages, groups and accounts for "representing QAnon", even if they don't promote violence, which had been the previous requirement for a ban.

A variety of factors will be used to determine if a particular page meets the criteria for the ban, including its name, the biography or "about" section, and the discussions that take place in the space. Merely mentioning the group on an otherwise unrelated page or account will not merit a ban, Facebook said. 

The company added that administrators of banned groups will have their personal accounts disabled as well. 

For years, Facebook has been viewed as the main platform used to spread QAnon ideology. 

The FBI labelled the group a domestic terror threat in 2019.

The movement started as an online following surrounding a person going only by the name "Q", who first appeared on internet message boards in 2017. Q claimed to be a government employee with top-level security clearance, dropping anonymous "information" online - what followers call "breadcrumbs".

QAnon built on an earlier conspiracy theory dubbed "pizzagate", which accused a "satanic cabal of elites" - made up of Hollywood figures and world leaders - of running a secret global paedophile ring headquartered at a popular pizzeria in Washington DC.

In December 2016, a believer in the theory, a 29-year-old man from North Carolina, travelled to the pizzeria and fired a military-style assault rifle inside, wrongly believing he was saving children trapped in a sex-slave ring. In June 2017, he was sentenced to four years in prison.

The group's latest agenda revolves around challenging the legitimacy of the coronavirus threat, including the unfounded claim that 5G cellular networks were behind the pandemic and a conspiracy theory that mask mandates are a secret plan to bring Sharia law and Muslim-style dress to the United States.

The group's rhetoric often revolves around anti-Muslim and anti-Jewish conspiracy theories, including the antisemitic trope that a Jewish "shadow network" controls the United States. 

'Impossible to get them to unbelieve it'

Despite its wildly xenophobic rhetoric and the FBI's threat designation, the group has gained more traction in recent months, with several top GOP figures expressing support, including Trump, his first national security adviser Michael Flynn, and scores of congressional candidates.

"I've heard these are people that love our country," Trump said of the group during a White House news conference in August. "So I don’t know really anything about it other than they do supposedly like me."

In July, Middle East Eye spoke to Richard Hanley, a journalism professor at Quinnipiac University in Connecticut who has for several years taught a class on the spread of disinformation and conspiracy theories.

At the time, Hanley said QAnon accounts had been amplified by Facebook's recommendation algorithm to such an extent that he believed the platform was almost exclusively to blame for the ideology's wide reach.

"It's Facebook, specifically," that is the problem, Hanley said. "Facebook's algorithms promote QAnon because if you 'like', for example, an [anti-vaccination] page, you're going to get a recommendation to 'like' a QAnon page - because the two are closely linked and allied, based on the machine-learned part of the algorithm. So once it gets into the recommendation engine, it spreads wildly." 

In a July email to MEE, a Facebook spokesperson said the company had been trying to quell the group's messaging on its platform by "closely monitoring" its activity and taking down accounts, in addition to rethinking how its policies apply to the issue.

At the time, Facebook said it was only removing QAnon groups if they promoted violence, but that is no longer the case.

While the company's campaign to take down QAnon pages began on Tuesday, it warned that it "will take time and will continue in the coming days and weeks". 

"We've seen several issues that led to today’s update," Facebook said in a blog post updated on Tuesday. "While we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public."

Still, Hanley warned that even with Facebook's policy change in place, the damage has already been done.

"It's impossible to get them to unbelieve it, because once that idea is embedded, it becomes part of their cognitive bias spectrum," Hanley said, comparing QAnon believers to cult victims. 
