Facebook, which has been criticized for not doing enough to contain a rapidly growing fringe conspiracy movement, said Tuesday that it would remove any groups, pages, or Instagram accounts that openly identify with QAnon.
The change drastically tightens the social media company's previous guidelines. In August, Facebook unveiled its first attempt to limit QAnon's spread, setting rules that banned QAnon groups promoting violence.
But hundreds of other QAnon groups and pages continued to spread across the platform, and the effort was widely viewed as a disappointment, including among Facebook employees.
On Tuesday, Facebook admitted that its previous guidelines had not gone far enough to address the popularity of the far-right conspiracy movement.
"We've been carefully enforcing our policies and researching their impact on the platform, but we've identified several issues that led to today's update," Facebook said in a public post.
Since Facebook's initial ban, QAnon followers have found ways to evade the rules. The movement dates back to October 2017 but has seen its biggest growth in followers since the pandemic began.
At its core, QAnon is a sprawling movement whose adherents falsely believe the world is run by a cabal of Satan-worshiping pedophiles plotting against President Trump. It has branched out into a number of other conspiracy theories, including doubts about medical advice for dealing with the pandemic, such as wearing masks.
On Facebook, QAnon has drawn new followers by using tactics like renaming groups and toning down messaging to make it less conspicuous. A QAnon campaign to co-opt health and wellness groups, as well as discussions about child safety, has drawn thousands of new people into its conspiracy theories in the past few months.
Researchers studying the movement said QAnon's evasive tactics had initially helped it circumvent Facebook's rules, but that the guidelines announced Tuesday would likely tighten the screws on the conspiracists.
“Facebook has made a significant contribution to QAnon's growth. I'm surprised it took the company so long to take this kind of action,” said Travis View, a host of QAnon Anonymous, a podcast that seeks to explain the movement.
“As QAnon has become a major source of misinformation on a number of topics, the action announced by Facebook is likely to have far-reaching implications in slowing the spread of misinformation on Facebook and on social media in general.”
Nearly 100 Facebook groups and pages, some with tens of thousands of followers, had already been affected by the changes, according to a review by The New York Times using CrowdTangle, a Facebook-owned analytics tool.
Facebook said it began enforcing the changes on Tuesday and that it would take a more proactive approach to finding and removing QAnon content, rather than relying on users to report it.