Facebook bans QAnon conspiracy theory accounts across all platforms

Demonstrators from the QAnon conspiracy theorist movement protest in Los Angeles in August. Copyright KYLE GRILLOT/AFP or licensors
By The Cube

Facebook says it will ban QAnon accounts across all of its platforms, including pages that contain no violent content.


Facebook has announced it will ban groups and accounts that openly support the baseless QAnon conspiracy theory.

The company announced on Tuesday that it will remove "any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content".

Administrators of banned Facebook groups will also have their personal accounts disabled.

Facebook said the change updates an earlier policy, which only removed accounts related to the conspiracy theory if they discussed violence.

The social network had already cracked down on QAnon and other "militarised social movements" to prevent them from using the platform to promote and organise their actions.

Hashtags such as #savethechildren - which have been hijacked by the movement - now also direct Facebook users to credible child safety sources.

The social media giant removed more than 1,500 QAnon pages and groups in the first month after introducing that policy in August.

It has also taken steps to reduce the reach of thousands of Instagram accounts by limiting recommendations, preventing adverts, and reducing their appearance on social media newsfeeds.

But under the new policy, Facebook has placed emphasis on profiles which "represent" and openly support the theory, rather than on individual pieces of content. Users who merely refer to QAnon in a group focused on a different subject will not necessarily be banned.

The company has also cautioned that "this work will take time and need to continue in the coming days and weeks".

"Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports."

Analysts say the move is a significant step, but one that is long overdue.

"This is a step we have been asking Facebook’s top executives to take for months as part of Stop Hate for Profit," said Jonathan Greenblatt, CEO of the Anti-Defamation League.

"Now that they have announced that they will treat the QAnon ideology like the very real threat that it is, we hope that they will follow up with some modicum of evidence showing how the ban is being enforced and whether it is fully effective".

"We hope that this is a sincere effort to purge hate and antisemitism from their platform, and not another knee-jerk response to pressure from members of Congress and the public".

Critics have previously accused Facebook of not doing enough to combat misinformation and online hate.

In July, Twitter banned thousands of QAnon-related accounts and said it would stop recommending content linked to the conspiracy theory to help prevent "offline harm".

Meanwhile, Reddit began removing groups in 2018 and has largely avoided a notable QAnon presence on the platform.


What is QAnon?

Followers of the unfounded QAnon conspiracy theory believe that U.S. President Donald Trump is a secret warrior against a supposed child-trafficking ring run by corrupt celebrities and "deep state" government officials.

Supporters rely on anonymous posts from a user known as "Q" on an extremist online messaging forum.

Mentions of the theory have increased significantly ahead of the U.S. presidential election, and several Republican candidates for Congress have openly discussed QAnon.

The group has also spread many other unfounded theories, such as the idea that the coronavirus pandemic is a conspiracy to control people with vaccines and 5G technology.

When asked about QAnon supporters in August, President Trump described them as "people who love our country".


Facebook says it has found that supporters of such conspiracy theories shift from one topic to another in order to constantly attract new audiences.

"While we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real-world harm", the company said.

These include claims that wildfires on the U.S. west coast were started by certain groups, which diverted the attention of local authorities away from protecting the public.

"We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement."

"We expect renewed attempts to evade our detection, both in behavior and content shared on our platform, so we will continue to study the impact of our efforts and be ready to update our policy and enforcement as necessary."

