KTLA

Facebook says it will remove QAnon conspiracy theory accounts from its platforms

FILE - In this May 14, 2020, file photo, a person carries a sign supporting QAnon at a protest rally in Olympia, Wash. Facebook said Tuesday, Oct. 6, 2020, that it will remove Facebook pages, groups and Instagram accounts for “representing QAnon.” (AP Photo/Ted S. Warren, File)

Facebook said Tuesday that it will remove pages, groups and Instagram accounts for “representing QAnon” — even if they don’t promote violence.

Enforcement of the new policy will begin Tuesday, but “this work will take time and need to continue in the coming days and weeks,” according to a news release.


Facebook has been criticized for not doing more to de-platform QAnon, the baseless conspiracy theory that paints President Donald Trump as a secret warrior against a supposed child-trafficking ring run by celebrities and “deep state” government officials.

The social network said it will consider a variety of factors to decide if a group meets its criteria for a ban, including its name, the biography or “about” section of the page, and discussions within the page, group or Instagram account.

Mentions of QAnon in a group focused on a different subject won’t necessarily lead to a ban, Facebook said.

Less than two months ago, Facebook said it would stop promoting QAnon groups and their adherents, though enforcement was spotty. At the time, it said it would remove QAnon groups only if they promoted violence. That is no longer the case.

The QAnon phenomenon has sprawled across a patchwork of secret Facebook groups, Twitter accounts and YouTube videos in recent years. It has been linked to real-world harm, including criminal kidnapping cases, and to dangerous claims that the coronavirus is a hoax.

But the conspiracy theory has also seeped into mainstream politics. Several Republicans running for Congress this year are QAnon-friendly.

By the time Facebook and other social media companies began enforcing policies against QAnon, however limited, critics said it was largely too late. Reddit, which began banning QAnon groups in 2018, was well ahead, and to date it has largely avoided a notable QAnon presence on its platform.

Twitter did not immediately respond to a message seeking comment on Tuesday.

Also on Tuesday, Citigroup Inc. reportedly fired a manager in its technology department after an investigation found that he operated a prominent website dedicated to QAnon. According to Bloomberg, Jason Gelinas had been placed on paid leave after he was identified on Sept. 10 by a fact-checking site as the operator of the website QMap.pub and its associated mobile apps.

Citi did not immediately respond to a message seeking comment on Tuesday.

The Associated Press contributed to this report.