At first glance, the QAnon conspiracy theory seems to have largely disappeared from the big social media sites. But that is not exactly the case.
These days, popular QAnon phrases like “great awakening,” “the storm,” or “trust the plan” are less common on Facebook. Facebook and Twitter have deleted tens of thousands of accounts devoted to the baseless conspiracy theory, which depicts former President Donald Trump as a hero waging a secret battle against a cabal of devil-worshipping paedophiles who dominate Hollywood, big business, the media and the government.
Gone are the huge “Stop the Steal” groups that spread lies about the 2020 U.S. presidential election. Trump himself is gone, too: permanently banned from Twitter and suspended from Facebook until at least 2023.
But QAnon is far from over. Federal intelligence officials recently warned that its followers could commit more violence, like the deadly January 6 insurrection at the U.S. Capitol. At least one open QAnon supporter, Marjorie Taylor Greene, has been elected to Congress. And in the four years since someone calling himself “Q” began posting cryptic messages on internet discussion forums, QAnon has only grown.
QAnon now encompasses a range of conspiracy theories, from evangelical and religious angles to alleged paedophilia in Hollywood and the Jeffrey Epstein scandal, said Jared Holt, a resident fellow at the Atlantic Council’s DFRLab who focuses on domestic extremism. “Q-specific things are declining,” he said. But the worldviews and conspiracy theories that QAnon absorbed still persist.
Loosely uniting these movements is a general distrust of a powerful, often left-leaning elite. Among them are purveyors of anti-vaccine misinformation, adherents of Trump’s “Big Lie” that the 2020 presidential election was stolen, and believers in almost any other worldview convinced that a shadowy cabal secretly controls events.
For social platforms, tackling this faceless, shifting and increasingly popular mindset is a far more complicated challenge than any they have dealt with in the past.
These ideologies “have consolidated their place and are now part of American folklore,” said Max Rizzuto, another DFRLab researcher. “I don’t think we’ll ever see it go away.”
Online, these groups have now faded into the background. Where Facebook groups once referred openly to QAnon, they now carry titles like “Since I didn’t find this in the so-called MSM,” a page referring to the “mainstream media” that has more than 4,000 followers. It posts links to Tucker Carlson clips from Fox News and articles from right-wing publications such as Newsmax and the Daily Wire.
A sticker referring to a QAnon slogan is seen on a truck taking part in a convoy in Adairsville, Georgia, US, on September 5, 2020 [File: Elijah Nouvelage/Reuters]
Topics range from claims of rampant crime to unfounded allegations of widespread electoral fraud and a “direct war on conservatives.” These groups aim to draw followers in deeper by directing them to more content on less-moderated sites like Gab or Parler.
When DFRLab analysed more than 40 million mentions of QAnon slogans and related terms on social media earlier this year, it found that the movement’s presence on major platforms had declined significantly in recent months. After peaks in mid-to-late 2020 and a brief spike on January 6, QAnon phrases have all but evaporated from most of the main sites, DFRLab found.
So while users may no longer post wild conspiracies about Hillary Clinton drinking children’s blood, they may instead repeat debunked claims that vaccines can alter your DNA.
There are several reasons for the decline in Q chatter: Trump’s loss of the presidential election, for instance, and the absence of new posts from “Q.” But the most important factor appears to have been the crackdown on QAnon by Facebook and Twitter. Despite well-documented missteps that revealed uneven enforcement, the banishment appears to have largely worked. These days, it is harder to come across brazen QAnon accounts on major social networks, at least judging from publicly available data, which does not include, for example, hidden Facebook groups and private messages.
While QAnon’s core groups, pages and accounts may have disappeared, many of its followers remain on the big platforms; only now they camouflage their language and water down QAnon’s most extreme tenets to make them more palatable.
“There was a very, very explicit effort within the QAnon community to camouflage their language,” said Angelo Carusone, president and CEO of Media Matters, a liberal research group that has tracked QAnon’s rise. “So they stopped using a lot of the code, the triggers, the keywords that triggered the kind of enforcement actions against them.”
Other dodges may also have helped. Instead of posting Q slogans, for example, supporters for a time earlier this year placed three asterisks next to their names to signal adherence to the conspiracy theory, a nod to Trump’s former national security adviser Michael Flynn, a three-star general.
Facebook has said it has deleted some 3,300 pages, 10,500 groups, 510 events, 18,300 Facebook profiles and 27,300 Instagram accounts for violating its anti-QAnon policy. “We continue to consult experts and improve our enforcement in response to evolving harms, including recidivist groups,” the company said in a statement.
But the social media giant has stopped short of banning every individual who posts about QAnon, citing experts who warned that banning individual Q adherents “can lead to greater social isolation and danger,” the company said. Facebook’s policies and its response to QAnon continue to evolve. Since last August, the company said, it has added dozens of new terms as the movement and its language have changed.
Meanwhile, Twitter said it has consistently taken action against activity that could cause offline harm. Following the January 6 insurrection, the company began permanently suspending thousands of accounts that it said were “primarily” dedicated to sharing harmful QAnon material. Twitter said it has suspended 150,000 such accounts to date. Like Facebook, the company said its response is also evolving.
But the crackdown may have come too late. Carusone, for example, noted that Facebook banned QAnon groups linked to violence six weeks before it banned QAnon more broadly. In effect, this gave adherents notice to regroup, camouflage themselves and migrate to other platforms.
“If there was ever a time for a social media company to take a stand on QAnon content, it would have been like months ago, years ago,” Rizzuto of DFRLab said.