Have you noticed less QAnon content on social media lately?
Yes, it’s true that many online platforms have banned promoters of the right-wing conspiracy theory turned violent movement. But, many QAnon followers are still finding ways around those bans.
The reason you may not be seeing so much QAnon online isn’t because they’re not there. It’s because they’ve gone undercover. QAnon content is still spreading on mainstream social media platforms thanks to a number of tactics its believers are using to get around the bans.
Evading the bans
“QAnon believers excel at ban evasion,” Mike Rothschild, author of The Storm is Upon Us, a forthcoming book about QAnon, tells Mashable.
One prime strategy, Rothschild explained, has been deployed since July 2020, when Twitter became one of the first major social platforms to crack down on the conspiracy. When a QAnon believer’s Twitter account got banned, they’d simply sign up for a new one and let their fans on alternative, less popular social networks, like Parler, know where to find them. They’d immediately rack up tens of thousands of followers as those fans flocked to the brand new Twitter profile.
“Many came back again and again, making dozens of accounts,” Rothschild says.
Months after Twitter banned QAnon on its service, Facebook followed suit and banned the conspiracy from its platforms in October 2020. But, by then, followers of the conspiratorial movement had already formulated strategies on how to get around such bans.
And they were doing so with help from Q, the anonymous user who started the whole conspiracy three years earlier.
“Q has specifically asked QAnon followers to ‘deploy camouflage’ by dropping all references to ‘Q’ and ‘QAnon,'” said Travis View, co-host of the popular QAnon Anonymous podcast, a show that tracks and criticizes the conspiracy, when we spoke in Oct. 2020.
QAnon believers quickly needed new keywords and hashtag campaigns in order to find other like-minded followers or new blood that could be brought into the fold. They started calling the conspiracy “Cue Anon” to evade social network algorithms that detect the term “QAnon.” Instead of typing “Q,” they would use “17,” a reference to the letter being seventeenth in the alphabet.
“The general feeling in the Q community was that Twitter and Facebook were where the battle was being fought, and that believers had to find ways to stay there,” Rothschild explained.
QAnon in the spotlight
As the pandemic, lockdowns, and protests swept through 2020, more and more people fell down the QAnon conspiracy rabbit hole. The movement had started three years earlier, when the anonymous Q began posting conspiratorial pro-Trump messages on 4chan. But by 2020, the messages that had emerged from the dark corners of the internet were in the online spotlight.
The number of its adherents, who falsely believe that former President Trump is secretly waging a battle against a global cabal of Satanic baby-eating sex traffickers primarily made up of the Democratic Party and Hollywood elite, was growing exponentially.
At its height on Facebook in the summer of 2020, the number of QAnon-related Facebook Pages, Groups, and Instagram accounts was in the tens of thousands. On top of that, some of those QAnon Groups counted hundreds of thousands of Facebook users as members.
Then, when a violent Trump-supporting mob stormed the Capitol building on Jan. 6, 2021 to stop the certification of now President Joe Biden’s victory in the 2020 election, images of the rioters were broadcast on televisions around the world. People wearing QAnon shirts, along with an individual dressed in Viking garb with red, white, and blue paint on his face known as the “QAnon Shaman,” were covered on major news networks, firmly thrusting QAnon into the mainstream.
Following the events of Jan. 6, which left five people dead, Twitter announced that it had suspended 70,000 accounts “primarily dedicated to sharing QAnon content.” The company also suspended high-profile QAnon influencers and right-wing personalities who made a name for themselves by promoting the conspiracy, such as Michael Flynn and Sidney Powell.
In addition, Twitter suspended 8kun owner Jim Watkins and his son, Ron Watkins, who also acted as the site’s administrator. 8kun, the website run by this father-son duo, is where Q dropped their messages to QAnon’s followers after moving from 4chan in 2017. (Recent evidence also strongly suggests that Ron Watkins may be the person behind Q.)
Remember: Twitter prohibited QAnon content from its platform back in July 2020. How were so many accounts still spreading these conspiracy theories? Even though most major social media platforms had some policy cracking down on content adjacent to QAnon beliefs, such as falsehoods surrounding the November presidential election being stolen from Trump, it’s obvious the QAnon bans didn’t work.
It’s unclear if this was mostly due to the ban evasion tactics or lack of enforcement from the platforms, but it’s likely a combination of both.
Rebranding QAnon
If you came across people in your city protesting with “Save Our Children” signs during the past year, you ran into QAnon believers.
In an effort to conceal the QAnon name and its purpose — supporting Donald Trump and attacking his political enemies — believers in the conspiracy started to organize under the guise of a movement that was against trafficking children.
Organizations and charities that have long fought to end human trafficking have rebuked QAnon, saying its actions have actually hurt efforts to save children from abuse.
“QAnon co-opted the iconography of ‘save the children’ organizations, knowing that Twitter and Facebook would never ban those [types of] groups,” explains Rothschild. “The result was the massive anti-trafficking marches, full of Q acolytes waving signs with Q hashtags, with some people not even knowing what Q was, but completely supporting it.”
COVID-19 anti-vaccine conspiracies are another avenue QAnon followers are using to keep their movement alive without the baggage that comes with Q.
Sure, there’s already plenty of overlap between the beliefs of anti-vaxxers and QAnon followers. However, social media companies have notoriously lax responses to health falsehoods specifically. For example, whereas QAnon conspiracy theorists are removed from social networks, many anti-vaxxers simply have a fact check label affixed to their falsehoods, which still allows dangerous health disinformation to flow on those platforms. In fact, a recent report found that just 12 individuals are responsible for the majority of the anti-vaccine disinformation on the major social media platforms. The report even identifies exactly who they are, yet most of them remain on the platforms. (At the time of publishing this story, 10 of the 12 still have active pages on Facebook.)
Fly-by-night conspiracy theorists
QAnon’s newest workaround takes a page out of Snapchat’s book: ephemeral content.
Knowing their content and even their whole account will probably get banned, QAnon followers will post content, let it spread throughout the platform, and then delete the content before the social media platform algorithms can detect it and take action.
A new report from CNET details how this strategy was being deployed on YouTube. While the QAnon channels mentioned in the report have since been removed by YouTube, the strategy worked: the channels were only deleted once the outlet reported on them. The channels were able to acquire tens of thousands of subscribers with content that would be deleted after a few days. The content itself, some of which clearly violates YouTube’s policies on inciting violence, would also amass hundreds of thousands of views before being deleted. The channels would then repost content and repeat the process.
Similar strategies with “fly-by-night” or throwaway accounts have also been recently deployed by white nationalists on platforms like Twitter, says Jared Holt, a resident fellow on domestic extremism at Atlantic Council’s Digital Forensic Research Lab.
For example, earlier this year, newly opened Twitter accounts started posting details about the America First Political Action Conference (AFPAC), the first big white nationalist event since the Jan. 6 storming of the Capitol. Nick Fuentes, who led the event, would share these tweets in order to promote it. If action was taken against these accounts for breaking Twitter’s policies, the tweets and possibly the accounts that posted them would be removed. But by then, the content and accounts had already served as a shield for Fuentes: his account remained unscathed, and the event was still promoted to his large following.
“Once figures are banned from mainstream platforms, we’ll often see them attempt returns with accounts they assume will be booted,” Holt tells Mashable. “Creating accounts built around events or campaigns is among the most common of that pattern.”
Holt says that some QAnon influencers have also attempted to replicate these tactics.
What’s next?
How can you ban QAnon if it never existed at all?
One of the most prevalent strategies from QAnon adherents is to pretend that’s the case.
Q last posted a drop for their followers in December of last year. However, in one of those final communiques, Q declared, “there is Q, there are anons, there is no QAnon.” A search on social media platforms for the phrase pulls up many posts from people who are spreading QAnon conspiracies while simultaneously claiming the movement never existed to begin with.
“The entire movement immediately embraced the idea that the term ‘QAnon’ was a mainstream media creation, despite Q having used the term many times, and other Q believers putting it all over merchandise since late 2017,” Rothschild explains to me. “The most popular book about the movement by believers is literally called ‘QAnon: An Invitation to the Great Awakening.'”
Denial aside, many QAnon influencers who get booted off the mainstream social networks have moved on to alternative platforms like Telegram, where they can spread their conspiracies and disinformation without fear of consequence. The follower base may be smaller, Rothschild says, but it’s a “very fervent audience.”
However, there’s only so much one can get out of preaching to the choir. While some might be content with Telegram, others will continue to seek out ways to get back on the Facebooks, Twitters, and YouTubes of the internet.
“Having access to large audiences is crucial to extremist movements’ goals of growth and propagandizing,” Holt tells me. “Alternative platforms often don’t leave them satisfied.”