By: Arron Williams
May 29, 2023
March 27, 2023, saw the 129th mass shooting of the year in the U.S., prompting conspiracy theorists to claim that “the government is manufacturing a civil war.” Conspiracists latched onto the tragedy in Nashville, TN, to push various narratives: that a New World Order is taking over and attempting to start a civil war; that there is a secret plot to eradicate Christianity; and that the shooting was one of many false flag operations. This alone makes for grim content, yet it only scratches the surface of the conspiracies spread by one specific YouTube account, “A Call for An Uprising.”
Discovering the channel A Call for An Uprising (“Call”) and diving down a rabbit hole of antisemitic conspiracies was not the expected outcome of watching an innocuous live reaction stream by the user MoistCritical on Twitch in December 2022. In a reaction stream, viewers send in video recommendations, which the streamer then watches live and, well, reacts to.
Often, when genuinely harmful conspiracy content surfaces in these streams, it tends to come from smaller YouTube channels. However, when one chat member recommended a YouTube channel called “A Call for an Uprising,” it warranted a closer look. A user had mentioned one of Call’s videos in the chat, and the streamer swiftly closed it after it turned out to be a transphobic rant. A further investigation of the YouTube channel by Logically Facts has uncovered rampant antisemitism, transphobia, and all kinds of conspiratorial beliefs.
Although Call’s conspiracy theorizing is not unique, his was one of the bigger channels on YouTube promoting conspiracies as interconnected, and it drew high traction. Recent conspiracy theories found on the channel included the claim that the February 2023 earthquake in Turkey was a U.S. operation caused by the Alaska-based High-frequency Active Auroral Research Program (HAARP), a theory that has been consistently disproved. The channel also claimed that train derailments in Ohio and elsewhere were planned events and government-waged chemical warfare. These claims have been widely debunked, and several factors are considered to have contributed to the Ohio derailment.
Call frequently asserts that crises and disasters are psyops or false flag operations, including the Waco siege in 1993, the Oklahoma City bombing in 1995, and, more recently, the Nashville school shooting. These conspiracies frequently arise but are baseless: claims that the Uvalde school shooting in 2022 was a false flag were comprehensively debunked. Infamous conspiracy theorist and InfoWars host Alex Jones owes approximately $1.5 billion to the families of victims of the 2012 Sandy Hook shooting after claiming it was a hoax and being successfully sued in response.
Call has also spread conspiracies and disinformation about COVID-19, vaccines, 5G, and climate change. All of this forms part of a broader issue in online spaces, where conspiracy theories and disinformation bleed across different platforms. Each platform has its own modes of viewer interaction and methods of content moderation, allowing conspiracies to spread further and gain more traction. This is made more significant by a recent rise in antisemitism, which overlaps with conspiratorial discourse and an overt shift toward Christian moral panic narratives. Call himself is an example of a conspiracy theorist who fits into this broader conspiracy paradigm and, through it, spreads transphobic and homophobic accusations alongside antisemitism.
Content moderation online faces various issues, and conspiracy theorists constantly adapt to work around the restrictions of large social media platforms. Each platform has different moderation challenges and policies, and Call’s case showcases some of their limitations.
Twitch, where we first discovered Call, relies on a variety of moderation styles to manage harmful content on its platform. Its terms of service set rules about what can be streamed, and users are suspended if a stream violates them. Automated moderation tools also help censor hate speech and blacklisted words in a Twitch chat. Outside this scope, however, chat moderation relies on the streamer themselves or on streamer-selected moderators: non-Twitch-affiliated individuals ranging from trusted chat members to just plain old friends, with no vetting process.
While this benefits streamers in cultivating their desired space or vibe, it also relies on moderators’ knowledge and awareness to evaluate what constitutes harmful content, causing moderation quality to vary significantly. Size matters too: in larger, more active chats, messages move faster, and recommendations cannot always be verified before a streamer engages with them, which is how Call’s channel was inadvertently promoted, exposing the entire stream to his content. The streamer picked one chat member’s recommendation at random; even when streamers exercise caution, harmful content can slip through.
Twitch is not alone in struggling with content moderation, as evidenced by Call’s long-running presence on YouTube.
The “A Call For An Uprising” channel had over 500,000 subscribers and averaged around 40,000–50,000 views per video, yet it was only banned by YouTube in March 2023, despite spreading harmful conspiratorial content for seven years. Before the ban, his routine spreading of misinformation and harmful content was addressed in videos by other YouTubers over the years. Logically Facts spoke to one of them, Hannah Reloaded, to get a better understanding.
Hannah told Logically Facts that while Call’s subscriber and viewership counts were high, she suspects the viewer count was likely inflated by bots. Call has a website and forum which, as Hannah pointed out, have helped boost and coordinate viewership; however, the number of people on the forum is significantly lower than the viewer counts on his channels, suggesting that bots are another part of the moderation problem.
Forbes reports that 66 percent of all links shared on Twitter are shared by bots, which influence what is seen on the internet. Thirty foreign governments, including Russia, use bots to distort information online and promote specific narratives. Spam bots can inflate the popularity of certain individuals and have also been found to spread misinformation. While measures are constantly taken to moderate and decrease the influence of malicious bots, they can still cause harmful topics to gain more traction than they would naturally.
Conspiracy theorists often avoid content moderation policies or bans by creating alternate accounts (“alts”), which serve as backups promoted on a user’s main channel. Logically Facts noted that Call ran at least eight alts alongside his main channel. According to Hannah, some of these channels initially avoid conspiracy content: while several focused on gaming, another pretended to be a trans content creator, though it was used to reinforce transphobic stereotypes. On YouTube, advertisers bid to have their ads shown on channels with the right hashtags, and a channel must reach a monetization threshold before earning ad revenue. Call creates these “gaming” channels, then switches back to conspiracy content as soon as he has hit that threshold. Hannah stated that he has done this at least four or five times; however, these alternate channels get banned once they reach about 80,000 subscribers. In March, when his main channel was banned, most of his alternate channels were also deleted.
In response, Call made two new YouTube channels and a Rumble channel. His viewership has been drastically reduced, although he claims this won’t stop him from trying to rebuild. Call is likely neither the first nor the last to use this strategy, and, if undetected, such channels can pose significant difficulties for moderating harmful content, especially when they initially launch as “gaming channels” with non-conspiratorial content. Logically Facts reached out to YouTube for comment on its reasons for the removal but did not hear back.
Call also uses a “self-destruct” approach to uploading content, in which videos are deleted a few days after publishing, a tactic often used by QAnon creators to evade YouTube’s policies and remain undetected. The aim is to “game” the platform’s rules by removing policy-violating content before it can be found and trigger a channel suspension. In Call’s case, his channel would often look dead, inactive, or only recently reactivated, with at most one or two recent videos visible; those videos would then be deleted about five days later, evading detection. Call’s tactics hint at a broader problem of adaptation; while he is not unique, his case shows how online conspiracy theorists continuously adapt their strategies to evade moderation policies.
Evading content moderation is not all that has allowed Call to grow his audience: he has also ridden the wave of a recent spike in online antisemitism. Research has shown that antisemitism is an increasing trend in online discourse, and there has been a major spike in antisemitic posts following Twitter’s acquisition by Elon Musk. Examples include antisemitic comments from Kanye West in late 2022 and a surge of TikTok posts about the neo-Nazi documentary “Europa: The Last Battle” gaining high traction. This uptick allows conspiracy theorists like Call to use the issue to extend their reach. Call supported West’s comments but claimed the episode was likely a psyop, with West as a puppet. Beyond his response to West, Call often asserts that “Satanists and Jews are the same.”
Call also feeds into a form of moral panic prevalent in conspiracy theories and overlapping with Western politics: an evolution of the Satanic Panic that began in the 1980s and has recently seen a resurgence through QAnon. Call asserts that a “judgment day” is coming and that all of his previously mentioned conspiracies relate to a satanic deep state with an agenda to eradicate Christianity. This moral panic further manifests as bigotry and a fear of moral decay, expressed through homophobia, transphobia, and antisemitism.
Call is part of a wider online problem: rising antisemitism and conspiracy theories pushed alongside the rhetoric of a Christian moral panic, exacerbated by the moderation failures of large-scale platforms. These conspiracies continue to circulate online, and Call’s case shows the variety of tactics used to evade content moderation and further spread such claims. Importantly, these conspiracies have real consequences: adherents target minorities and vulnerable people, including the LGBTQ+ community, by associating them with what is considered “moral decay.” Resolving this is no easy task, and with the recent layoffs at Twitch, part of wider Amazon layoffs, the current method of moderation will likely continue without much being done to address the problem of harmful or conspiratorial messages and links spreading through chat.