CW: hate speech (antisemitism)

The giddy aggression of the mob that stormed the Capitol building in Washington on the 6th of January was palpable even through a TV screen. As the ‘domestic terrorists’ scrambled up walls, smashed windows and stole the U.S. Speaker of the House’s lectern, they no doubt felt a divine power, a hate-driven high, but for many of the more radical participants the excitement also came from being part of an internet community materialising into a real-life mob. As CNN correspondent Elle Reeve said, “When this huge swarm of people who’ve been active online finally get to meet each other in person… there’s this thrill of it and it’s very high energy”. 

“[It is] hard to overstate how online this mob is,” read one tweet responding to the events on the 6th. The author, New York Times technology columnist Kevin Roose, attached an image of a mob-goer in a Pepe the Frog mask, its bulbous green shape standing out in a sea of red MAGA caps and cameras. Pepe, a cartoon frog that first appeared in 2005, has now become a mascot of the far-right online (against the wishes of its creator), proliferating in memes on platforms such as 4chan, 8chan and Reddit. Where is the line between viral jokes, threats, symbolic violence and real-life violence? And how did memes come to be used to spread insidious underlying messages and hate speech?

The Anti-Defamation League describes the ‘alt-right’ as a group “who regard mainstream or traditional conservatives as weak and impotent, largely because they do not adequately support white racial interests, or are not adequately racist or antisemitic”. Built on anger and fuelled by prejudice, alt-right ideologies are so saturated with hatred that they make for an exhausting world to research. Online, the cultural identities of various neo-fascist, neo-Nazi, xenophobic or chauvinist groups merge, change and are reified in insular internet communities. Because of the nature of these communities, and of changing alt-right internet lexicons, identities, symbols and platforms, it can become difficult to distinguish between ultra-conservative and ‘alt-right’ ideology. 

The spread of alt-right ideologies has been massively galvanised by the nature of modern communication platforms. Since the early 2000s, far-right communities have proliferated on message boards like 4chan and 8chan, anonymous sites on which bigotry can be nameless, faceless, and often consequence-free. Like-minded political extremists form communities that either construct or reinforce group identities. Dr Julia R. DeCook describes the Proud Boys, a far-right neo-fascist male-only organisation, as “distinct from other neo-conservative movements because of their heavy strategic use of social media” both for recruitment and identity reinforcement. It is telling that Proud Boys founder Gavin McInnes, co-founder of Vice magazine, has significant expertise in mass communication techniques. Unfortunately, this alt-right pull is coming from all directions. Though we may think that insidious ideologies such as white supremacy and antisemitism are confined to the dark enclaves of the internet, the far-right have been active on all mainstream media fronts: Instagram, Twitter, Facebook and, more recently, TikTok. Sometimes, Nazi propaganda is only a hashtag away. 

The recruitment technique is simple, and targeted particularly at young ‘outsiders’ – online communities are designed to provide them with an identity, a community and a purpose. Gil Noam, an associate professor of psychology at Harvard Medical School and McLean Hospital, argues that extremist recruiters “certainly understand that a child at this age [between 11 and 15] is more likely to respond to the pull of community and a sense of purpose, even if they don’t readily identify with a group’s core message”. With multiple content formats, such as text, video and audio, the internet has become the most user-friendly political medium for young people. Online platforms have the potential to mould a young person’s understanding of the world; as Heidi Beirich of the Southern Poverty Law Center said, “TikTok panders to children”. Innocent content and propaganda are often indistinguishable to many young people, particularly when they are presented in the same online environment. “They see cool videos, then they see racist things or content calling [white supremacist mass murderer] Dylann Roof a hero, and they’re going to end up going down a really bad path,” she adds. An anonymous author writing in the ‘Washingtonian’ magazine alleged that her son, struggling socially, “found people to talk to on Reddit and 4chan”. The author’s son was allegedly radicalised by these forums at the age of 13, and was so active on his favourite subreddit that the other group leaders, unaware of his age, appointed him a moderator. All over the alt-right online community ages mix, and so do truth and invented reality. 

One of the adult Trump supporters outside the Capitol on the 6th of January wore a jumper depicting Biden as a postman working for ‘FraudEx’. It had allegedly been designed by a high-school student called ‘Shaker’, who later gloated about it on his popular TikTok account, which had 86.3K followers at the time of writing. On Instagram, ultra-conservative accounts with thousands of followers are run by Americans in their early or mid teens, who make memes about the ‘fraudulent’ election or anti-feminism. While these accounts have no explicit neo-Nazi content, you can’t help but wonder if these individuals would hold such radical viewpoints at such a young age had it not been for their exposure to radical politics online, and the momentum they get from their thousands of supporters. It is easy to see how such early ultra-conservatism might be a stepping stone to a more radical ideology and to violence. 

It is also important to consider how these websites function. Personalisation is highly significant; algorithms help to land every user in an echo chamber, a bubbled experience making online communities firmly insular. As a result, the alt-right is able to create its own unquestioned reality, built from memes, propaganda and opinions, challenged only by some supposed ‘trolls’ in the comments, quickly dismissed as ‘liberals’ or ‘normies’, or not present at all on private accounts and encrypted Telegram messages. Thus the Proud Boys and other extremist groups represent reality and current affairs on their own terms, and the line between truth and invention is completely blurred. Arguably, the rioters at the Capitol represent a group driven to violence by invented, fabricated realities – in this case the claims of ‘fraud’, with no real evidence found thus far, asserted by Donald Trump and disseminated online by Republicans and the alt-right. Though this example involves more moderate Republicans and not just the alt-right, Elle Reeve argues that there are many more ‘regular people’ now who believe extreme false realities, and that as a culture we have not grappled “with the way social media is a brainwashing machine”.

Not only is it brainwashing, it is addictive. This is potentially the first time in history that people, especially young people, are becoming addicted to their source of propaganda. Throughout history propaganda has been administered in nefarious and subtle ways – through movies, children’s textbooks and advertising – but never before have we been so unable to peel ourselves away from it. John Koetsier’s description of TikTok in Forbes as ‘digital crack cocaine’ becomes particularly terrifying when you start to consider the political radicalisation that happens on these platforms, and the bitesize, digestible propaganda they offer users. 

It should be said that all the mainstream social media sites monitor and censor their platforms, and frequently take down content or block accounts. But with a changing lexicon, and temporary or private posts, no site can be Nazi-free. Or not successfully so far. 

In the context of the alt-right’s world view, its hatred, its anger and its violence, it is at first difficult to see how memes fit in. Yet the alt-right have harnessed memes and humour as some of their most powerful tools for reinforcing their own identity and converting others. So there are definitely Proud Boys laughing online, and they’re laughing together. Dr Julia R. DeCook has described memes as serving “as a way of establishing cultural capital”. Symbols and images such as Pepe are shared and reused until an alt-right language is formed, and it is changing all the time. By the time ‘normies’ catch on, new terms and symbols have been rolled out. Alt-right memes strengthen their own sense of ‘in-groups’. Humour blurs the line between truth and reality, giving racists, sexists and ultranationalists a defence: when Proud Boys circulated ‘hunting permits’ for killing members of Antifa (the left-wing anti-fascist and anti-racist movement), they were just a bit of ‘fun’. 

The Style Guide for the Daily Stormer (a neo-Nazi alt-right blog) that was leaked a few years ago offers a painful insight into how the alt-right intentionally blur humour and hate speech online. Indeed, this strategy is presented with ironic clarity. The document instructs that ‘the unindoctrinated should not be able to tell if we are joking or not,’ before adding ‘this is obviously a ploy.’ With humour used as a means to soften indoctrination, the Daily Stormer sets out to pack its message “inside of existing cultural memes and humor” which “can be viewed as a delivery method. Something like adding cherry flavor to children’s medicine.” The reference to administering medicine to children is particularly chilling, and lays bare the insidiously manipulative force of the alt-right, moulding what they believe to be malleable, vulnerable minds. 

In these environments users are engulfed by alt-right memes and alt-right commentators, leading them to become both desensitised and indoctrinated. Worse still, humour comes with the underlying suggestion that the implied message is a given, lending alt-right prejudices a semblance of credibility. Again the Daily Stormer epitomises this when it says: “generally, when using racial slurs, it should come across as half-joking – like a racist joke that everyone laughs at because it’s true”. ‘Poe’s law’, an adage of internet culture, states that without a clear indicator of the author’s intent, it is impossible to distinguish between real expressions of extremism and satirical expressions of extremism. Alt-right groups thus alternate between humour and hate speech, and between ‘symbolic’ violence and real violence, ultimately blurring the line between the two. 

Memes also work on a sliding scale, coaxing viewers slowly into an ideology with which they might never have previously identified. As the Style Guide explains, “The reader is at first drawn in by curiosity or the naughty humor, and is slowly awakened to reality by repeatedly reading the same points.” Initially, a young person might be smirking at questionable or ‘politically incorrect’ content, brushed off as ‘harmless satire,’ but is eventually guided into an internet space in which Islamophobia, misogyny or antisemitism is explicit. The viewer is gradually desensitised, and slowly radicalised. 

What the storming of the Capitol has reminded us is that the seeds of violence, sown by the alt-right throughout the internet, do not stay embedded harmlessly in the soil. The Washington Post has described how the Capitol siege was planned online, with conversations in far-right forums explicitly discussing how to storm the building, handcuff lawmakers with zip ties and disrupt the certification of Joe Biden’s election. More subtle, and in many ways more dangerous, memes posted by alt-right individuals have also proved to be a stepping stone to real-life violence. A report written by Maura Conway, Ryan Scrivens and Logan Macnair revealed that the man who drove his car into counter-protesters at the Charlottesville ‘Unite the Right’ rally in 2017 had shared two memes on Instagram involving a car driving through protesters, one captioned “You have the right to protest, but I’m late for work.” This is truly terrifying content. According to the report, this meme, along with many others, was generated on 4chan and then spread to more mainstream sites, such as Instagram. Thus, this insidious alt-right propaganda works like a parasite, only aided in its spread by the sharing and reposting made effortless by social media machinery.

So what is the answer? It isn’t necessarily censorship. 

Last week, Trump was permanently suspended from Twitter, along with some of his supporters such as Sidney Powell and Michael Flynn. Some QAnon accounts were also among those culled in response to the storming of the Capitol. Jack Dorsey, Twitter’s CEO, justified this decision in a lengthy series of tweets, arguing that “offline harm as a result of online speech is demonstrably real” and that it was a necessary step to prevent the president from inciting further violence. The social media company went into great detail in a blog post as to how the President had spread false information and could be seen to have provoked violence prior to and in the aftermath of the Capitol storming. Trump’s Instagram and Facebook accounts have also been suspended indefinitely. 

Unfortunately, however, whilst censoring Trump solves the immediate problem of preventing further calls to violence, censorship has never been a permanent solution to the spread of bigotry and hate speech online. Many racist Facebook and Twitter users who are banned from the platforms, such as Faith Goldy or Tim “Baked Alaska” Gionet, simply move to a newer platform. For Goldy and Gionet it was TikTok; for others it was the ‘free speech’ platform Parler. American Republican politician Ted Cruz said that Parler “gets what free speech is all about”; it is no surprise that this is one of the sites on which the rioters planned their attack on the Capitol. Parler CEO John Matze, when interviewed by Kara Swisher in the aftermath of the storming of the Capitol, admitted that he doesn’t “necessarily monitor a lot of this stuff. [He] participate[s] and watch[es] Parler just as anyone else does”, with a random ‘community jury’ of users instead having responsibility for monitoring content that likely isn’t too far from their own political ideology. 

Expulsion from mainstream sites simply leads to insular far-right communities migrating to sites around which they can construct thicker walls. Parler jumped to number one on the App Store following Trump’s removal from Twitter, and while it has recently been booted off Apple, Google and Amazon, it is ultimately just one alternative. Alt-right communities know the internet well, and will always find a place to congregate. Telegram (dubbed ‘terrorgram’), as well as Gab, Rumble and Newsmax, add to the list of alternatives – this cycle has become an arms race. 

On Twitter, in response to Trump’s removal from the site, social media site Gab posted a meme depicting Pepe as a boat in a storm, referencing Noah’s ark. The underlying message was that far-right online communities will prevail in the face of ‘adversity’ and find another vessel. Maybe it will be a new platform created by Donald Trump, or maybe by a teen TikToker online. 

Artwork by Mia Sorenti

