At first glance, the videos don’t seem that different from anything else users might find on TikTok. There are a lot of memes, lip-syncing, and badly synchronised dancing. But the dancers are in combat gear, the lip-syncs are accompanied by M-16s, and the memes are about killing ‘alphabet bois’, slang for FBI and ATF agents. Welcome to the world of the Boogaloo, the bizarre extremist movement that, despite all attempts to stop it, is still spreading on TikTok.


American Civil War II?

At its core, this loosely organised movement is a mix of gun advocates and libertarians, united by their belief in a fast-approaching second civil war: the eponymous 'Boogaloo'. Often seen wearing Hawaiian shirts and combat gear, groups of Boogaloo bois infiltrated Black Lives Matter protests last year in an attempt to use them as a catalyst for the conflict. Since then, Boogaloo has been linked to several cases of real-world violence, including the murders of a security officer and a sheriff's deputy by Air Force sergeant Steven Carrillo. Carrillo, who was extremely active on Boogaloo Facebook pages, saw his attacks as the beginning of a revolution that would bring about this second civil war.

Unlike traditional extremist ideologies, Boogaloo is an almost entirely online movement. Even its name comes from an internet joke riffing on the title of the 1984 film Breakin' 2: Electric Boogaloo, with memes and 'shitposting' playing a central role in online Boogaloo communities. This overlap with wider internet culture has been key to Boogaloo's growth, and nowhere is this more true than on the video-sharing app TikTok.

Bypassing TikTok’s censors

Since it exploded in popularity almost three years ago, TikTok has struggled to contain the spread of extremist material. 'Far-right extremist movements are utilizing TikTok as an effective recruitment tool because TikTok's audience is younger (and subsequently more vulnerable),' said Olivia Little, a researcher for Media Matters for America. The app, which has been downloaded more than 2 billion times, has attempted to crack down on far-right extremism over the past year. However, even a cursory search of TikTok reveals a wealth of Boogaloo-inspired content, suggesting that the movement is largely evading content moderation.

As social media platforms cracked down on extremism following the January 6 Capitol insurrection, Boogaloo social media accounts began avoiding traditional hashtags such as 'boogaloo' and adopting variations like 'big igloo' and 'big luau' to avoid being banned. This strategy has proved effective on TikTok, where posts using the 'boogaluau' hashtag have amassed over 55,000 views. One post, from an account with over 70,000 followers, features the account holder speculating about possible second civil war scenarios. Another account, with 1.3 million likes, joked about using stimulus checks to buy 'tannerite (an explosive) and ammo'.

The #killdozer tag, which has 15 million views on TikTok, is also popular with Boogaloo adherents. Most posts that use this hashtag are not Boogaloo related, allowing Boogaloo accounts to 'camouflage' their content among them by utilising references to Marvin Heemeyer, a Boogaloo icon known as 'killdozer' who destroyed much of the Colorado town of Granby with a modified bulldozer in 2004. One user who posts under #killdozer describes himself as an 'Alphabet boy hunter' in his bio. His account is full of jokes about violent standoffs with the ATF and FBI, often jarringly accompanied by lip-syncs and pop songs.

Unlike some other social media sites, TikTok has shown little hesitation in taking down extremist accounts once it has been made aware of them. However, a search of the #fedboi tag reveals numerous Boogaloo accounts with '2.0' or 'v2' in their usernames, suggesting that banned users are simply making replacement accounts. It's not hard to see why these accounts are being targeted — posts under #fedboi walk the line between edgy humour and open violation of TikTok's content rules. One user who posts under this hashtag regularly uploads memes about killing 'alphabet bois' with tannerite and 'claymore rumbas', a common theme among posts using the #fedboi tag. What makes this account unusual is that it also includes a post titled 'cracked chem 101' with instructions on how to make thermite, an incendiary substance.

How do you curb a dangerous trend?

This is indicative of a dangerous trend among Boogaloo accounts, where it can be hard to differentiate between a joke and a legitimate call for violence. Researchers for the Digital Citizens Alliance and the Coalition for a Safer Web recently claimed they were 'more likely to find domestic extremists sharing a capacity for violence' on TikTok than on other platforms. Their report suggested that extremist movements such as Boogaloo have become adept at camouflaging their content in order to 'trick' the AI overseeing content moderation. There is abundant evidence of this on TikTok, with Boogaloo accounts using hashtags such as #freedom, #liberty and even #worththewait to advertise a June 12 rally at 'all-state capitols'. The report proposes a relatively simple solution to this problem: hire more human beings. It argues that, unlike most AI, trained content moderators possess the nuance to differentiate between 'a dangerous militia member hijacking the term "patriot", and someone who is a New England Patriots fan'.

In the aftermath of the report, TikTok said it would expand restrictions on certain hashtags, but it is unclear whether the platform is able or willing to tackle the deeper causes behind the spread of extremist material.

'As we have reported, there is evidence to suggest that TikTok's algorithm picks up and recommends far-right extremist content, which contributes to its ability to circulate so widely,' said Little. Media Matters for America found that TikTok's recommendation algorithm was pushing users curious about conspiracy theories towards QAnon, anti-vaccination and anti-Semitic content. This suggests TikTok has a larger, more structural problem on its hands, one that can't be solved by banning accounts and removing videos alone.

'TikTok needs to consistently enforce their own policies about extremist accounts and content, work with organisations that deeply understand extremist movements, and learn from the failures of older social media platforms,' argued Little, adding: 'One-off takedowns aren't effective unless they are part of a broader strategy.'