“Dollars” is modest when asked what impact his forum has had on visitors from 4chan or 8chan. “I think that a solid 15% might learn something, think twice.”
He gets a little more bullish: “Maybe as high as 30%.”
Dollars, who asked to remain anonymous for his own safety, is a veteran of some of the most contentious sites on the internet. For more than 15 years, 4chan and its offspring have become renowned, and eventually infamous, for occupying prime internet real estate. The image boards offer all manner of community for millions of young men (and a few women) around the world. From porn to politics, anime to the occult, the image boards allow users to chat with each other and swap images, all anonymously.
4chan and the other “chans” are interwoven with the chaotic history of the internet. They are where the activist group Anonymous first began. They churned out cat memes and child pornography alike. 4chan was the birthplace of the QAnon conspiracy movement, while 8chan has been instrumental in growing the sprawling cult. They both helped foster the incel (“involuntary celibate”) movement.
As 8chan became the hot spot for mass shooters and domestic terrorists to upload their manifestos and crimes, there were calls to have the site knocked offline (by whom, exactly, it wasn’t clear), and those calls folded quickly into a debate about repealing the U.S. internet law Section 230. And there were free-speech defenses from those who found the language on the chans abhorrent but believed it should be protected all the same. That tug-of-war continues.
But it was the online denizens pressuring web infrastructure companies to cut off 8chan who ultimately proved most effective. For a time.
The playbook has now been repeated for a raft of troublesome websites: the neo-Nazi Daily Stormer, Twitter clones Parler and Gab, and a constellation of QAnon fan pages.
Each has been targeted and taken offline, only to spring up again with the help of an ideologically friendly web host, sometimes outside American jurisdiction. And they’re still churning out angry, radicalized young men.
Dollars may have part of the answer to that, and it comes from within 8chan itself: a board that, until last year, was a mainstay of the site and has since spun off into its own website, Bunkerchan.
What 8chan is to far-right politics, Bunkerchan is to Marxism and left-wing causes. Dollars, who runs Bunkerchan, and his comrades raise an interesting possibility: Can online radicalization be prevented by old-fashioned political debate?
In recent years, as commentators spouting racist, xenophobic, transphobic, antisemitic, and conspiratorial beliefs have found themselves increasingly banned or unwelcome on Facebook, Twitter, and elsewhere, they have found refuge on 4chan and 8chan. Though the sites were once somewhat politically diverse, today users not keen on hard-right politics have fled, which has only reinforced the ideological uniformity. The /leftypol/ board clung on longer than most, but the growing toxicity, coupled with a series of service interruptions, eventually pushed them to create their own website: Bunkerchan.
Bunkerchan looks pretty close to its predecessors. The big difference is the logo: Two wreaths surround three gray turrets, with a red-and-black star above. “Welcome to Albania,” the landing page proclaims. Dollars explains the tagline is a reference to Enver Hoxha, the longtime leader of communist Albania who split with the Soviet Union in an ideological falling-out. Hoxha built upwards of 170,000 bunkers throughout the country to guard against external invasion, and those bunkers have become a popular meme on far-left internet forums today.
In some ways, Bunkerchan is just the left-wing answer to 8chan. But its differences are stark.
“Our rules are very specific. Spamming, low-effort posts are against the rules, as is trying to cloak reactionary ideas in pro-worker language,” Dollars said. In practice, these rules mean antisemitism, homophobia, and other offensive posts, not unknown among online far-left armchair theorists, are rare and fleeting. “But if you’re a /pol/ user, even a Nazi/fascist, we’ll debate you seriously and try to convince you that you’re wrong.”
In 2018, researchers set out to test the impact of online political bubbles. They went to Reddit to analyze two different subreddits: r/The_Donald, for supporters of Donald Trump, and r/HillaryClinton. Both communities, the researchers noted, enforced their homogeneity: Criticism of the preferred candidate was, effectively, banned.
The researchers found users were quite happy in their clubhouses. “In general, supporters appear more open, happy, and comfortable in politically homogeneous environments than when they interact in cross-cutting communication spaces,” they wrote.
And yet users didn’t stick to their safe spaces. Upwards of three-quarters of r/HillaryClinton users and more than 60 percent of r/The_Donald posters remained active in more general-interest subreddits, meaning they moved in spaces not uniquely geared toward their worldview. That included subreddits where both the Trump stans and the Clintonistas might meet.
“We find that only a minority of users active in politically homogeneous communication limit their participation exclusively to such environments,” the researchers concluded.
The finding makes sense. Completely isolating yourself from opposing viewpoints online is quite difficult. You can’t wall off the comment section of your favorite video or filter out the replies to a funny tweet by ideology.
When the threat of digital cocoons emerged as a real problem, companies took steps to fight that trend. YouTube tinkered with its algorithm to stop it from recommending conspiracist and extremist videos. Facebook has increased its moderation of all groups, even private ones. Twitter recommends users follow accounts from outside their narrow worldview, Reddit’s front page offers a diverse network of interests and topics, and so on. This all came quite late, and only amid immense public pressure, but it’s hard to ignore the deflationary effect this has had on digital extremism.
But whereas users, even radicalized ones, on Twitter, Reddit, and Facebook are naturally exposed to mainstream culture and competing viewpoints, users on Gab or 8chan are on a steady diet of information that confirms their ideology. The uniformity of hard-right philosophy isn’t a bug on these websites; it’s by design.
Halfway deplatforming these websites creates a whole new problem. “Well, now, they’re in these closed ecosystems, talking to themselves, largely,” said Amarnath Amarasingam, an assistant professor of religion at Queen’s University and a researcher on radicalization and extremism.
“I think part of the challenge for people working in this area, there’s an assumption from the public that it has to be all or nothing.”
The West is beginning to learn this lesson in its frustrating effort to deprogram Islamic State recruits. “A lot of programming has shown some level of success,” Amarasingam said. “But it depends how you define success.” Even when an extremist ideology’s hold on someone has been broken, that progress can unravel once they find themselves ensconced in the same chatrooms or watching the same videos. They show a penchant for falling back down the same rabbit holes.
Dollars and his comrades aren’t the only ones trying new tactics. When Natalie Wynn got her start on YouTube, she said, it was mainly “center-right content that was rapidly racing toward far-right nationalism.”
Wynn has been called the “Oscar Wilde of YouTube.” An aspiring philosopher turned Ph.D. dropout, she has built a massive following thanks to her lengthy, entertaining, sharply written monologues about politics, gender, and extremism.
While YouTube has always been the go-to platform for prank videos and music videos, its early political culture was idiosyncratic. It was a popular home for atheists and biologists to take on the growing debate around the role of religion in American life. There were some left-leaning and feminist vloggers, but it was hard to pinpoint any particular YouTube ideology.
“It started with, ‘We’re destroying creationists with facts and logic,’” Wynn said.
Then came Gamergate. Video game fans, under the less-than-convincing battle cry of “ethics in games journalism,” launched a crusade against women in the video game and media industries. The campaign gave cover for a rise in anti-feminist agitation, which can be seen clearly in the growth of YouTube’s right-wing content. Ben Shapiro, Jordan Peterson, Steven Crowder, Lauren Southern, and Milo Yiannopoulos all saw their stars rise amid the counterculture backlash of Gamergate.
Wynn said the new ethos was: “‘We’re destroying feminists with facts and logic.’”
As YouTube became home to a burgeoning club of young, slick, reactionary identitarians, the platform grew less friendly to those who didn’t fit into the new order. Feminists and liberals suddenly found themselves unwelcome. Videos that didn’t fit the new YouTube doctrine were given a “thumbs down” and littered with nasty comments.
“A lot of them didn’t last very long,” Wynn said. By the time she created her YouTube account, ContraPoints, in 2016, Wynn said, “there was no one really questioning these people.”
Wynn approached it as a challenge. She sought to mimic these reactionaries’ own tactics: the humor, the sleek editing, the lack of political correctness. She was going to use them to her own advantage. The twist was that Wynn, a socialist trans woman, would turn the format back on the people who had popularized it. She began releasing videos decoding the symbolism of the alt-right, providing clear explanations of the usefulness of gender pronouns, and railing against capitalism.
“Instead of destroying Jordan Peterson with facts and logic,” she said, laughing, “I’m seducing Jordan Peterson with facts and logic.”
Wynn soon discovered that the viewers who flocked to Peterson, Shapiro, and the whole lot weren’t ideological purists. “So much of what they hate about feminism, or whatever liberation movement, is the stuffiness,” she said. “They like being the edgy boys.” So, Wynn set out to “be more edgy than they are.”
Rather than hectoring or berating her viewers, she made them laugh at themselves.
“If you can entertain them, they’ve gone from laughing at a trans person to being entertained by a trans person,” Wynn said.
She said most of her early viewers were probably on the right wing of the spectrum. They were on YouTube to confirm their own worldview, only to be disrupted by ContraPoints.
Wynn, with over a million subscribers, hasn’t surpassed some of the larger right-leaning YouTube personalities, but she and others like her have ushered in a more diverse political culture on YouTube. “When I really think back to how it was in 2016, this has succeeded beyond my wildest dreams,” she said.
“There’s now practically a Jordan B. Peterson-dunking industrial complex,” she added jokingly.
Now, she and Ben Shapiro might be in competition for the very same eyeballs. “It’s a free marketplace of attention,” she said. And that has a moderating influence on everyone.
“2020 has been an interesting experiment online,” Wynn noted at the end of last year. “For a lot of people, online increasingly is everything.” It means that competition for viewers has grown all the more fierce.
That quest for views, which in turn corresponds to advertising dollars, delivered YouTube from being a no-go zone for anyone who didn’t subscribe to the reactionary, anti-feminist, anti-trans, anti-left movement. It may not perfectly reflect the diversity of the political spectrum now, but it’s not quite as monolithic as it used to be.
“It’s a check on letting an ideology grow into a mutant,” Wynn said.
The most common thread on Bunkerchan isn’t the come-to-Jesus moment its users experienced reading Marx or becoming enlightened about the class struggle. It’s the moment they finally realized how toxic their former stomping grounds really were.
“I can’t tell whether I changed or 4chan changed,” one user wrote. They recounted joining the board in 2010 and, like a frog in slowly heating water, not noticing the steady tone shift around them. “It’s a weird mixture of hostility and abject boredom,” they wrote. Looking back on their former clubhouse, 4chan, they said it “feels like an explicit Nazi site now.”
One user recounted their own obsessive, extremist political journey, through the “right wing pipeline” of “holocaust denialism, the Jewish question, cultural Marxism”—all that during their high school years.
The user, who claimed to be both Black and on the autism spectrum, recounted stumbling across /leftypol/ and “realized that liberals are not leftists.” It had been easier to demonize socialists and liberals when they could be avoided entirely on 4chan.
What these stories share, however, is that it wasn’t a thoughtful Facebook post or Twitter thread that broke their 4chan or 8chan addiction: It was Bunkerchan. It seems many were waiting for an alternative that met them where they already were. Bunkerchan and 8chan aren’t on the same website anymore, but they are still kindred spirits. Neighbors, almost. From the site infrastructure to the edgy humor, Bunkerchan feels like 4chan. Just without the Nazis.
Bunkerchan users still use 8chan and 4chan on occasion—sometimes for mischief, like one user who managed to become the moderator of a QAnon board, using their power to ban swastikas and set up word filters to eliminate racial slurs.
“A part of it was that we saw it as our responsibility to try and sway /pol/ users who’d stop by to our side,” Dollars told me.
This is the sort of ideological disruption that can’t come from the top down. Moderation might keep a platform clean of beheading videos and appeals to violence, but it doesn’t do much to change the mind of the person posting that content.
“There’s always gonna be forums for these people, they’ll find a way, and censoring them just feeds into their oppression narrative,” Dollars said. “What we try to do is show that there really is another way, and it comes from being principled.”
This sort of dialogue with fascists, antisemites, and neo-Nazis has become unpopular in recent years, and for good reason—oftentimes, engaging with extreme beliefs is a surefire way to make them seem less extreme and to package the hate for a wider audience.
But this isn’t primetime on CNN; this is engaging with fascists on their home turf.
It’s something Wynn summed up succinctly in one of her ContraPoints videos: “[Fascists] do not care about ‘diversity of opinions’—in fact, they actively oppose it.”
As Wynn noted in that video, now seen over 2 million times, debating Nazis and fascists doesn’t have to mean legitimizing their views. “Do not host debates with them on your YouTube channel,” she encouraged her viewers. But, she said, “if you’re just an average person without a big platform, then you actually can debate, and you should be debating. … Make them listen to a different perspective.”
Combating the rise of online extremism is going to require a diversity of tactics. Moderating and policing major social media sites is necessary. Deprogramming and anti-radicalization efforts will be needed for some extreme cases. But so, too, is making extremists listen to a different perspective.
It won’t always be perfect. Those looking to rage against the machine aren’t going to be taken in by calm appeals from U.S. President Joe Biden—sometimes, they’ll have more in common with a web forum that lionizes an Albanian dictator. It is a form of harm reduction to channel a deep-seated disaffection with Western society away from race war toward class war. Or, really, whatever alternative may be in front of them.
“A big part of what /leftypol/ does is kind of throw a wrench in the narrative that /pol/’s kind of built for itself,” Dollars said.
Right now, these users are out of sight and out of mind. Left to their own devices, they may continue sliding down a path of self-radicalization. If we’re going to combat online radicalization, we need to find a way to break these chains of self-victimization. And Bunkerchan and its allies might just be it.