
4chan may be dead, but its toxic legacy lives on

My earliest memory of 4chan was sitting up late at night, typing its URL into my browser, and scrolling through a thread of LOLcat memes, which were brand-new at the time.

Back then, a photoshop of a cat saying “I can has cheezburger” or an image of an owl saying “ORLY?” was, without question, the funniest thing my 14-year-old brain had ever laid eyes on. So much so that I woke my dad up by laughing too hard and had to tell him that I was scrolling through pictures of cats at 2 in the morning. Later, I would become intimately familiar with the site’s much more nefarious tendencies.

It’s strange to look back at 4chan, apparently wiped off the Internet entirely last week by hackers from a rival message board, and think about how many different websites it was over its more than two decades online. What began as a hub for Internet culture and an anonymous way station for the Internet’s anarchic true believers devolved over the years into a fan club for mass shooters, the central node of Gamergate, and the beating heart of far-right fascism around the world—a virus that infected every facet of our lives, from the slang we use to the politicians we vote for. But the site itself had been frozen in amber since the George W. Bush administration.

There will likely never be a site like 4chan again—which is probably a very good thing. But it had also essentially already succeeded at its core project: chewing up the world and spitting it back out in its own image. Everything—from X to Facebook to YouTube—now sort of feels like 4chan. Which makes you wonder why it even needed to still exist.

“The novelty of a website devoted to shock and gore, and the rebelliousness inherent in it, dies when your opinions become the official policy of the world’s five or so richest people and the government of the United States,” the Onion CEO and former extremism reporter Ben Collins tells WIRED. “Like any ostensibly nihilist cultural phenomenon, it inherently dies if that phenomenon itself becomes The Man.”

My first experience with the more toxic side of the site came several years after my LOLcat all-nighter, when I was in college. I was a big Tumblr user—all my friends were on there—and for about a year or so, our corner of the platform felt like an extension of the house parties we would throw. That cozy vibe came crashing down the summer going into my senior year, when I got doxed: someone made a “hate blog” for me and posted my phone number on 4chan. It was one of the first times I felt the dark presence of an anonymous stranger’s digital ire.

They played a prank that was popular on the site at the time, writing in a thread that my phone number belonged to a GameStop store that had a copy of the ultra-rare video game Battletoads. I received no fewer than 250 phone calls over the next 48 hours asking if I had a copy of the game.

Many of the 4chan users who called me mid-Battletoads attack left messages. I listened to all of them. A pattern quickly emerged: young men, clearly nervous even to leave a message, trying to harass a stranger for, seemingly, the hell of it. Those voicemails have never left me in the 15 years I’ve spent covering 4chan as a journalist.

I had a front-row seat to the way those timid men morphed into the violent, seething underbelly of the Internet: the throbbing engine of reactionary hatred that resented everything and everyone simply because resentment was the only language its users knew how to speak. I traveled the world in the 2010s, tracing 4chan’s impact on global democracy. I followed it to France, Germany, Japan, and Brazil as 4chan’s users became increasingly convinced that they could take over the planet through racist memes, far-right populism, and cyberbullying. And, in a way, they did. But the ubiquity of 4chan culture ended up being an oddly Pyrrhic victory for the site itself.

Collins, like me, closely followed 4chan’s rise in the 2010s from Internet backwater to unofficial propaganda organ of the Trump administration. As he sees it, once Elon Musk bought Twitter in 2022 there was really no point to 4chan anymore. Why hide behind anonymity if a billionaire lets you post the same kind of extremist content under your real name and even pays you for it?

4chan’s “user base just moved into a bigger ballpark and started immediately impacting American life and policy,” Collins says. “Twitter became 4chan, then the 4chanified Twitter became the United States government. Its usefulness as an ammo dump in the culture war was diminished when they were saying things you would now hear every day on Twitter, then six months later out of the mouths of an administration official.”

But understanding how 4chan went from the home of cat memes to a true Internet bogeyman requires an understanding of how the site actually worked. Its features were often overlooked amid all the conversations about the site’s political influence, but I’d argue they were equally, if not more, important.

4chan was founded by Christopher “Moot” Poole when he was 15. A regular user of the slightly less anarchic comedy site Something Awful, Poole created a spinoff site for one of its message boards, “Anime Death Tentacle Rape Whorehouse.” Poole was a fan of the Japanese message board 2chan, or Futaba Channel, and wanted to give Western anime fans their own version, so he poorly translated the site’s code and promoted his new site, 4chan, to Something Awful’s anime community. Several core features were ported over in the process.

4chan users were anonymous, threads weren’t permanent and would time out or “404” after a period of inactivity, and there were dozens of sub-boards you could post to. That unique combination of ephemerality, anonymity, and organized chaos proved to be a potent mix, immediately creating a race-to-the-bottom gutter culture unlike anything else on the web: the dark end point of the techno-utopianism that built the Internet. On 4chan you were no one, and nothing you did mattered unless it was so shocking, so repulsive, so hateful that someone else noticed and decided to screenshot it before it disappeared into the digital ether.

“The iconic memes that came out of 4chan are because people took the time to save it, you know? And the fact that nobody predicted, nobody could predict or control what was saved or what wasn’t saved, I think, is really, really fascinating,” Cates Holderness, Tumblr’s former head of editorial, tells WIRED.

Still, 4chan was more complicated than it looked from the outside. The site was organized into dozens of smaller sections, everything from comics to cooking to video games to, of course, pornography. Holderness says she learned to make bread during the pandemic thanks to 4chan’s cooking board. (Full disclosure: I introduced Holderness to 4chan way back in 2012.)

“When I switched to sourdough, I got really good pointers,” she says.

Holderness calls 4chan the Internet’s “Wild West” and says its demise this month felt appropriate in a way. The chaos that defined 4chan, both the good and the very, very bad, has largely been paved over by corporate platforms and their algorithms now.

Our feeds deliver us content; we don’t have to hunt for it. We don’t have to sit in front of a computer refreshing a page to find out whether we’re getting a new cat meme or a new manifesto. The humanness of that era of the web, now that 4chan is gone, is likely never coming back. And we’ll eventually find out if that’s a good thing or a bad thing.

“The snippets that we have of what 4chan was—it’s all skewed,” Holderness says. “There is no record. There’s no record that can ever encapsulate what 4chan was.”

This story originally appeared on wired.com.



Why trolls, extremists, and others spread conspiracy theories they don’t believe


Some just want to promote conflict, cause chaos, or simply get attention.


There has been a lot of research on the types of people who believe conspiracy theories, and their reasons for doing so. But there’s a wrinkle: My colleagues and I have found that there are a number of people sharing conspiracies online who don’t believe their own content.

They are opportunists. These people share conspiracy theories to promote conflict, cause chaos, recruit and radicalize potential followers, make money, harass, or even just to get attention.

There are several types of these opportunistic conspiracy spreaders trying to influence you.

Coaxing conspiracists—the extremists

In our chapter of a new book on extremism and conspiracies, my colleagues and I discuss evidence that certain extremist groups intentionally use conspiracy theories to entice adherents. They are looking for a so-called “gateway conspiracy” that will lure someone into talking to them, and then be vulnerable to radicalization. They try out multiple conspiracies to see what sticks.

Research shows that people with positive feelings for extremist groups are significantly more likely to knowingly share false content online. For instance, the disinformation-monitoring company Blackbird.AI tracked over 119 million COVID-19 conspiracy posts from May 2020, when activists were protesting pandemic restrictions and lockdowns in the United States. Of these, over 32 million tweets were identified as high on its manipulation index. Those posted by various extremist groups were particularly likely to carry markers of insincerity. For instance, one group, the Boogaloo Bois, generated over 610,000 tweets, of which 58 percent were aimed at incitement and radicalization.

You can also just take the word of the extremists themselves. When the Boogaloo Bois militia group showed up at the Jan. 6, 2021, insurrection, for example, members stated they didn’t actually endorse the stolen election conspiracy but were there to “mess with the federal government.” Aron McKillips, a Boogaloo member arrested in 2022 as part of an FBI sting, is another example of an opportunistic conspiracist. In his own words: “I don’t believe in anything. I’m only here for the violence.”

Combative conspiracists—the disinformants

Governments love conspiracy theories. The classic example of this is the 1903 document known as the “Protocols of the Elders of Zion,” in which Russia constructed an enduring myth about Jewish plans for world domination. More recently, China used artificial intelligence to construct a fake conspiracy theory about the August 2023 Maui wildfire.

Often the behavior of the conspiracists gives them away. Years later, Russia confessed to its 1980s campaign of lies claiming that the AIDS virus had been engineered by the US. But even before admitting to the campaign, its agents had forged documents to support the conspiracy. Forgeries aren’t created by accident; their creators knew they were lying.

As for other conspiracies it hawks, Russia is famous for taking both sides in any contentious issue, spreading lies online to foment conflict and polarization. People who actually believe in a conspiracy tend to stick to a side. Meanwhile, Russians knowingly deploy what one analyst has called a “fire hose of falsehoods.”

Likewise, while Chinese officials were spreading conspiracies about American roots of the coronavirus in 2020, China’s National Health Commission was circulating internal reports tracing the source to a pangolin.

Chaos conspiracists—the trolls

In general, research has found that individuals with what scholars call a high “need for chaos” are more likely to indiscriminately share conspiracies, regardless of belief. These are the everyday trolls who share false content for a variety of reasons, none of which are benevolent. Dark personalities and dark motives are prevalent.

For instance, in the wake of the first assassination attempt on Donald Trump, a false accusation arose online about the identity of the shooter and his motivations. The person who first posted this claim knew he was making up a name and stealing a photo. The intent was apparently to harass the Italian sports blogger whose photo was stolen. This fake conspiracy was seen over 300,000 times on the social platform X and picked up by multiple other conspiracists eager to fill the information gap about the assassination attempt.

Commercial conspiracists—the profiteers

Often when I encounter a conspiracy theory I ask: “What does the sharer have to gain? Are they telling me this because they have an evidence-backed concern, or are they trying to sell me something?”

When researchers tracked down the 12 people primarily responsible for the vast majority of anti-vaccine conspiracies online, most of them had a financial investment in perpetuating these misleading narratives.

Some people who fall into this category might truly believe their conspiracy, but their first priority is finding a way to make money from it. For instance, conspiracist Alex Jones bragged that his fans would “buy anything.” Fox News and its on-air personality Tucker Carlson publicized lies about voter fraud in the 2020 election to keep viewers engaged, while behind-the-scenes communications revealed they did not endorse what they espoused.

Profit doesn’t just mean money. People can also profit from spreading conspiracies if doing so garners them influence or followers, or protects their reputation. Even social media companies are reluctant to combat conspiracies because they know conspiracies attract more clicks.

Common conspiracists—the attention-getters

You don’t have to be a profiteer to like some attention. Plenty of regular people share content whose veracity they doubt, or that they know is false.

These posts are common: Friends, family, and acquaintances share the latest conspiracy theory with “could this be true?” queries or “seems close enough to the truth” taglines. Their accompanying comments show that sharers are, at minimum, unsure about the truthfulness of the content, but they share nonetheless. Many share without even reading past a headline. Still others, approximately 7 percent to 20 percent of social media users, share despite knowing the content is false. Why?

Some claim to be sharing to inform people “just in case” it is true. But this sort of “sound the alarm” reason actually isn’t that common.

Often, folks are just looking for attention or other personal benefit. They don’t want to miss out on a hot-topic conversation. They want the likes and shares. They want to “stir the pot.” Or they just like the message and want to signal to others that they share a common belief system.

For frequent sharers, it just becomes a habit.

The dangers of spreading lies

Over time, the opportunists may end up convincing themselves. After all, they will eventually have to come to terms with why they are engaging in unethical and deceptive, if not destructive, behavior. They may have a rationale for why lying is good. Or they may convince themselves that they aren’t lying by claiming they thought the conspiracy was true all along.

It’s important to be cautious and not believe everything you read. These opportunists don’t even believe everything they write—and share. But they want you to. So be aware that the next time you share an unfounded conspiracy theory, online or offline, you could be helping an opportunist. They don’t buy it, so neither should you. Be aware before you share. Don’t be what these opportunists derogatorily refer to as “a useful idiot.”

H. Colleen Sinclair is Associate Research Professor of Social Psychology at Louisiana State University.

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The Conversation is an independent source of news and views, sourced from the academic and research community. Our team of editors work with these experts to share their knowledge with the wider public. Our aim is to allow for better understanding of current affairs and complex issues, and hopefully improve the quality of public discourse on them.
