The Online Spaces That Enable Mass Shooters

The eighteen-year-old who committed a racist killing spree in Buffalo last weekend spent many months developing his plans on the Internet.

The New Yorker/May 19, 2022

By Kyle Chayka

Before Payton Gendron carried out a racist mass shooting at a Tops grocery store in Buffalo, New York, last weekend, killing ten people and wounding three more, he spent many months developing his hate crime on the Internet. Logs from a weapons-focussed group called Plate Land on the social platform Discord, collected by the media nonprofit Unicorn Riot, reveal that he had been discussing the efficacy of body armor. “Wouldnt it hurt a lot being shot with any type of bulelt with armor on,” he wrote under the username Jimboboii, in early 2021. Thousands of other Discord messages that have circulated in recent days make his intentions crystal clear: “It’s time to stop shitposting and time to make a real life effort shitpost. I will carry out an attack,” Gendron, who is eighteen, declared, in December. At that time, he’d already identified March 15, 2022, as the date for his attack, referencing the third anniversary of the mass shooting in Christchurch, New Zealand, in which fifty-one people were killed. (In the end, he got delayed by a case of covid-19.) Gendron later posted a manifesto on the anonymous message board 4chan, explaining his motives, which centered on “replacement theory,” the false conviction that white people are facing genocide and being replaced by immigrants. His attack would “intimidate the replacers already living on our lands,” he wrote. During the shooting, he used a GoPro to live-stream on Twitch, a platform that is best known for allowing people to watch other people play video games. The stream was taken down after two minutes, a swift response on Twitch’s part, but the footage has since proliferated across the Internet and received millions of views through Twitter and other social platforms.

Gendron, who was indicted by a grand jury on Thursday for first-degree murder, was hardly the first mass shooter to make use of a disturbing kit of Web sites and apps to plan, execute, and document an attack. The Christchurch killer, in 2019, also live-streamed his rampage, on Facebook, for as long as seventeen minutes. A German mass shooter the same year used Twitch to broadcast his attack outside a synagogue; it was seen by more than two thousand people on the platform. Parts of Gendron’s writings were lifted directly from the Christchurch shooter’s own manifesto, which was posted on social media and has become a reference point for racist extremists. In El Paso, Texas, the far-right gunman who killed twenty-three people at a Walmart, in 2019, published a screed on the forum 8chan. Discord has reported that it banned more than two thousand communities that were “organizing around hate, violence, or extremist ideologies” in the second half of last year. Yet Gendron freely discussed the minutiae of his violent plans on the platform. What’s most chilling about his reams of messages is how racist bile and criminal scheming coexist alongside the kinds of mundane updates that can be found everywhere on the Internet. “the White population in the US will be REPLACED by shitskins,” Gendron wrote, using a racist slur. He announced his aim to “take back the cities” from nonwhite people and described his support for eugenics. He posted about assault-rifle magazines, live-stream-technology troubleshooting, and what he ate for lunch. “I had a very bad night of sleep, I would be working on my manifesto but I cant think right now,” he wrote. He recommended weapons stores (“I like the guy at Vintage Firearms”) and noted, of going to a flea market with a friend, “It was a nice break from preparing for the attack.” He researched when his targeted Tops grocery store, which is situated in a majority-Black neighborhood, was most likely to be full of locals: “3-5 PM is where it’s busiest according to google maps.” He tallied out loud how many people he hoped to kill: “I’m expecting 10-20 people dead from area 1, 5-10 people dead from area 2, and another 5-10 people dead from area 3.”

Discord has become a haven for Gen Z-ers, who use it to hang out with their friends online, but older generations who still rely on Twitter and Facebook may be wholly unaware of it. Like Twitch, which is owned by Amazon, Discord emerged from the gaming industry, marketing itself to players who needed to talk with one another in real time. (Video games were once blamed for influencing their players to commit violence; today it seems obvious that the content of games is less pertinent than the demographic that gaming Web sites attract: bored and isolated male adolescents.) Since Discord launched, in 2015, it has grown into an all-purpose tool for hosting various kinds of digital communities. More than a hundred and fifty million monthly users connect on the platform through chat, video streams, and live audio conversations. The Discord app looks a bit like the workplace tool Slack, if Slack allowed you to flip through many different groups at once. When you log in, the left side of your screen shows a vertical bar of avatar thumbnails representing various chat rooms to which you belong, plus direct messages with individual users. Once you’ve clicked into a group, you can select from different topical sub-channels. (Plate Land, for instance, had a #bag-general channel, for discussions of gear, and another called #weapon-talk.) The user experience is chaotic and geared toward obsessive engagement. Unless you’re monitoring your updates and notifications constantly, they’re likely to pile up into an incomprehensible mass. Picture trying to monitor a dozen different Twitter accounts at once in a single interface.

As others have pointed out in recent days, “lone wolf” is something of a misnomer for right-wing terrorists whose ideas and methods are being explicitly nurtured through online communities. Such extremists don’t become radicalized solely by perusing the automated algorithmic feeds that the rest of us see on Facebook or YouTube. They seek out forums for those who have similar views, follow charismatic voices, and egg one another on. A mass shooter who finds inspiration in Christchurch or encouragement in chat rooms isn’t a solo operator or a spontaneous “copycat” so much as a digital comrade-in-arms. Many of Discord’s servers, as they’re called, are open to anyone and thus searchable on a central directory, not unlike Reddit’s browsable channels, or subreddits. But many others are private, which means that they can be accessed only by those with a link from their administrators. Discord told me, through a spokesperson, that Gendron was posting his messages in an “invite-only server” that he used as a “personal diary chat log.” According to the spokesperson, no one else saw the contents of the server until roughly thirty minutes before the shooting, when “a small group of people” were invited to join. Discord wouldn’t comment on whether Gendron was active on the platform beyond Plate Land and his personal server. But even when he was purportedly posting privately, he seemed aware of his own potential to influence others. “Im quite uncomfortable giving out so much of my personal thoughts and feelings, but perhaps it’ll be useful for someone,” he wrote on Discord, on March 4th. In some ways, the forces that encourage mass shooters are bleakly similar to those that fuel the careers of any influencers, drawing passive content consumers into the orbit of particularly vocal posters.

In other ways, though, the digital tools that extremists favor are breaking with the dominant social networks of the past decade. Live-streamed videos are notoriously hard to moderate and censor, because they must be caught and removed in real time. Whereas YouTube requires users to have a fifty-person following before they’re permitted to live-stream, Twitch is designed to let anyone broadcast immediately. (The company said this week that Gendron “has been indefinitely suspended from our service” and that it is “monitoring for any accounts rebroadcasting this content.”) Discord, meanwhile, is part of a movement in social media to create smaller, more private online communities for those eager to flee the Internet’s chaotic “public squares.” But the flight to privacy, as it’s sometimes called, makes identifying problematic user activity more difficult. Watchdog groups and journalists often spot problematic content on digital platforms such as YouTube or Facebook before the platforms themselves act to remove it. That outside observation becomes much more difficult when content is hidden from public view, so platforms must be relied on to track communications internally. Unlike on WhatsApp or other encrypted-messaging services, Discord’s hired moderators can review private content. According to the company spokesperson, Discord has a devoted “Counter-Extremism sub-team” and uses a “mix of proactive and reactive tools,” including machine learning, to root out violent and hateful ideologies “before they are reported to us.” Yet Gendron freely posted about his violent plans up until just days before he carried them out. On May 1st, he wondered, “Maybe I should block the one of the backdoors of tops with my car? Just saw how the binghamton shooter did his attack, I haven’t even thought of blocking the doors.” On May 9th, he wrote, “No matter what do what is right for you people and your race.” Discord said that it only became aware of Gendron’s personal server and shut it down “immediately following the attack.” The question now is why the platform’s detection mechanisms so blatantly failed. (On Wednesday, New York attorney general Letitia James announced an investigation into Twitch, Discord, and other platforms in connection with the shooting.)

Keeping extremist content off of social platforms will always be a necessary game of whack-a-mole. As Kathleen Belew, a scholar of far-right extremism, wrote in a recent Times op-ed, the replacement theory that shooters such as Gendron espouse has moved from the fringes into the mainstream, and the “window for action is closing.” Yet, even if a platform manages to root out a hate group before something terrible happens, it can’t stop the members from regrouping elsewhere online. An instructive example comes from the online activity leading up to the January 6th riot on the Capitol. One long-running subreddit focussed on pro-Trump content, r/donaldtrump, was banned after the invasion, for hosting calls for violence. Users of the subreddit had committed “policy violations,” according to a Reddit statement. But another subreddit that had been previously banned, r/The_Donald, reëmerged before January 6th as a standalone Web site, where users were also active in planning the riot. When that site was shut down, too, on January 21, 2021, its users moved to a new domain. In the case of the first forum, Reddit’s crackdown was too little, too late; in the case of the second, the platform’s response was more timely but still ineffectual. Gendron’s constant “shitposting” on Discord was evidently central to his planning of mass murder; without that venue, it’s possible that he would not have gone through with his attack. Unfortunately, it’s equally possible that he would have simply brought his compulsive plotting someplace else.
