Social Platforms Have to Do More to Help Prevent Mass Shootings

Mass shooters want an audience. Social media is giving them just that.


Nineteen elementary school children and two adults were killed in a mass shooting in Uvalde, Texas, on Tuesday afternoon. It was the 212th mass shooting and the 27th school shooting of 2022 thus far.

In the wake of Tuesday’s mass shooting in Uvalde, and the mass shooting in Buffalo earlier this month, we must reckon with another factor that’s emboldening mass shooters to act once they get access to guns: the violent “do it for the Vine” culture of far-right chatrooms. Chat spaces on Twitch, Discord, Reddit, and 4chan are often overrun with teenage boys who, in many cases, encourage each other to act on their worst, most horrifying impulses for an audience.

To be unequivocally clear, mass shootings are caused by guns—guns that can end dozens of lives in a matter of seconds; guns that are far too accessible thanks to lawmakers who are bought and sold by the NRA, who serve in their offices as literal profiteers of death. Mass shootings are the unique product of a country that values the right to own a mass killing machine as a toy above human life. Nonetheless, online radicalization and the allure of streaming to perform violence for all the world to see are playing a growing role in pushing mass shooters to act.

Tuesday’s shooting was carried out by an 18-year-old local high school student named Salvador Ramos. It’s not yet clear what his motive was or if there was one. What is disturbingly clear is that he carried out the shooting at least in part because he wanted to post online and message people about it.

The shooting in Buffalo was also carried out by an 18-year-old gunman. He killed 10 people in a grocery store in a Black neighborhood. In addition to attempting to stream the rampage on Twitch, the shooter also dumped a racist manifesto espousing the white supremacist “replacement theory” on Discord shortly before the attack. The manifesto and its inclusion of “white genocide theory,” The Guardian notes, is mostly plagiarized from other extremists on 4chan. New York Attorney General Letitia James announced on May 18 that she would investigate social media companies like Twitch, 4chan, 8chan, and Discord, all of which the gunman allegedly used to plan and stream the attack.

Prior to the shooting, the Uvalde shooter shared an Instagram post that included a photo of a firearm magazine, as well as several Instagram stories featuring assault rifles he’d purchased on his 18th birthday. Even more ominously, he tagged a random female Instagram user in his photos featuring guns. On Tuesday morning, ahead of the shooting, he reportedly messaged her, “I’m about to.” When she asked “about to what,” he replied, “I’ll tell you before 11.” He then urged her to respond, and told her, “I got a lil secret I wanna tell u.” His last message to her read, “Ima air out.” Not three hours later, he opened fire on Robb Elementary School in Uvalde.

In 2014, 22-year-old Elliot Rodger killed seven on a shooting rampage in Isla Vista, California, shortly after distributing his online manifesto detailing his desire to punish all women for sexually rejecting him. In 2018, radicalized by Rodger’s manifesto, a then-25-year-old Toronto man drove into a crowd and killed 11. That man did so not long after posting his own incel manifesto to Facebook.

Today, with 21 dead including 19 children, it would be extraordinarily unhelpful to scold about how the internet and social media are bad, or how mass shooters do what they do because they crave notoriety. We’ve known this. But the internet and, apparently, assault rifles clearly aren’t going anywhere. As long as they exist, we should call into question how platforms like Discord, Twitch, Reddit, and 4Chan have become breeding grounds for teenage boys to become indoctrinated into white supremacy and violent misogyny, and to boast about, amplify, and even livestream violent acts.

And we should certainly interrogate how misogyny and violence against women are seemingly precursors to nearly every mass shooting in recent history—two-thirds of mass shootings have been linked to domestic violence. Gunmen like the one in Uvalde—who first shot his grandmother before opening fire on an elementary school and reportedly frequently sent inappropriate messages to female co-workers—are often able to harm women without consequence, then access firearms and carry out massacres. Sgt. Erick Estrada of the Texas Department of Public Safety has since confirmed the Uvalde shooter’s grandmother was injured and airlifted to a hospital.

Shortly after the Buffalo shooting earlier this month, the racial justice organization Color of Change called on Twitch to conduct an internal audit of white supremacist, extremist content on its platform, in light of how the Buffalo shooter was briefly able to stream the shooting. “For months, Color Of Change members have warned Twitch that they have not taken their obligations to Black people seriously enough,” the group said. “Twitch needs to answer for its role as the entry point in an internet ecosystem of harm.” Color of Change also cited the Buffalo shooting specifically as evidence of how Twitch’s enabling of white supremacist ideation can translate into real-life violence.

Twitch, Discord, and other popular spaces that are used for video game chats primarily attract teenage boys and young men. They’re also overrun with some of the most vile, dangerous, racist, and misogynist commentary. Following these recent shootings, the very least platforms like Twitch and Discord could do is perform audits and release tangible policy changes regarding how they’ll address online extremism. Or state-level lawmakers, like James in New York, can wield their offices to investigate these platforms instead of doing what Texas Attorney General Ken Paxton is doing, which is investigating the families of trans kids.

There are no clear-cut solutions here. Violent video games and toxic corners of the internet aren’t killing people: Guns are killing people. But while young men who are radicalized by online platforms are carrying out mass killings, it’s terrifying that powerful people like Elon Musk are proudly calling for an end to social media content moderation—and are being lauded by the likes of both white nationalist, 19th Amendment opponent Nick Fuentes, and supposedly reasonable Twitter CEO Parag Agrawal.

Ultimately, thousands are dead in this country due to shootings. With or without Congress, it’s time for any and all other institutions—including big tech and social media companies—to act.
