Culture Feature

Complicity in the Age of Streaming

Is PewDiePie complicit in the New Zealand mosque shooting? Absolutely not.


Before murdering 49 people across two mosques in Christchurch, New Zealand, the unnamed shooter announced via live stream: "Remember lads, subscribe to PewDiePie."

In the aftermath of mass shootings, it's natural to want to assign blame. Of course, first and foremost people blame the shooter. But a man doesn't just massacre 49 people in a vacuum, so for most, blaming the individual isn't a good enough response. After all, the individual is a product of society. So which parts of society influenced the shooter's actions? In what corners did his hatred grow? Sure, he did the shooting, but who else is complicit?

In this particular shooting, we have some leads. First, the shooter posted a 73-page manifesto full of racist, white nationalist, and anti-Muslim rhetoric on 8chan. This offers insight into his mindset and the potential sources of his radicalization. Second, the shooter streamed 17 minutes of the massacre on Facebook, footage that then spread across Reddit, Twitter, and various -chans. This suggests he planned around social media: would he have pursued the attack at all if he didn't believe his message would receive adequate reach? And finally, there's his mention of PewDiePie. Let's start there.

Regardless of your personal opinions on PewDiePie, he bears absolutely none of the responsibility for this shooting. To suggest otherwise is ludicrous. Regardless of any prior controversies over racist jokes, regardless of any suggestions that PewDiePie harbors alt-right sentiments, his brand does not revolve around sowing racial discord. Whatever his personal views may be, he has never once publicly spread anti-Muslim rhetoric. That's not to say his more controversial jokes and statements shouldn't be taken seriously, but rather that even the worst of them couldn't possibly make PewDiePie responsible for one of his 89+ million subscribers going on a rampage.

This is also decidedly different from an agitator like Alex Jones being considered complicit (at least in the public eye) when one of his listeners invades a pizza parlor or harasses the parent of a school shooting victim. The difference, of course, is a solid link. When an Alex Jones listener targets the parent of a school shooting victim over lies Alex Jones publicly spread about that same parent, the link is clear. PewDiePie has never suggested that Muslims should be targeted in any way, shape, or form. Regardless of this shooter's name-drop, there is no such link.

If anything, the shooter's name-drop of PewDiePie was most likely an attempt to distract, bait, and sow further social discord. The shooter's manifesto is not worth summarizing. The content of his ideas is not worth discussing. His name is not worth mentioning. Don't let him accomplish his goals by turning against another person who was not involved in any way. You may not like PewDiePie, but this is not on him.

But if PewDiePie isn't complicit, who is? 4chan? 8chan? Facebook? Reddit? Twitter? The answer here is murkier.

Social media allows anyone and everyone to share ideas and content on an unprecedented scale. This is a great boon in many ways, giving people the capacity to connect and share interests, no matter how niche, to an extent that would have been impossible at any other point in history.

At the same time, social media, especially the kind benefitting from anonymity, allows hatred to spread at an alarming rate. People with sick ideas nurture and spread that hatred to others, using the targets of their rage as scapegoats for problems both real and imagined.

This hatred breeds and replicates, largely unchecked, in corners of Facebook, Twitter, Reddit, and the various -chans. A free marketplace of ideas means that a lot of those ideas are poison, and all too often, ingesting poison results in death. So does that make social media complicit? Probably.

One potential solution is shutting down the spaces where hate speech breeds. There's precedent here, too: after Reddit banned a large number of hate subs in 2017, hate speech decreased across the platform as a whole. That being said, banning these communities from congregating in one specific place doesn't necessarily mean they won't find each other somewhere else. Short of coordinated censorship across all major social media platforms, banning hate speech entirely would be impossible. Even then, what's to stop similar sites with like-minded admins from rising up as alternatives to the mainstream platforms? And is censorship the road we want to go down, even when it comes to detestable speech and ideas? After all, who decides what is and isn't right?

Then again, good ideas spread through social media too – ideas about acceptance and tolerance, exposure to new ways of thinking that unite people instead of dividing them. But is that enough? Are terrorist attacks like this one inevitable in an age when white supremacy, racism, and racial tension seem to run rampant? I don't know.


Dan Kahan is a writer & screenwriter from Brooklyn, usually rocking a man bun. Find more at dankahanwriter.com


