In fact, the entire attack seemed orchestrated for the social media era. Before it took place, a post on 8chan – an anonymous and largely lawless forum that often hosts racist and extremist content – seemed to anticipate the horror. The post linked to an 87-page manifesto filled with anti-immigrant and anti-Muslim ideas, and directed users to a Facebook page hosting the live stream. Posts on Twitter also seemed to announce the attack.
The attacks took place in the seemingly unlikely setting of Christchurch, New Zealand, a city still recovering from a devastating 2011 earthquake that damaged thousands of buildings. The city's population declined sharply after the quake, and the loss was largely offset by migrants, many of whom were recruited to help rebuild the city. New Zealand Prime Minister Jacinda Ardern said after the shootings that "many of those who are directly affected" were likely to be migrants or refugees.
But this attack was about far more than migration into Christchurch. It was about the rise of white supremacy online, and the power of social media in spreading that message.
An Internet-driven hatred
At first glance, the shooter's "manifesto" seems to recall earlier white-nationalist killers such as Anders Breivik, the terrorist who killed 77 people in Norway in 2011.
However, this document is distinguished by its sarcastic language, deliberate red herrings, and allusions to online meme culture, indicating an internet-driven evolution of nationalist hatred.
"But this manifesto is itself a trap for journalists searching for the meaning behind this terrible crime," Evans adds. "There is truth in it, and valuable clues to the shooter's radicalization, but it is, for the most part and for lack of a better word, 'shitposting.'"
In other words, the whole thing could be described as a grand exercise in murderous trolling.
Take another example. Prior to the attack, the shooter urged his online viewers to subscribe to the PewDiePie YouTube channel, which has 89 million followers on the platform. PewDiePie, a Swedish creator whose real name is Felix Kjellberg, has in the past promoted alt-right topics and has been criticized for praising an anti-Semitic YouTube channel.
Whether intended or not, the mention deflects possible criticism, The Verge's Elizabeth Lopatto writes, while forcing Kjellberg to respond to the atrocities. If any of his 17 million followers had missed news of the shootings before his post, they were certainly aware of it afterward, she writes.
Lee Jarvis, co-editor of the journal Critical Studies on Terrorism, says the internet has given people who hold minority views a space to connect with like-minded others in a way that can normalize their worldview.
"There are fears that if you bring together even a small number of people who share the same ideas, those ideas will feel more legitimate and more widespread than they actually are," says Jarvis.
The fact that the document is riddled with in-jokes, references, and internet memes underscores that many white supremacists are radicalized as they socialize with one another online.
The manifesto also sarcastically credits video games such as Spyro the Dragon and Fortnite with unleashing the attacker's extremism – seemingly mocking the popular perception that violent gaming culture has a radicalizing effect.
"I'm skeptical that video games play a direct role in terrorist attacks," says Jarvis. "But the popular culture that everyone consumes shapes how they go about their daily lives."
Gaming culture was certainly present in the conduct and staging of Friday's murders – the footage, filmed from the gunman's perspective, was visually reminiscent of first-person shooter video games.
An instrument for terrorists
Social media has been increasingly co-opted by terrorists in recent years. In 2013, al-Shabaab fighters live-tweeted their attack on the Westgate shopping center in Nairobi, Kenya. By posting updates as the fighters opened fire on shoppers, they seized control of the story reaching the media and viewers.
"Terrorism is political violence, so terrorists have always had to seek publicity to effect political change," says Adam Hadley, director of Tech Against Terrorism, a UN-mandated group that helps the global technology industry combat terrorist exploitation of its technologies.
"They want an audience – they'll always go where the biggest audience is, which could be traditional media, or it could be big social media platforms."
After Friday's attack, Mia Garlick, a spokeswoman for Facebook New Zealand, said videos showing the Christchurch shootings were being removed from the platform.
"The New Zealand police quickly alerted us to a video on Facebook shortly after the livestream began, and we quickly removed both the shooter's Facebook and Instagram accounts and the video," the spokeswoman said.
But hours after the attacks, CNN could still find videos on social media platforms including Twitter.
Tom Chen, a professor of cybersecurity at City, University of London, notes that the European Commission has urged social media companies to take down terrorist propaganda within an hour of upload, with the possibility of future fines for non-compliance, "because most of the sharing occurs within the first two hours after a new video is uploaded," he adds.
But software that automatically removes such material has its limits. "If the terrorist video looks like a video game, it would be very hard for an automated classifier to tell the difference between this terrorist video and a video game," he says.
For others, the idea of deploying or expanding such technologies raises concerns about violations of our freedoms.
"This has already come up in debates about the live-streaming of suicides," says Jarvis. "On the one hand, companies have a responsibility for how people use their technology; the flip side is censorship, and who is being monitored and how."
Technologies such as cars can also be used by people to harm others, Jarvis adds, but laws have been introduced to promote their safe use. "It comes down to how much risk we are willing to live with."