A Facebook vice president said that fewer than 200 people saw the Christchurch massacre while it was being streamed live on the site. The video was then viewed about 4,000 times before Facebook removed it, he added. In the hours that followed, countless more people saw it, as copies of the video spread faster than online platforms like Facebook could take them down.
Social media and video-sharing sites have been criticized for their slow response to the video, which showed the mass shooting from the gunman's first-person perspective; the camera appeared to be mounted on the killer's helmet. But executives at the sites say they did everything they could to stop the spread of a video that seemed designed for an age of virality.
New Zealand Prime Minister Jacinda Ardern says she has been in contact with Facebook COO Sheryl Sandberg to make sure the video is completely removed from the platform.
And websites that continue to host footage of the attack, such as 4chan and LiveLeak, are being blocked by the country's major Internet providers. "We've started temporarily blocking a number of sites that are hosting footage of Friday's terrorist attack in Christchurch," Telstra said on Twitter. "We understand this may inconvenience some legitimate users of these sites, but these are extreme circumstances and we believe this is the right thing to do."
Facebook says the first user report of the video came 12 minutes after the 17-minute livestream ended. By the time Facebook was able to remove it, the video had been viewed around 4,000 times on the platform, according to Chris Sonderby, the company's vice president and deputy general counsel.
But before Facebook could remove the video, at least one person uploaded a copy to a file-sharing site, and a link was posted to 8chan, a refuge for right-wing extremists. Journalist Robert Evans told NPR's Melissa Block that 8chan is essentially "the darkest, wettest corner of the Internet," a neo-Nazi gathering place whose main purpose is to radicalize visitors and encourage further acts of far-right violence. Once the video was in the wild, Facebook had to contend with other users trying to re-upload it to the platform. Facebook's systems automatically recognized and removed shares that were "visually similar" to the blocked video, Sonderby said. Some variations of the video, such as screen recordings, required additional detection systems, including ones that identify matching audio.
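Detecting "visually similar" copies at upload time is commonly done with perceptual hashing, which reduces an image or video frame to a short fingerprint that changes little when the content is re-encoded, resized, or brightened. Below is a minimal sketch of one simple technique, the "average hash"; this is an illustration of the general idea, not Facebook's actual system, and the frames here are synthetic toy data.

```python
# Minimal "average hash" (aHash) sketch: fingerprint an 8x8 grayscale
# frame as 64 bits, one bit per pixel (brighter than average or not).
# Near-duplicates produce hashes with a small Hamming distance.
# This illustrates the general technique only, not any platform's system.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # Set bit i if pixel i is brighter than the frame's average brightness.
    return sum(1 << i for i, p in enumerate(flat) if p > avg)

def hamming_distance(h1, h2):
    """Number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

# Synthetic original frame: a smooth brightness gradient.
frame = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
# A re-uploaded "copy": the same frame, uniformly brightened (as a
# re-encode or screen recording might be).
copy = [[min(255, p + 10) for p in row] for row in frame]
# An unrelated frame with a different brightness pattern.
other = [[(r * c) % 256 for c in range(8)] for r in range(8)]

print(hamming_distance(average_hash(frame), average_hash(copy)))   # small
print(hamming_distance(average_hash(frame), average_hash(other)))  # large
```

A real system would hash many sampled frames per video at higher resolution (and, as Sonderby noted, fall back to audio fingerprints when the visuals are altered), then flag uploads whose fingerprints fall within a distance threshold of a blocked video.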
Facebook says more than 1.2 million copies of the video were blocked during upload "and were therefore prevented from being seen on our services." In the first 24 hours, Facebook removed another 300,000 copies of the video worldwide, it said. Another way to look at those numbers, TechCrunch reports, is that Facebook failed to block 20 percent of the copies as they were being uploaded.
Other video-sharing sites also had to deal with an enormous influx of uploads. "The rate at which the content was copied and then re-uploaded to our platform was unprecedented in nature," Neal Mohan, YouTube's chief product officer, told NPR's Ailsa Chang. In the first hours, YouTube saw as many as one upload per second, he said. (Note: YouTube is one of NPR's financial sponsors.)
The New Zealand mosque shooting was the most disturbing video I've ever seen. I saw little kids shot in the head. I am a Muslim who lives in New Zealand. It might have been my little brother. But I will not blame Australia for the actions of an Australian, do you understand me?
– عمر (@OmarImranTweets) March 15, 2019
The difficulty of blocking or deleting the video was compounded by the fact that it circulated in many different forms, Mohan said. "We not only had to deal with the original video and its copies, but also with all the permutations and combinations," tens of thousands of them, he said.
The first-person point of view presented an unusual technical challenge: YouTube's systems had not been trained to recognize videos from this perspective, Mohan told NPR. "Our algorithms literally have to learn as the incident happens, without the benefit of a wealth of training data to have learned from."
New Zealand companies say they are reconsidering whether they want to be associated with social media sites that cannot effectively moderate content. "The events in Christchurch raise the question of whether site owners can target consumers with advertising in a matter of microseconds, so why can't the same technology be used to prevent this kind of content from being livestreamed?" the Association of New Zealand Advertisers and the Commercial Communications Council said in a joint statement. "We urge Facebook and other platform owners to take immediate action to effectively moderate hate content before another tragedy can be broadcast online."
As online companies attempted to block the video with technology, the New Zealand government turned to more traditional methods to stop its distribution there. The country's chief censor has classified the video as "objectionable," making it illegal to share within the country. An 18-year-old, who was not involved in the attack, was charged after sharing the video and posting a photo of a mosque along with the phrase "target acquired."