Facebook said on Sunday that it had removed 1.5 million videos of the New Zealand attacks around the globe. It blocked 1.2 million of those videos during upload, meaning they were never seen by users. Facebook did not say how many people saw the remaining 300,000 videos.
The New Zealand police alerted Facebook to the livestream, and Facebook said the shooter's Facebook and Instagram accounts and the video were quickly removed. Facebook also said it was "removing any edited versions of the video that show no graphic content," as well as any praise or support for the shooting.
In response, Facebook says it has hired tens of thousands of human moderators and invested in artificial intelligence to help police its platforms.
"We continue to work around the clock to remove hurtful content using a combination of technology and people," said Mia Garlick, spokeswoman for Facebook New Zealand.
US Sen. Mark Warner, who sits on a committee that has questioned social media companies, said on Friday that Facebook was not the only company that should be held accountable.
"The rapid and widespread dissemination of this hateful content – streamed live on Facebook, uploaded to YouTube, and boosted on Reddit – shows how easily the largest platforms can still be misused. It's becoming increasingly clear that YouTube, in particular, has yet to grapple with the role it has played in facilitating radicalization and recruitment," Warner said in a statement sent to CNN.
Other companies also committed to monitoring their platforms more closely after the attacks.