Amid backlash, Facebook removes 1.5M videos of New Zealand terror attack within 24 hours
In the aftermath of the horrific Christchurch shooting in New Zealand, social networking giant Facebook has removed millions of graphic videos and other violating content related to the terror attack from its platform. Facebook took the action within 24 hours of the attack, explaining that it is taking active measures to block such content right at upload.
“In the first 24 hours, we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload,” the US-based tech giant announced via a tweet late Saturday.
Facebook added that it has been working "around the clock" to remove such violating content using "a combination of technology and people". The platform has also proactively removed edited versions of the video that do not show graphic content, keeping in mind the concerns of the local authorities and out of respect for the people affected by the tragedy.
This comes after a hate-filled terror attack targeted two mosques in the New Zealand city of Christchurch. As of Sunday, the death toll stood at 50, with another 50 reported wounded, according to CNN. Media reports said that the gunman who attacked the mosques – identified as 28-year-old Brenton Harrison Tarrant – was to be charged with murder.
The accused live-streamed the terror incident on Facebook using a head-mounted camera. Since then, social media platforms including Facebook, Twitter, and YouTube have been struggling to curb the spread of the violent footage. Even though they have taken down much of this content, TechCrunch reports that Facebook was unable to block 20 percent of the shooting videos at upload.
On Sunday, New Zealand Prime Minister Jacinda Ardern addressed the issue, saying that social media giants would have to face "further questions" regarding the live streaming of the incident.