Why Social Media Couldn’t Stop The New Zealand Terror Attack Video From Going Viral

The gunman in the attacks on two New Zealand mosques, which left 49 people dead, reportedly live-streamed video of the shooting for almost 17 minutes.

According to the Herald, which viewed the video before it was removed by Facebook, the 28-year-old man livestreamed the attack via a helmet camera as he fired into the Al Noor Mosque in Christchurch, New Zealand. "Police alerted us to a video on Facebook shortly after the live-stream commenced, and we quickly removed both the shooter's Facebook and Instagram accounts and the video", the company said on its Twitter account.

"Shocking, violent and graphic content has no place on our platforms, and is removed as soon as we become aware of it", said YouTube.

In a statement, Mia Garlick, spokeswoman for Facebook New Zealand, said that the company continues to "work around the clock to remove violating content from our site, using a combination of technology and people".

The platform has also proactively removed edited versions of the video that do not show graphic content, in response to the concerns of the local authorities and out of respect for the people affected by the tragedy.

Mr Collins added that it was "very distressing" that the attack was live streamed on social media, and that "footage was available hours later".

A man and woman were also arrested in the hours following the attack, but the woman was released without charges.


The director of the national Islamophobia monitoring service, Iman Atta of Tell MAMA (Measuring Anti-Muslim Attacks), condemned the attack, saying: "We are appalled to hear about the mass casualties in New Zealand".

Facebook Livestream, which the shooter appeared to use, is an "extremely hard hole to plug", said Amanullah. "Even people who are horrified are curious." "We have said time and time again that far-right extremism is a growing problem, and we have been citing this for over six years now." In 2017, Facebook said it would hire 3,000 people to review videos and other posts, on top of the 4,500 people it already tasks with identifying criminal and other questionable material for removal.

US Senator Mark Warner, who sits on a committee that has questioned the social media companies, said on Friday that it wasn't just Facebook that needed to be held accountable.

The attacks have prompted social media sites to react to such content: Facebook, Twitter, and YouTube have been working to remove videos.

"These platforms are now racing to stamp out the sharing of media content for which they bear at least some responsibility", Burgess said.

We understand that Facebook cannot remove every single video, but its latest proclamations paint a very skewed picture of how things are actually unfolding on its website.

"If I wanted to find that video now I probably could", Gilbertson said.
