Facebook Removes 1.5 Million Videos Of New Zealand Shooting

In the aftermath of New Zealand’s deadly mosque shootings, Facebook removed more than 1.5 million videos worldwide of the massacre, which was live-streamed on its platform.

In an update released Saturday, company spokesperson Mia Garlick said 1.2 million of those clips were blocked before they could be posted.

That means roughly 300,000 copies made it onto the platform before being taken down.

Reached for comment by HuffPost, Facebook did not say exactly how long those videos were on the platform or how many times they may have been shared.

HuffPost learned that the New Zealand government told news outlets and online platforms on Sunday that sharing any of the footage, whether edited or not, violates local law.

Initially, Facebook allowed videos and images of the event that didn’t show the violence if they were being used for news purposes or to condemn the attacks. But Garlick said all versions of the video would be taken down, even those that do not include graphic imagery.

In a Friday statement to HuffPost hours after the violence unfolded, Garlick said Facebook’s teams were “working around the clock to respond to reports and block content, proactively identify content which violates our standards and to support first responders and law enforcement.”

Garlick added that all of the flagged videos would be added to a database allowing the platform to detect and remove any re-uploaded copies automatically.
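Facebook has not published how that matching works. As a loose illustration only, the general idea of checking uploads against a database of flagged content can be sketched as a fingerprint block list; every name below is hypothetical, and real systems rely on perceptual audio and video fingerprints that survive re-encoding, not the exact file hashes used here for simplicity.

    import hashlib

    # Fingerprints of videos already flagged for removal. A production
    # system would store perceptual fingerprints so that re-encoded or
    # trimmed copies still match; an exact hash is used here only to
    # keep the sketch self-contained.
    blocked_fingerprints: set[str] = set()

    def fingerprint(video_bytes: bytes) -> str:
        """Exact-match fingerprint: SHA-256 digest of the file contents."""
        return hashlib.sha256(video_bytes).hexdigest()

    def flag_video(video_bytes: bytes) -> None:
        """Record a flagged video so future uploads of it can be caught."""
        blocked_fingerprints.add(fingerprint(video_bytes))

    def should_block_upload(video_bytes: bytes) -> bool:
        """Check an incoming upload against the database before it posts."""
        return fingerprint(video_bytes) in blocked_fingerprints

An exact hash catches only byte-identical copies; a single re-encode or trim defeats it, which is one reason edited variants of such footage can keep resurfacing even after the original is blocked.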

Facebook isn’t alone among tech companies in struggling to root out content related to the attacks. YouTube has also launched an effort to stop the spread of the footage, and Twitter has suspended the alleged gunman’s account on which he appears to have shared a 74-page manifesto detailing his hatred of Muslim immigrants.

The shootings at two separate Christchurch mosques killed at least 50 people and injured dozens more.


