A group that represents French Muslims is suing Facebook and YouTube for streaming video of the March 15 shootings at two mosques in New Zealand that left 50 people dead.
The French Council of the Muslim Faith (CFCM) said Monday that it had filed a formal complaint with prosecutors in Paris against the French offices of the two tech giants for “broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor,” according to a copy of the complaint obtained by AFP.
CFCM President Ahmet Ogras told CNN that the organization accuses Facebook of failing to take the video down quickly enough. The offenses cited in the complaint are punishable in France by up to three years in prison and a fine of 75,000 euros ($85,000).
“This [is] not admissible, Facebook must take their part of responsibility in this and must do everything to anticipate these livestreams, as much as [they do with] hate messages and Islamophobia on their networks,” he told CNN.
Facebook told HuffPost that it is “examining the complaint” and “cooperating with the authorities and our teams remain fully mobilized.” YouTube did not directly address the CFCM’s complaint, instead pointing to an earlier statement about the attack in which it said it had seen an “unprecedented” volume of attempts to post footage of the shooting and had removed tens of thousands of related videos.
Facebook previously said it “quickly” removed the original livestream of the massacre at the two mosques after New Zealand police alerted the social network to the gunman’s video, which was filmed in the style of a first-person-shooter video game. But according to AFP, the CFCM said the video stayed up for 29 minutes after the broadcast began.
By then, the 17-minute video had been copied and reshared across the internet, including on Facebook itself. The platform said it removed more than 1.5 million videos of the massacre in the first 24 hours after the livestream. Company spokeswoman Mia Garlick later said Facebook blocked 1.2 million of those at the point of upload, meaning roughly 300,000 copies made it onto the platform before being removed.
Tech giants like Facebook and YouTube have said they are working to fight the spread of violent and inappropriate content with artificial-intelligence systems and human moderators. Twitter suspended the alleged gunman’s account, on which he apparently shared a 74-page white supremacist manifesto. But many political leaders say those efforts are not enough.
New Zealand Prime Minister Jacinda Ardern said days after the shooting that tech companies have “a lot of work” to do to curb the rapid spread of content that depicts or promotes violence. British Labour Party Deputy Leader Tom Watson said that YouTube should suspend all new uploads if the platform can’t halt the spread of such videos.
Last week the U.S. House Committee on Homeland Security called on the tech giants’ top executives to explain how their platforms are working to stop the spread of violent terrorist content.
This article has been updated with comment from Facebook.