Social media platforms are facing more pressure to effectively moderate content following Saturday's deadly shooting at a Buffalo supermarket.
In an attack live-streamed on the gaming website Twitch, a gunman filmed himself killing 10 people and injuring three others.
US police say they are investigating the incident as a “racially motivated violent extremist” shooting.
Twitch said it removed the original live-stream less than two minutes after the violence began, but versions of the video have continued to circulate elsewhere online.
New York Governor Kathy Hochul said social media companies must be more vigilant in monitoring what happens on their platforms, calling it inexcusable that the gunman's video had not been taken down "within a second".
The shooting and live-stream echo attacks in Christchurch, New Zealand, and Halle, Germany, in 2019 that were also filmed and shared online. Facebook was widely condemned for taking 17 minutes to remove the live-stream of a white supremacist who killed 51 people in two New Zealand mosques.
On Wednesday, the office of New York state's attorney general said it had launched an investigation into the role social media companies played in the mass shooting.
The inquiry will examine how platforms were "used to stream, promote, or plan the event", the attorney general's office said.
EU act to better moderate content in the works
In an interview with the Associated Press, executive vice-president of the European Commission Margrethe Vestager said the Buffalo shooting had highlighted the need for EU regulation.
“It’s really difficult to make sure that it’s completely waterproof, to make sure that this will never happen and that people will be closed down the second they would start a thing like that. Because there’s a lot of live-streaming which, of course, is 100% legitimate,” Vestager said.
“The platforms have done a lot to get to the root of this. They are not there yet,” she added. “But they keep working and we will keep working.”
The European Union is currently preparing a Digital Services Act that will force social media platforms to better moderate content.
In a statement, a spokesperson for Twitch said the company has a “zero-tolerance policy” against violence and was monitoring for any accounts that were rebroadcasting the Buffalo live-stream.
Meta, which owns Facebook and Instagram, said it had quickly designated the shooting as a “terrorist attack”, triggering an internal process to identify and remove the suspect’s account and video.
In April, Twitter also enacted a new policy to remove accounts maintained by “individual perpetrators of terrorist, violent extremist, or mass violent attacks.”
The platform later said it was also “removing videos and media related to the incident” in Buffalo and “may remove” tweets disseminating the shooter’s alleged manifesto.
“We believe the hateful and discriminatory views promoted in content produced by perpetrators are harmful for society and that their dissemination should be limited in order to prevent perpetrators from publicizing their message,” Twitter said in a statement.