The video continues to spread across the internet, illustrating how difficult it is to keep graphically violent images away from the public.
YouTube and other social media sites are working to remove the video apparently recorded by the shooter who killed at least 49 people in a mosque in Christchurch, New Zealand, but they have not been able to keep up.
The video was originally livestreamed on Facebook, which released a statement in the hours after the shooting detailing the company's plans to limit its spread.
"New Zealand Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter's Facebook and Instagram accounts and the video," Mia Garlick, Facebook's director of policy in Australia and New Zealand, said in an emailed statement. "We're also removing any praise or support for the crime and the shooter or shooters as soon as we're aware. We will continue working directly with New Zealand Police as their response and investigation continues."
Facebook was not able to remove the video before viewers had captured copies of it. The livestream was taken down after about 20 minutes, according to timestamped archives of the Facebook page seen by NBC News. Facebook removed the profile associated with the livestream about an hour and a half after the video first started streaming. The video then began to spread around the internet, including on YouTube and Twitter.
YouTube tweeted early Friday that the company was "working vigilantly to remove any violent footage."
A series of searches on YouTube conducted by NBC News on Friday morning based on keywords related to the shooting turned up more than a dozen versions of the video that included graphic violence uploaded in under an hour, according to details publicly available on the platform. Many links turned up pages where the video had been taken down with the message: "This video has been removed for violating YouTube's policy on violent or graphic content."
Facebook's Watch platform also hosted copies of the video, according to a basic keyword search. One graphic video clip had been available for roughly nine hours, according to the Facebook post's details.
The spread of the videos, particularly on YouTube, drew criticism on social media from users who said the company was not taking down the videos quickly enough, as well as concerns that pieces of the video could end up interspersed in other videos targeted at young people.
Tom Watson, a member of British Parliament and deputy leader of the Labour Party, tweeted that YouTube should stop user uploads until it can contain the video.
"If YouTube don't have the capability to halt the spread of the NZ massacre videos — because they are going up faster than they can take them down — then they should suspend all new uploads at this time," Watson wrote.
A YouTube spokesperson said that the company is removing videos as they are found.
Similar searches on Twitter also turned up versions of the video. Some Twitter users reported seeing the video through the platform's autoplay feature, which begins playing videos without user interaction. Others pleaded with people not to spread the video.
A spokesperson for Twitter said the company has rigorous processes and a dedicated team in place for managing exigent and emergency situations such as this. "We also cooperate with law enforcement to facilitate their investigations as required," the spokesperson said.
The video's creation on Facebook and its spread across Twitter and YouTube come as those companies remain under pressure to better moderate their platforms and quickly remove a wider range of content. Facebook and YouTube in particular have said they are now investing heavily in automated moderation systems and human intervention to deal with the massive amount of content uploaded to their platforms every day.
YouTube's efforts appeared to have fallen short with respect to the shooter's video.
"I can confirm that there are at least five videos of the NZ massacre that have been posted on @YouTube within the last 2 hours. These are searchable under the most obvious keywords, which means that the company is not even screening obvious videos," tweeted Jennifer Grygiel, an associate professor of communications at Syracuse University.
YouTube did not immediately respond to an inquiry related to Grygiel's findings.
Others encouraged parents to take steps to make sure their children did not accidentally see the video or come across it spliced into other videos.
"Parents might consider a temporary draconian moratorium of YouTube at home given the elevated risk of NZ massacre content being maliciously spliced into young kids content and otherwise recommended to teens/pre-teens," tweeted David Carroll, an associate professor of media design at the New School who studies the spread of misinformation online.