YouTube's algorithm is recommending violent gun videos to 9-year-olds, study finds

The nonprofit Tech Transparency Project created YouTube accounts mimicking the behavior of young boys with an interest in first-person shooter games
By Sophia Khatsenkova with AP

New research shows that YouTube's powerful algorithms can flood young users with disturbing content, including depictions of school shootings.


Could there be a link between gun violence and YouTube video recommendations? That’s what researchers at the Tech Transparency Project, an NGO based in the United States, wanted to find out.

They created separate YouTube accounts posing as typical 9-year-old and 14-year-old boys living in the US who love playing video games.

All of the accounts watched only video game content, but some clicked on the videos recommended by YouTube’s algorithm, while the other profiles ignored these suggestions.

The study found that the accounts that clicked on the platform’s recommended videos were quickly flooded with violent content about school shootings and how to handle firearms – approximately 12 such videos per day over the course of one month.

The accounts that ignored the suggestions still received some violent videos, but 10 times fewer than the others.

For example, one video pushed to a minor account showed a young girl holding and firing a pistol.

Another video depicted a person firing a gun at a dummy head filled with lifelike blood and organs.

What’s concerning is that these videos are reaching wider audiences. The Tech Transparency Project tweeted about a comment posted under a clip of a movie scene depicting a school shooting that had been viewed nearly a million times.

One user commented that this scene is "what motivates him every day".

For Katie Paul, director of the Tech Transparency Project, not only do these videos violate YouTube’s policies, but they could also send vulnerable children down a dangerous path.

"It's just another example of the failures at quality control on a platform that has repeatedly told US Congress, the public, and parents that it's a safe platform and that their algorithms do not recommend increasingly extreme content. Our experiment showed that in fact, they do recommend it create increasingly extreme content," she told Euronews. 

'It's YouTube that is sending people to that gun shop'

It’s not the first time YouTube and other social media platforms have come under fire for promoting dangerous content.

Perpetrators of several mass shootings have used YouTube to find violent content, and some have even live-streamed their attacks.

For many years, video games have been the boogeyman for politicians when there is a school shooting. But you can play a first-person shooting game like Call of Duty and never end up in a gun shop. It's YouTube that is sending people to that gun shop.
Katie Paul
Director, Tech Transparency Project

In posts on the platform, the shooter behind the school attack that killed 17 people in Parkland, Florida in 2018 wrote violent comments such as "I’m going to be a professional school shooter".

The gunman who killed eight people earlier this month at a Dallas-area shopping centre also had a YouTube account that included videos about assembling rifles and a clip from a school shooting scene in a television show.

But Paul insists the issue is not video games but rather social media platforms and their lack of content moderation.

"For many years, video games have been the boogeyman for politicians when there is a school shooting. But you can play a first-person shooting game like Call of Duty and never end up in a gun shop. It's YouTube that is sending people to that gun shop," she explained.

YouTube had already removed some of the videos identified by the study, but in other instances, the content remains available.


We reached out to YouTube for comment, but the company did not respond by the time this article was published.

In a statement sent to The Guardian, a YouTube spokesperson said: "In reviewing this report’s methodology, it’s difficult for us to draw strong conclusions. 

"For example, the study doesn’t provide context of how many overall videos were recommended to the test accounts, and also doesn’t give insight into how the test accounts were set up, including whether YouTube’s Supervised Experiences tools were applied".

Paul hit back at the criticism. "It's clearly stated in our report that the nine-year-old account was set up using a parent account," she said.

"Regardless of YouTube's parent tool, the company provides no way for you to turn off the algorithm if parents don't want outside content recommended to their children".


For more on this story, watch our report from The Cube in the media player above.

