Where is the line between debate and bullying on social media?
YouTube published a press statement on June 5 outlining their "ongoing work to tackle hate." But their blog post comes amid an outcry concerning how the platform has treated a dispute between a conservative pundit and a gay reporter, which has thrown into question the responsibilities of platforms to protect users while still allowing free speech.
Who is criticising YouTube and why?
Carlos Maza is a reporter for Vox, an American media company, who runs a series known as "Strikethrough" on YouTube, which aims to document the changing media landscape in the Trump era.
On May 30, Maza tweeted a thread detailing anti-gay remarks made by Steven Crowder — a prominent conservative comedian and host of "Louder with Crowder". The reporter shared a video highlighting a series of clips from the show in which Crowder calls him a "lispy sprite," "little queer," "gay Mexican guy" and "token Vox gay atheist sprite". In one video, Crowder claims that the Vox host is "being given a free pass as a crappy writer because [he is] gay."
In his now-viral Twitter thread, Maza said that each time "Louder with Crowder" featured a rebuttal of one of his videos, he would experience abuse across numerous social media platforms.
Maza told Euronews he first flagged Crowder's content to YouTube in late 2018, following what he said was a 'doxxing' incident. Doxxing occurs when personal information, such as a phone number or address, is leaked on the internet usually for malicious purposes.
The Vox reporter said that he was bombarded with hundreds of messages at the same time to "debate Steven Crowder". YouTube said none of Maza's personal information was ever revealed in the Crowder content it reviewed, and in a video published on June 5, Crowder said that he has never encouraged doxxing in any form.
Although his employer, Vox, approached YouTube, it "didn't do anything then," he told Euronews. Maza continued to flag, on his personal account, instances in which Crowder commented on his race or sexuality. "Finally I got frustrated and decided the only way to get anyone to pay attention was to edit it all together, put it into a video and publish it on the internet."
On June 3, Crowder said he was used to "media organisations trying to smear" him and later posted a "rebuttal" to Maza's Vox videos.
What did YouTube find?
In a thread posted on June 6, YouTube responded, saying that although they found the language "clearly hurtful, the videos as posted don’t violate our policies."
YouTube said: "As an open platform, it’s crucial for us to allow everyone–from creators to journalists to late-night TV hosts–to express their opinions w/in the scope of our policies. Opinions can be deeply offensive, but if they don’t violate our policies, they’ll remain on our site."
In all the videos flagged to YouTube, Crowder never instructed his viewers to harass Maza on YouTube or any other platform, according to the tech giant. The main aim of the videos was to respond to opinions expressed by Maza rather than to harass or threaten, according to YouTube.
Has YouTube created a demeaning culture on its platform?
"A business model that is driven by engagement has the capacity to derail policy made to protect users," Maza explained. "Abuse and harassment are cheap, frequent and highly engaging" and "triggers people's tribal responses," he added, drawing people to that kind of content.
Content that is "toxic" or "meant to demean a marginalised group performs incredibly well," Maza reiterated. "YouTube has decided that their entire business is built around rewarding content that performs well," he said.
Maza, who has been a prominent figure on YouTube for years, said that bullying behaviour has increased since the beginning of his time as a content creator. "It's not a platform that has a few bad apples. It's a platform that is now basically dominated by [them]," he said.
How can you balance free speech and what can be construed as bullying?
This isn't the first time YouTube has been under the spotlight in such a debate, and it isn't the only platform to have faced such an issue. An edited video of Democrat Nancy Pelosi, Speaker of the House of Representatives, circulated on both YouTube and Facebook last month; it had been slowed down to make the politician appear intoxicated and slurring her words.
Noting that the video was falsified, YouTube took down the content. Although Facebook fact-checkers identified the video as doctored, the company did not remove it from its platform, citing guidelines which state that content does not need to be true to remain on site. Pelosi heavily criticised Facebook for its decision not to take the video down.
What are YouTube's community guidelines?
YouTube has a set of guidelines which outline that "content or behaviour intended to maliciously harass, threaten or bully others is not allowed on YouTube." Among the guidelines, YouTube states that content that is "deliberately posted in order to humiliate someone" would conflict with its policy. Should policies be violated, the offending content is removed, and a channel that receives three strikes is terminated.
YouTube says that in the first quarter of 2019 it removed over 47,000 videos and over 10,000 accounts for violating its policies on cyber-bullying and harassment.
In their most recent press release published Tuesday, YouTube announced their plan to limit hate speech by specifically prohibiting videos that justify discrimination based on age, gender, race, caste, religion, sexual orientation or veteran status. The platform said that it is their responsibility to "prevent our platform from being used to incite hatred, harassment, discrimination and violence."
With policies on how to deal with sensitive content varying across social media platforms, curbing bullying behaviour while supporting freedom of expression has proved a tricky line to toe.
Euronews had reached out to Steven Crowder for comment at the time of publication.