The family of an American killed in the 2015 Paris terror attacks are taking YouTube to court for its role in the tragedy.
Nohemi Gonzalez was a 23-year-old student on a semester abroad in France during the November 2015 Paris attacks that targeted stadiums, restaurants and the Bataclan theatre. Of the 130 dead, Gonzalez was the only American victim.
Now, Gonzalez’s family have brought a case against YouTube, and its parent company Google, to the Supreme Court of the United States (SCOTUS). The Gonzalez family argue that YouTube played a role in the death of their daughter by allowing ISIS to post recruitment videos on the website and by recommending those videos to users.
Currently, a US law known as Section 230 shields websites from legal liability for the content they host. Under Section 230, websites like YouTube are not treated the same as publishers such as newspapers: the sheer scale of uploaded content would make it too great an undertaking to hold website hosts legally liable for that content’s messages.
The SCOTUS is the highest court in the country, and its rulings bind courts across all US states. With “Gonzalez v. Google” reaching the SCOTUS, the way the country permits social media platforms to forgo liability over content could change dramatically.
Changing Section 230
A lower court previously recognised that Section 230 “shelters more activity than Congress envisioned it would.” However, that court - The US Court of Appeals for the 9th Circuit - believed the job of clarifying Section 230 was for Congress and not the courts.
The argument the Gonzalez family wants the SCOTUS to consider is that, although YouTube may not be liable for content uploaded to it, YouTube should still be liable for the way it recommends content to people. When YouTube provides unsolicited video recommendations, they argue, it is acting like a traditional publisher again.
Section 230 has long been a controversial law in the US. Conspiracy theories, disinformation, and material harmful to children have often spread unimpeded due to the lack of legal liability for social media platforms.
By focusing not on the content uploaded to YouTube but on the way YouTube recommends videos, the Gonzalez case may find a way to force social media platforms to regulate their content recommendation algorithms more closely.
The Biden administration has filed a submission to the SCOTUS agreeing with the Gonzalez family, stating that a recommendation “implicitly tells the user that she ‘will be interested in’” the suggested video — a statement made by YouTube as publisher, not a statement within the independently uploaded content.
YouTube has expressed sympathy for the Gonzalez family but denied any role in the attacks, noting that 95% of the extremist videos it removes from the platform are detected automatically.
“Undercutting Section 230 would make it harder for websites to do this work,” YouTube spokesperson Ivy Choi told ABC News. “Websites would either over-filter any conceivably controversial materials and creators, or shut their eyes to objectionable content like scams, fraud, harassment and obscenity to avoid liability - making services far less useful, less open and less safe.”
Facebook insider turned Congress whistle-blower Frances Haugen also told ABC, “we have the tools, but all these things decrease usage. They make the companies a little less money.”
“So in a world where our business models are fueled by clicking on ads, there aren't independent market incentives for making products that help people be healthy and happy,” Haugen said.
A second antiterror case
The Gonzalez case isn’t the only one concerning social media platforms’ role in terror attacks that the SCOTUS will examine in its upcoming sessions.
Filed by the family of Nawras Alassaf, who was killed in a January 2017 ISIS attack in Istanbul, the second suit alleges that Twitter and other tech companies willfully did not act to keep ISIS content off their platforms.
“Twitter v. Taamneh” will be argued before the SCOTUS tomorrow (22 February) and will consider the Taamneh family’s claim that Twitter aided and abetted ISIS by not more actively removing its content from the platform.
Twitter has argued that such liability would be an overreach, as the company was not directly aware of the specific content the family accuses it of allowing on the platform.
As the Taamneh family’s suit is based on the Antiterrorism Act rather than Section 230, advocates for Google and YouTube have argued that if the SCOTUS finds Twitter not liable under the Antiterrorism Act, there will be no need to litigate over Section 230.