Google Warns Supreme Court That Gutting Section 230 Would ‘Upend the Internet’

WASHINGTON, January 13, 2023 – The Supreme Court could trigger a cascade of internet-altering effects, encouraging both the proliferation of offensive speech and the suppression of legitimate speech and creating a “litigation minefield,” if it decides Google is liable for terrorist attacks carried out by entities that published content on its YouTube platform, the search engine company argued Thursday.

The high court will hear the case of an American family whose daughter, Nohemi Gonzalez, was killed in an ISIS terrorist attack in Paris in 2015. The family sued Google under the Anti-Terrorism Act for the death, alleging that YouTube acted as a publisher of ISIS recruitment videos by hosting them and recommending them through its algorithms on the video platform.

But in a brief to the court on Thursday, Google said it is not liable for content published by third parties on its website under Section 230 of the Communications Decency Act, and that deciding otherwise would effectively gut the platform protection provision and “upend the internet.”

Denying the provision’s protections for platforms “could have devastating spillover effects,” Google argued in the brief. “Websites like Google and Etsy depend on algorithms to sift through mountains of user-created content and display content likely relevant to each user. If plaintiffs could evade Section 230(c)(1) by targeting how websites sort content or trying to hold users liable for liking or sharing articles, the internet would devolve into a disorganized mess and a litigation minefield.”

It would also “perversely encourage both wide-ranging suppression of speech and the proliferation of more offensive speech,” it added in the brief. “Sites with the resources to take down objectionable content could become beholden to heckler’s vetoes, removing anything anyone found objectionable.

“Other sites, by contrast, could take the see-no-evil approach, disabling all filtering to avoid any inference of constructive knowledge of third-party content,” Google added. “Still other sites could vanish altogether.”

Google rejected the argument that recommendations by its algorithms convey an “implicit message,” arguing that in such a world, “any organized display [as algorithms do] of content ‘implicitly’ recommends that content and could be actionable.”

The Supreme Court is simultaneously hearing a similar case, Twitter v. Taamneh.

Scrutiny of Section 230 has loomed large since former President Donald Trump was banned from social media platforms for allegedly inciting the Capitol Hill riot in January 2021. Trump and other conservatives called for rules limiting that protection in light of the suspensions and bans, while Democrats have not shied away from introducing legislation that would limit the provision if certain content continued to flourish on those platforms.

Supreme Court Justice Clarence Thomas issued a statement early last year calling for a reexamination of tech platforms’ immunity protections, following a Texas Supreme Court decision holding that Facebook was shielded from liability in a trafficking case.

Meanwhile, startups and internet associations have argued for the preservation of the provision.

“These cases underscore how important it is that digital services have the resources and the legal certainty to deal with dangerous content online,” Matt Schruers, president of the Computer and Communications Industry Association, said in a statement when the Supreme Court decided in October to hear the Gonzalez case.

“Section 230 is critical to enabling the digital sector’s efforts to respond to extremist and violent rhetoric online,” he added, “and these cases illustrate why it is essential that those efforts continue.”
