ITI applauds Supreme Court after it 'reached a unanimous decision not to impose liability for use of algorithms'

The Supreme Court has said it would "decline to address" Section 230 of the Communications Decency Act in the case of Gonzalez v. Google | Pixabay


The Supreme Court has said it would "decline to address" Section 230 of the Communications Decency Act in the case of Gonzalez v. Google, effectively protecting tech companies from being held responsible for posts on their platforms.

Industry groups like the Information Technology Industry Council (ITI) and the Computer and Communications Industry Association (CCIA) have applauded the Court's decision, saying it protects freedom of expression.

“Today’s U.S. Supreme Court decisions preserve long-standing legal protections for the use of online algorithmic processing systems and protect future innovations like AI that will bring enormous societal benefits," the council said in a release. "In our brief on behalf of the technology industry, ITI urged the Court to recognize that algorithmic tools organize the massive quantities of content on the internet and throughout the digital economy. We are pleased that the Court reached a unanimous decision not to impose liability for use of algorithms, as a contrary result would have caused disruption and harm to the U.S. economy, innovation and free expression.”

Gonzalez v. Google raised questions over liability for terrorist content that allegedly radicalized ISIS sympathizers to carry out acts of terror in Paris in 2015. The plaintiffs accused Google of violating the Anti-Terrorism Act by furthering ISIS' mission, arguing that Google should be held responsible for promoting ISIS messaging through targeted ads on its platform.

The plaintiffs also argued that YouTube provided "material support" to ISIS by permitting the group to post hundreds of videos, which were then boosted by YouTube's algorithm, allegedly helping to increase "the growth and activity of ISIS," according to the Bipartisan Policy Center.

Section 230(c)(1) states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Without that provision, social media companies could potentially face lawsuits over content posted on their platforms. Section 230(c)(2) allows platforms to moderate content in order to cultivate a safe online environment, according to the Bipartisan Policy Center.

The CCIA, a free speech advocacy group, also welcomed the decision in what it called a "landmark case."

"The Court correctly recognized the narrow posture of these cases and declined to rewrite a key tenet of U.S. Internet law, preserving free expression online and a thriving digital economy. No one wants to see extremist content on digital services — especially the services themselves, which are constantly vetting millions of pieces of content in real time to promote trust and safety and protect users, consistent with their terms of service.”
