What Section 230 Changes Could Mean for the Internet

Brian Matthew

The Supreme Court is hearing two cases this week that could have a major impact on the future of the internet. Both cases involve Section 230, and the rulings may shift liability onto online platforms and their recommendation engines, forever changing social media and the internet as we know them.

What Is Section 230?

Section 230 of the Communications Decency Act is a United States federal law that provides immunity to online platforms for user-generated content. Specifically, Section 230 protects online platforms, including social media sites, from being treated as the publisher of content posted by their users. In practice, this means a platform generally cannot be held liable for defamatory, offensive, or otherwise harmful content that its users post.

In addition to protecting platforms from liability for user-generated content, Section 230 provides immunity for platforms' decisions to moderate content, including removing or restricting material deemed harmful or inappropriate. This "Good Samaritan" provision is intended to encourage online platforms to moderate content and remove harmful material without fear of being sued for doing so.

Section 230 has been credited with enabling the growth of the internet and social media by giving platforms the legal protection to host user content without being held responsible for it. Critics, however, argue that it lets platforms avoid responsibility for harmful content posted on their sites.

Google's Section 230 Supreme Court Case

[Image: Google logo stylized in modern art style. Photo by Brian Penny]

Gonzalez v. Google (18-16700) is a lawsuit brought by the family of a woman who was killed in the 2015 Paris terrorist attacks. The family alleges that Google's YouTube platform recommended ISIS videos used to recruit the terrorists who carried out the attack. The lawsuit argues that YouTube's use of recommendation algorithms makes the company liable for the harm caused by that content, and that Section 230's liability shield does not protect YouTube in this case.

The tech industry is closely watching the case, as a ruling against YouTube could make companies liable for harmful content recommended by their algorithms. This could have far-reaching consequences for social media, e-commerce, and job portals, among other sectors.

The case also gives the Supreme Court its first direct look at Section 230, a law that has come under increasing scrutiny in recent years as social media platforms face criticism over the harmful content they host. The court's decision could reshape the online ecosystem and has the potential to affect free speech and innovation.

The case is also notable for the split it has exposed among Republicans: some lawmakers want Section 230 narrowed, arguing it gives large tech companies too much power over which speech is allowed on their platforms, while others warn that weakening the law would do more harm to free speech than good.

The Supreme Court is expected to rule on the Gonzalez case in tandem with another case, Twitter v. Taamneh, which asks whether social media platforms can be held liable for aiding and abetting terrorism by allowing the posting of ISIS recruitment content.

Twitter and Meta's Supreme Court Section 230 Case

[Image: Stylized Twitter blue bird logo breaking through a brick wall. Photo by Brian Penny]

Twitter v. Taamneh will be heard by the US Supreme Court on February 22, 2023. The plaintiffs, relatives of a victim of the 2017 Reina nightclub attack in Istanbul, allege that Google, Twitter, and Facebook played a critical role in ISIS's growth and provided "knowing and substantial assistance" to the terrorist group by failing to remove accounts and content associated with it from their platforms.

They also claim that Google shared advertising revenue with ISIS through its AdSense program. The case could determine whether online platforms can be held civilly liable for providing aid and support to terrorist organizations, with far-reaching implications for content moderation and revenue-sharing practices. It centers on secondary aiding-and-abetting liability under the Anti-Terrorism Act (ATA) and asks whether platforms can be held liable even if their services were not used in connection with a specific act of international terrorism.

Like Gonzalez, Twitter v. Taamneh could transform current content moderation practices, pushing companies toward overaggressive moderation that chills free speech. The case could also shape future legislation built on the ATA's secondary liability framework.

The court's decisions in these cases could have implications for other tech-related disputes, including challenges to laws in Texas and Florida that bar platforms from removing content based on users' viewpoints and from deplatforming political candidates.
