The views expressed in this article do not represent the views of Santa Clara University.
Credit: Adem Ay | Unsplash
Section 1: What is Section 230?
Section 230 of the Communications Decency Act, passed in 1996, shields certain tech companies from liability for content posted to their platforms by users. Many consider Section 230 to be the most important law protecting online free speech. Without it, crushing liability would force companies like Meta, YouTube, and Canvas to severely restrict user-generated content through censorship or to forgo content moderation entirely.
The statute states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230(c)(1)). For example, in Anderson v. TikTok, the plaintiff sued over her daughter’s death, which resulted from participation in the “Blackout Challenge” that she claims was promoted by TikTok’s algorithm. TikTok and its algorithm distribute and promote user-generated content, a “quintessential publisher function,” and the court ultimately held that TikTok could not be treated as the publisher of that content. Although Section 230 might permit objectionable content, like the Blackout Challenge, it does not preclude tech companies from setting their own boundaries.
Section 230 also protects tech companies that choose to limit objectionable content, whether or not that content is constitutionally protected speech. This allows YouTube to remove terrorist videos and Twitter to suspend Donald Trump’s account without being liable to the users whose content and accounts are removed. Moreover, Section 230 does not allow users to post illegal content, but it will protect them if they merely forward an email or retweet content that is subsequently deemed unlawful.
How it’s currently being challenged
The Supreme Court has agreed to hear a case that could change the way companies rely on Section 230 of the CDA. The case, Gonzalez v. Google, LLC, arose from the death of Nohemi Gonzalez, who was killed during the 2015 Paris attacks while studying abroad, and has been appealed up to the Supreme Court. The suit the Gonzalez family brought against Google originally alleged that, by failing to remove content posted by ISIS on YouTube, one of Google’s properties, Google helped increase support for ISIS, making it indirectly responsible for the 2015 attacks.
The “traditional editorial functions” described in the appeal are an interactive computer service’s decisions, such as Google’s, to display information on its website or remove it. In their appeal, the Gonzalez family challenges whether the protections Section 230 grants to traditional editorial functions extend to recommended content pushed to viewers on platforms such as YouTube, Google, or Instagram. Gonzalez v. Google gives the Supreme Court a unique opportunity to determine whether companies like Google, Twitter, or other online media platforms can be held liable for content they recommend to users.
Implications on big tech if Section 230 is overturned
Section 230 has played a major role in shaping the Internet into what it is today, and overturning the statute would cause major changes in how we navigate the online world. Many of today’s most popular platforms rely on Section 230 to function, and they would likely have to shut down or drastically change how they operate in order to stay online.
Under Section 230, content-hosting platforms like YouTube, Instagram, and TikTok do not face liability for content posted on their sites; instead, the content creators are liable for their own content. This burden-shifting frustrates critics who want to repeal the law, because it makes it extremely difficult to go after the platform itself in the event of damages. Without Section 230’s liability shield, however, these platforms would have a much harder time functioning, and new platforms would face insurmountable obstacles to launching.
For existing platforms, repealing Section 230 would mean that the platforms themselves are responsible for all of the content they host. For context, about 700,000 hours of video are uploaded to YouTube every day, an impossibly large amount of content for even a company of YouTube’s size to sift through. If platforms like YouTube were made liable for the content they host, YouTube and other large social media companies would have to find a way to review all of that content before it is made public, to ensure that none of it could cause legal issues for them. Any attempt to do so would mean long delays before content goes live, and likely some form of limit on who can be a user.

To support content moderation that strict, platforms would likely have to become paid subscription services instead of the open platforms they are today. The beauty of our current social media platforms is that everyone can join as long as they have an internet connection; people can speak freely and connect with others in ways simply not offered outside the internet. Charging for access to these platforms would hurt the voices of vulnerable people who cannot afford subscription fees.
Additionally, repealing Section 230 would make it nearly impossible to launch new social media platforms. If Section 230 is repealed, the content moderation burden placed on platforms would be so immense that even tech giants like Google or Meta would struggle to keep up and would be forced to make major changes. Under that burden, new platforms barely stand a chance. Resources at startups are almost always slim, and a new social media platform in a post-230 world would simply not have the resources to build a compliant content moderation process; the massive hurdles to profitability would prevent new platforms from even forming. If critics dislike Section 230 because they dislike large tech companies, a world without it would only entrench those companies’ hold on the market, since they would be the only entities with the resources to stay afloat.
While Section 230 might not be a perfect solution, it fosters innovation, allows the Internet to be a place where people can share their knowledge and experiences, and still holds platforms liable for their own actions. Without it, the Internet would be a very different place.
Helpful explainer: The Verge: Why the most important internet law is being rewritten