Post by the Scribe on Dec 3, 2023 20:52:28 GMT
Part of the Infodemic Problem
Section 230 is a section of Title 47 of the United States Code, enacted as part of the Communications Decency Act of 1996 (Title V of the Telecommunications Act of 1996). It generally provides immunity for online computer services with respect to third-party content generated by their users. At its core, Section 230(c)(1) provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
Section 230(c)(2) further provides "Good Samaritan" protection from civil liability for operators of interactive computer services in the good faith removal or moderation of third-party material they deem "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." (Source: en.wikipedia.org/wiki/Section_230)
Section 230 was developed in response to a pair of lawsuits against online discussion platforms in the early 1990s that resulted in different interpretations of whether the service providers should be treated as publishers or, alternatively, as distributors of content created by their users. Its authors, Representatives Christopher Cox and Ron Wyden, believed interactive computer services should be treated as distributors, not liable for the content they distributed, as a means to protect the growing Internet at the time.
Section 230 was enacted as part of the Communications Decency Act (CDA) of 1996 (a common name for Title V of the Telecommunications Act of 1996), formally codified as part of the Communications Act of 1934 at 47 U.S.C. § 230. After passage of the Telecommunications Act, the CDA was challenged in courts and was ruled by the Supreme Court in Reno v. American Civil Liberties Union (1997) to be unconstitutional, though Section 230 was determined to be severable from the rest of the legislation and remained in place. Since then, several legal challenges have validated the constitutionality of Section 230.
Section 230 protections are not limitless: providers must still remove material that is illegal under federal law, as in copyright infringement cases. In 2018, Section 230 was amended by the Stop Enabling Sex Traffickers Act (FOSTA-SESTA) to require the removal of material violating federal and state sex trafficking laws. In the years since, Section 230's protections have come under increasing scrutiny over hate speech and alleged ideological bias, reflecting the power that technology companies can hold over political discussion. The law became a major issue during the 2020 United States presidential election, especially with regard to alleged censorship of conservative viewpoints on social media.
Passed when Internet use was just starting to expand in both breadth of services and range of consumers in the United States,[2] Section 230 has frequently been referred to as a key law that allowed the Internet to develop.[3]
Post by the Scribe on Dec 3, 2023 20:55:23 GMT
Application and limits
Section 230 has two primary parts, both listed under §230(c) as the "Good Samaritan" portion of the law. Under §230(c)(1), as identified above, an information service provider shall not be treated as a "publisher or speaker" of information from another provider. Section 230(c)(2) provides immunity from civil liability for information service providers that remove or restrict content from their services that they deem "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected", as long as they act "in good faith" in doing so.
In analyzing the availability of the immunity offered by Section 230, courts generally apply a three-prong test. A defendant must satisfy each of the three prongs to gain the benefit of the immunity:[4]
1. The defendant must be a "provider or user" of an "interactive computer service".
2. The cause of action asserted by the plaintiff must treat the defendant as the "publisher or speaker" of the harmful information at issue.
3. The information must be "provided by another information content provider", i.e., the defendant must not be the "information content provider" of the harmful information at issue.
Section 230 immunity is not unlimited. The statute specifically excepts federal criminal liability (§230(e)(1)), electronic privacy violations (§230(e)(4)), and intellectual property claims (§230(e)(2)).[5] There is also no immunity from state laws that are consistent with Section 230 (§230(e)(3)), though inconsistent state criminal laws have been held preempted in cases such as Backpage.com, LLC v. McKenna[6] and Voicenet Communications, Inc. v. Corbett[7] (agreeing that "the plain language of the CDA provides ... immunity from inconsistent state criminal laws").

What constitutes "publishing" under the CDA is somewhat narrowly defined by the courts. The Ninth Circuit held that "Publication involves reviewing, editing, and deciding whether to publish or to withdraw from publication third-party content."[8] Thus, the CDA does not provide immunity with respect to content that an interactive service provider creates or develops entirely by itself.[9][10] CDA immunity also does not bar an action based on promissory estoppel.[11][12]

As of mid-2016, courts have issued conflicting decisions regarding the scope of the intellectual property exclusion set forth in §230(e)(2). For example, in Perfect 10, Inc. v. CCBill, LLC,[13] the 9th Circuit Court of Appeals ruled that the exception for intellectual property law applies only to federal intellectual property claims such as copyright infringement, trademark infringement, and patents, reversing a district court ruling that the exception applies to state-law right of publicity claims.[14] The 9th Circuit's decision in Perfect 10 conflicts with conclusions from other courts, including Doe v. Friendfinder. The Friendfinder court specifically discussed and rejected the lower court's reading of "intellectual property law" in CCBill and held that the immunity does not reach state right of publicity claims.[15]
Two bills passed since Section 230 have added further limits to its protections. Under the Digital Millennium Copyright Act of 1998, service providers must comply with additional requirements for copyright infringement to maintain safe harbor protections from liability, as defined in the DMCA's Title II, the Online Copyright Infringement Liability Limitation Act.[16] The Stop Enabling Sex Traffickers Act (the FOSTA-SESTA act) of 2018 eliminated the safe harbor for service providers with respect to federal and state sex trafficking laws.
Post by the Scribe on Dec 3, 2023 20:56:19 GMT
Background and passage
Prior to the Internet, case law was clear that a liability line was drawn between publishers of content and distributors of content; a publisher would be expected to have awareness of material it was publishing and thus should be held liable for any illegal content it published, while a distributor would likely not be aware and thus would be immune. This was established in the 1959 case, Smith v. California,[17] where the Supreme Court ruled that putting liability on the provider (a book store in this case) would have "a collateral effect of inhibiting the freedom of expression, by making the individual the more reluctant to exercise it."[18]
In the early 1990s, the Internet became more widely adopted and created means for users to engage in forums and other user-generated content. While this helped expand the use of the Internet, it also resulted in a number of legal cases putting service providers at fault for content generated by their users. This concern was raised in legal challenges against CompuServe and Prodigy, early service providers at the time.[19] CompuServe stated it would not attempt to regulate what users posted on its services, while Prodigy employed a team of moderators to validate content. Both companies faced legal challenges related to content posted by their users. In Cubby, Inc. v. CompuServe Inc., CompuServe was found not at fault: because it allowed all content to go unmoderated, it was a distributor and thus not liable for libelous content posted by users. In Stratton Oakmont, Inc. v. Prodigy Services Co., however, the court concluded that because Prodigy had taken an editorial role with regard to customer content, it was a publisher and was legally responsible for libel committed by its customers.[20]
[Image: Chris Cox (left) and Ron Wyden, the framers of Section 230]
Service providers made their Congresspersons aware of these cases, believing that if followed by other courts across the nation, the cases would stifle the growth of the Internet.[21] United States Representative Christopher Cox (R-CA) had read an article about the two cases and felt the decisions were backwards. "It struck me that if that rule was going to take hold then the internet would become the Wild West and nobody would have any incentive to keep the internet civil," Cox stated.[22]
At the time, Congress was preparing the Communications Decency Act (CDA), part of the omnibus Telecommunications Act of 1996, which was designed to make knowingly sending indecent or obscene material to minors a criminal offense. A version of the CDA, pushed by Senator J. James Exon (D-NE), had passed the Senate.[23] A grassroots effort in the tech industry tried to convince the House of Representatives to challenge Exon's bill. Based on the Stratton Oakmont decision, Congress recognized that requiring service providers to block indecent content would cause them to be treated as publishers in the context of the First Amendment, and thus liable for other content, such as libel, not set out in the existing CDA.[19] Cox and fellow Representative Ron Wyden (D-OR) wrote the House bill's Section 509, titled the Internet Freedom and Family Empowerment Act, designed to override the Stratton Oakmont decision so that a service provider could moderate content as necessary without having to act as a wholly neutral conduit. The new provision was added to the text of the proposed statute while the CDA was in conference within the House.
The overall Telecommunications Act, with both Exon's CDA and the Cox-Wyden provision, passed both houses by near-unanimous votes and was signed into law by President Bill Clinton in February 1996.[24] The Cox-Wyden provision became Section 509 of the Telecommunications Act of 1996 and became law as a new Section 230 of the Communications Act of 1934. The anti-indecency portion of the CDA was challenged immediately upon passage, resulting in the 1997 Supreme Court case Reno v. American Civil Liberties Union, which ruled all of the anti-indecency sections of the CDA unconstitutional but left Section 230, among other provisions of the Act, as law.[25]
Post by the Scribe on Dec 3, 2023 21:00:09 GMT
It is still intact because it is the least "bad" option among the alternatives.
KEY TAKEAWAYS (itif.org/publications/2021/02/22/overview-section-230-what-it-why-it-was-created-and-what-it-has-achieved/)
Section 230 includes two main provisions: one that protects online services and users from liability when they do not remove third-party content, and one that protects them from liability when they do remove content.
Section 230 is not a limitless legal shield. Online services are still liable for violating federal criminal or copyright law, or for violating federal or state sex-trafficking law.
Congress enacted Section 230 to remove legal barriers that would disincentivize online services from moderating content and to encourage continued growth and development of the Internet in its early years.
Section 230 has allowed a wide variety of business models to flourish online and is partly responsible for creating the modern Internet.
Without Section 230’s legal protections, online services would face large legal expenses that would be detrimental to competition, innovation, and the U.S. economy.
Post by the Scribe on Dec 3, 2023 21:07:08 GMT
What is Section 230 and why do people want it repealed?
60 Minutes
Posted Jan 3, 2021. Section 230 of the Communications Decency Act of 1996 protects internet platforms from liability for what users post on their sites. Scott Pelley reports on the ramifications of the legislation and why it could be repealed in the near future. cbsn.ws/3hBcVpC
What is Section 230?
CNBC Television
Posted Jan 26, 2021. Section 230 has long been a target for lawmakers on both sides of the aisle. Ad tech expert Mark Douglas explains what Section 230 is and why he's expecting to see it on the 2021 legislative agenda.
Section 230 of the Communications Decency Act has long been a target for lawmakers on both sides of the aisle and ad media expert Mark Douglas told CNBC’s “The News with Shepard Smith” that it will likely “be a big part of the 2021 legislative agenda.”
During a hearing on Tuesday on her nomination for the position of Commerce Department secretary, Rhode Island Governor Gina Raimondo told lawmakers that she would explore changes to Section 230 if confirmed.
“I think platform accountability is important ... but of course, that reform would have to be balanced against the fact that these businesses rely upon user-generated content for their innovation, and they’ve created many thousands of jobs,” Raimondo said.
Former President Donald Trump slammed Section 230 during a speech in Dalton, Georgia on Jan 4. “We have to get rid of Section 230, we have to get rid of Section 230, or you are not going to have a country very long,” Trump said.
Douglas explained on “The News with Shepard Smith” that Section 230 protects social media sites, the comment sections of news sites, and essentially any site on the internet or mobile app where user-generated content exists.
“Section [230] provides a provision where social media companies and other content providers on the internet can allow people to generate content while not being held legally liable for the words of their users,” said Douglas, the founder and CEO of adtech firm Steelhouse.
During an interview with the New York Times last January, President Joe Biden said, ”...Section 230 should be revoked, immediately should be revoked, number one. For Zuckerberg and other platforms.”
Douglas said that there are basically two sides to Section 230. One side includes free speech advocates who believe “that all social media sites and all protections should only apply essentially if there’s no content moderation.” Then there are those who believe the opposite, that Section 230 “should cover any form of content moderation, regardless of who originated those policies and who are putting those policies in effect.”
The adtech expert explained that reforming Section 230 will likely be focused on bridging the gap between content moderation and the First Amendment.
“There’s a big gap between the U.S. Constitution, Freedom of Speech and the First Amendment, and what websites and social media sites can do in terms of content moderation, and so as we look forward in terms of reforms for Section 230, they’re likely to be in that area of bridging the gap and allowing more free speech on the internet,” Douglas said.