Myths vs Facts of Section 230
Enacted as part of the Communications Decency Act in 1996, Section 230 gives everyday websites, from social media to digital marketplaces, the legal certainty that they will not be held liable for the online activities of third parties.
Section 230 has created a robust Internet ecosystem where commerce, innovation, and free expression thrive — while enabling websites to take creative and aggressive steps in content moderation to protect us from bad actors.
Here’s a fact check of common claims about Section 230:
"WEBSITES ARE PUBLISHERS, JUST LIKE NEWSPAPERS OR TV STATIONS. BECAUSE OF THIS, THEY ARE INVOLVED IN THE PROCURING AND CREATION OF THE CONTENT THEY HOST. THUS, THEY SHOULD BE HELD LIABLE."
Online platforms covered by Section 230 expressly do not control the content at issue; it is created by third parties. These sites simply provide a venue for the commentary, much like supplying the newsprint without editing a coherent newspaper.
A person does not have to work through an editor to have a Facebook post or tweet approved. These platforms set their own standards so they can create an environment where users want to engage and advertisers want to promote. Without Section 230, the act of cleaning up the worst and most deplorable content would expose these websites to crippling liability. Removing objectionable content keeps the internet community-friendly, and Section 230 not only allows such moderation but provides the proper incentives to encourage everyone to behave as Good Samaritans.
"SECTION 230 WAS DESIGNED BY CONGRESS IN 1996 AS A 'SPECIAL FAVOR' FOR THE TECHNOLOGY INDUSTRY."
Section 230 enshrines our offline “Good Samaritan” rule for platforms. In the offline world, rendering aid to someone who is hurt does not make a person liable for unintentional damage caused while rendering assistance. Likewise, when platforms moderate their sites to remove objectionable content, they are not held liable for doing so.
And Section 230 protects not just tech companies but everyone with a website: the Washington Post, local newspapers, church message boards. At the end of the day, Section 230 is not a “special deal for tech” but simply the translation of longstanding analog rules to the online world.
"SECTION 230 WAS DESIGNED TO PROTECT AN INFANT INDUSTRY."
Former Congressman Chris Cox and Senator Ron Wyden knew even back in the 1990s that the "internet was going to keep growing with each passing year" and that it was "important to let companies moderate whatever content they could without penalty."
Section 230 was conceived as a “safe harbor” to encourage website owners to police their sites as they would like, often cleaning up their sites to attract a broader audience and fight against all sorts of crime. In essence, all Section 230 does is make the speaker liable for their statements, not the platform on which the statements are made.
"SECTION 230 APPLIES ONLY TO SOCIAL MEDIA."
From user reviews and comments sections to the digital marketplaces themselves, Section 230's liability protections cover all user-generated content on e-commerce platforms like Amazon and Etsy.
Indeed, an essential purpose of Section 230 was to remove the threat to e-commerce posed by the Prodigy lawsuit (Stratton Oakmont v. Prodigy).