Section 230 helps us connect with our families, our communities, our customers, our small businesses, and turns us all into creators. Without it, the internet by the people, of the people, for the people, disappears, taking hundreds of online communities with it.
But what exactly would that mean?
Without Section 230, platforms couldn’t moderate without risking legal liability. That would leave them with a few basic options: write everything themselves and prevent user interaction; strictly limit user-generated content so all we see is Baby Shark and other toddler-friendly fare; or allow anything and everything, including the darkest, most despicable content usually reserved for 8chan.
Spam & Bots:
Section 230 empowers companies to create spam filters and to block trolls, bots, and false content. Without these assurances, platforms likely wouldn’t remove bot accounts and spam for fear of lawsuits over “heavy-handed” moderation.
Family-Friendly Content:
Section 230 gives platforms the security to choose what they allow on their sites and how family-friendly they want to be. Everything from Facebook’s policies against nudity to the existence of YouTube Kids relies on the ability Section 230 gives platforms to filter out adult and obscene content. Without Section 230, online services that continue to host content wouldn’t risk moderating it, leading to pornography and other inappropriate content proliferating online.
Law Enforcement:
Because platforms probably wouldn’t moderate content, they would also lose the very mechanisms they’ve built to filter through it. Social media companies provide millions of pieces of evidence to law enforcement each year thanks to their content moderation practices. Without Section 230, law enforcement would lose a valuable tool for stopping crimes and saving lives.