
Section 230

47 U.S.C. § 230 — Communications Decency Act of 1996

Twenty-six words that created the internet: the legal foundation shielding online platforms from liability for user-generated content and empowering voluntary content moderation -- now under unprecedented political and technological pressure.

US FEDERAL · IN FORCE SINCE 1996 · 332 REGULATIONS TRACKED · UPDATED APRIL 2026
THE ESSENTIALS

If you have ever posted a photo on social media, left a product review, or commented on a news article, Section 230 was quietly working in the background. This short provision of US law is the reason platforms like Facebook, YouTube, and Reddit can host billions of posts without being sued for every one of them. Without it, most of the internet as we know it could not exist.

Here is the basic idea: if someone writes something harmful online, the person who wrote it can be held responsible -- but the website that hosted it generally cannot. This is different from traditional publishing. A newspaper that prints a defamatory letter can be sued alongside the letter's author. A website that lets users post the same letter typically cannot. Section 230 draws that line. The law also encourages platforms to clean up their spaces -- if a social media company removes a post it considers harmful, it will not be punished for that decision.

Today, that bargain is under strain from all sides. Critics on the right say platforms use their moderation power to silence conservative voices. Critics on the left say platforms profit from amplifying harmful content while hiding behind immunity. Courts, Congress, state legislatures, and foreign governments are all pushing for change. The Supreme Court had an opportunity to narrow Section 230 in Gonzalez v. Google (2023) but declined to rule on the question, leaving the broad interpretation intact.

The rise of AI-generated content adds an entirely new dimension that the law's 1996 authors never imagined. When a platform's own AI system produces harmful content, it is no longer merely hosting someone else's speech -- it may be the speaker. Whether Section 230 can stretch to cover this scenario is one of the defining legal questions of the next decade.

CH · SWISS COMPASS

Section 230 shapes the global internet. Any company that operates a website, app, or digital service accessible to US users is affected by how American courts interpret platform liability. Swiss companies hosting user-generated content, running online marketplaces, or deploying AI chatbots need to understand where Section 230's protections apply -- and where the EU's Digital Services Act imposes stricter obligations.

Switzerland has no domestic equivalent of Section 230. Swiss intermediary liability is governed by general tort law principles and the Federal Supreme Court's case law, which imposes a duty to act once a host has knowledge of unlawful content. Companies operating across both US and EU markets face a fundamental divergence: broad immunity in America versus conditional protection tied to active compliance obligations in Europe.
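
The practical difference is the trigger: in Switzerland and the EU, knowledge of unlawful content starts a duty to act, while Section 230 generally imposes none. As a minimal sketch only -- the types and helper function below are hypothetical, and the US branch ignores Section 230's carve-outs such as federal criminal law and intellectual property -- the divergence looks like this:

```typescript
// Illustrative sketch of the divergent trigger for host liability, not legal advice.
// Under Swiss tort-law case law (and the EU DSA), actual knowledge of unlawful
// content starts a duty to act; under Section 230(c)(1) it generally does not.

type Jurisdiction = "US" | "CH" | "EU";

interface TakedownNotice {
  contentId: string;
  allegesUnlawfulContent: boolean; // receipt gives the host actual knowledge of the allegation
}

// Hypothetical helper: does receiving this notice legally oblige the host to act?
function mustActOnNotice(jurisdiction: Jurisdiction, notice: TakedownNotice): boolean {
  if (!notice.allegesUnlawfulContent) return false;
  switch (jurisdiction) {
    case "US":
      return false; // Section 230: no knowledge-based duty for (c)(1)-covered claims
    case "CH":
    case "EU":
      return true;  // knowledge triggers a duty to remove or disable access expeditiously
  }
}

console.log(mustActOnNotice("CH", { contentId: "post-42", allegesUnlawfulContent: true })); // true
console.log(mustActOnNotice("US", { contentId: "post-42", allegesUnlawfulContent: true })); // false
```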

WHAT
US federal law providing legal immunity to online platforms for third-party content and for good-faith content moderation decisions.
WHO
All interactive computer services -- websites, social media, app stores, cloud platforms, search engines -- that host or moderate user-generated content.
WHEN
Enacted 1996 as part of the Communications Decency Act. Repeatedly upheld by courts. Multiple reform bills proposed every Congress since 2019.
PENALTY
No direct penalties. Section 230 is a shield, not a sword -- it provides immunity from liability rather than imposing obligations or fines.

Section 230 provides two distinct protections: subsection (c)(1) shields platforms from claims that treat them as the publisher of third-party content, while subsection (c)(2) shields good-faith decisions to remove or restrict content. Understanding which shield applies -- and where each has limits -- is essential for any platform operator.
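
The two shields are easier to keep apart with a concrete model. Here is a minimal sketch in TypeScript: the three-prong shape of the first check follows how courts commonly frame the (c)(1) analysis, and every type and function name is hypothetical.

```typescript
// Illustrative only, not legal advice: a toy model of Section 230's two shields.

interface Claim {
  defendantIsInteractiveComputerService: boolean; // a website, app, or other service hosting users
  contentFromAnotherProvider: boolean;            // the material was created by a third party
  treatsDefendantAsPublisher: boolean;            // the liability theory targets hosting/publishing
}

// 47 U.S.C. § 230(c)(1): a service may not be treated as the "publisher or
// speaker" of information provided by another information content provider.
function c1ShieldApplies(claim: Claim): boolean {
  return (
    claim.defendantIsInteractiveComputerService &&
    claim.contentFromAnotherProvider &&
    claim.treatsDefendantAsPublisher
  );
}

// 47 U.S.C. § 230(c)(2): no liability for voluntary, good-faith restriction
// of access to material the provider considers objectionable.
function c2ShieldApplies(actionInGoodFaith: boolean, materialConsideredObjectionable: boolean): boolean {
  return actionInGoodFaith && materialConsideredObjectionable;
}

// A defamatory user post: the author remains liable, but the host is shielded.
const defamationClaim: Claim = {
  defendantIsInteractiveComputerService: true,
  contentFromAnotherProvider: true,
  treatsDefendantAsPublisher: true,
};
console.log(c1ShieldApplies(defamationClaim)); // true
console.log(c2ShieldApplies(true, true));      // true: the removal decision is also shielded
```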

Key Supreme Court decisions that have shaped -- or pointedly avoided shaping -- Section 230's scope.

Dozens of bills to amend or repeal Section 230 have been introduced since 2019. These are the most significant active proposals in the current Congress.

POLITICAL LANDSCAPE (2025-2026)
Section 230 reform has bipartisan support but for opposing reasons. Republicans argue platforms censor conservative speech and want to limit moderation discretion. Democrats argue platforms amplify harmful content and want to create accountability for algorithmic harms. This ideological split has prevented any reform bill from passing both chambers. The Supreme Court's reluctance to narrow Section 230 in Gonzalez v. Google has increased pressure on Congress to act legislatively.

While Section 230 is federal law, states are increasingly passing their own platform regulation. Many face legal challenges on First Amendment and preemption grounds.

TX · Texas · IN LITIGATION
HB 20 (2021)
Anti-censorship: prohibits platforms from banning users based on viewpoint
Partially remanded by Supreme Court in Moody v. NetChoice (2024). Fifth Circuit must re-analyse. Core anti-moderation provisions remain contested.
FL · Florida · IN LITIGATION
SB 7072 (2021)
Anti-censorship: restricts platform moderation of political candidates and media
Partially remanded by Supreme Court in Moody v. NetChoice (2024). Eleventh Circuit must re-analyse. The law's broad scope was a key concern.
CA · California · ENJOINED
AB 2273 (AADC) (2022)
Age-appropriate design for children: data protection impact assessments, default privacy settings
Preliminary injunction granted (NetChoice v. Bonta). Ninth Circuit heard oral arguments in 2024. Modelled on UK Age Appropriate Design Code.
UT · Utah · ENJOINED
SB 152 (2023)
Parental consent for minor social media accounts; age verification
Blocked by a federal court on First Amendment grounds (NetChoice v. Reyes). Amendments passed in the 2024 session.
MT · Montana · ENJOINED
SB 419 (2023)
Banned TikTok statewide (first-of-its-kind)
Preliminarily enjoined by a federal judge as a First Amendment violation. Never took effect.
NY · New York · ENACTED
SAFE for Kids Act (2024)
Restricts algorithmic feeds for minors without parental consent
Signed into law in June 2024; enforcement begins in 2025. Litigation expected.
CA · California · IN EFFECT
AB 3080 (AI Transparency) (2024)
Requires disclosure of AI-generated content in political advertising and media
Mandates labels on AI-generated election content; intersects with Section 230 questions about platform liability for synthetic media.
LA · Louisiana · IN EFFECT
Act 440 (2022)
Age verification for pornographic websites
First state age-verification law to take effect. Survived initial legal challenges. Model for 15+ similar state bills.
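For illustration only, an operator might encode this patchwork as per-state feature gates. The two rules below paraphrase the Louisiana and New York laws tracked above; every identifier is hypothetical, and several of these statutes remain in litigation, so any real implementation would need legal review.

```typescript
// Illustrative per-state feature gating, not legal advice. Statuses change as
// courts rule; this sketch only shows the architectural shape of compliance.

interface StateRules {
  requiresAgeVerification: boolean;          // e.g. Louisiana Act 440 (adult content)
  minorAlgorithmicFeedNeedsConsent: boolean; // e.g. New York SAFE for Kids Act
}

const rulesByState: Record<string, StateRules> = {
  LA: { requiresAgeVerification: true,  minorAlgorithmicFeedNeedsConsent: false },
  NY: { requiresAgeVerification: false, minorAlgorithmicFeedNeedsConsent: true },
};

// Choose a feed mode: default minors without parental consent to a
// non-algorithmic feed where state law restricts addictive feeds.
function feedModeFor(
  state: string,
  isMinor: boolean,
  hasParentalConsent: boolean
): "algorithmic" | "chronological" {
  const rules = rulesByState[state];
  if (rules?.minorAlgorithmicFeedNeedsConsent && isMinor && !hasParentalConsent) {
    return "chronological";
  }
  return "algorithmic";
}

console.log(feedModeFor("NY", true, false)); // "chronological"
console.log(feedModeFor("LA", true, false)); // "algorithmic"
```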

The rise of generative AI creates novel questions that Section 230's 1996 text never anticipated. These are the key open issues.

The world's two largest digital markets take fundamentally different approaches to platform liability. Section 230 grants broad immunity; the DSA conditions limited protection on active compliance.

US SECTION 230 vs EU DIGITAL SERVICES ACT

Liability framework
· US: Broad unconditional immunity for third-party content. No knowledge standard required for (c)(1) protection.
· EU: Conditional exemption (Articles 4-6): must act expeditiously upon obtaining actual knowledge of illegal content.

Content moderation
· US: Voluntary. Platforms may moderate in good faith without liability. No obligation to monitor or remove content.
· EU: Mandatory notice-and-action system. Must provide clear mechanisms for reporting illegal content. Reasoned explanations for removals.

Transparency
· US: No transparency requirements. Platforms may operate content policies without disclosure.
· EU: Extensive transparency reports required. Algorithmic transparency for very large platforms. Annual audits.

Risk assessment
· US: No risk assessment obligation.
· EU: Very large platforms must conduct annual systemic risk assessments covering illegal content, fundamental rights, elections, and public health.

Enforcement
· US: Private litigation. No federal regulator. FTC has general consumer protection authority but no Section 230-specific enforcement.
· EU: Digital Services Coordinators in each Member State. European Commission as direct supervisor for very large platforms. Fines up to 6% of global turnover.

Scope
· US: Any "interactive computer service." No size thresholds. Same rules for a personal blog and a platform with 3 billion users.
· EU: Tiered obligations by size: intermediary services, hosting services, online platforms, and very large online platforms (45M+ EU users).

Algorithm regulation
· US: No algorithmic regulation. Courts have consistently held that algorithmic curation is protected activity.
· EU: VLOPs must offer a non-profiling-based recommendation option. Must assess systemic risks from recommender systems. Researchers granted data access.

Children and minors
· US: No specific minor protection provisions. Separate laws (COPPA) address children's data. KOSA pending.
· EU: Platforms must take measures to ensure a high level of privacy, safety, and security for minors. Targeted advertising to minors prohibited.
PRACTICAL IMPLICATION
Companies operating in both the US and EU must maintain two distinct content governance frameworks: a lighter-touch approach relying on Section 230's broad immunity in the US, and a comprehensive compliance program meeting the DSA's mandatory obligations in the EU. Many multiplatform companies are converging toward DSA-level compliance globally to simplify operations -- effectively making the EU's stricter regime a de facto global standard.
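
To make the convergence concrete, here is a hypothetical configuration sketch. The 45-million-user VLOP threshold and the 6% fine cap come from the DSA itself; the configuration shape, function names, and the simplified applicability test are invented for illustration, not any real compliance product.

```typescript
// Illustrative sketch of "converge on the stricter regime", not legal advice.

const VLOP_THRESHOLD_EU_USERS = 45_000_000; // DSA "very large online platform" cutoff

interface ComplianceConfig {
  noticeAndAction: boolean;        // DSA notice-and-action mechanism
  transparencyReports: boolean;    // DSA transparency obligations
  systemicRiskAssessment: boolean; // VLOPs only
  nonProfilingFeedOption: boolean; // VLOPs only
}

function configFor(euMonthlyUsers: number, applyDsaGlobally: boolean): ComplianceConfig {
  const isVlop = euMonthlyUsers >= VLOP_THRESHOLD_EU_USERS;
  // Simplification: treat any EU presence, or a global DSA policy, as triggering the DSA.
  const dsaApplies = applyDsaGlobally || euMonthlyUsers > 0;
  return {
    noticeAndAction: dsaApplies,
    transparencyReports: dsaApplies,
    systemicRiskAssessment: dsaApplies && isVlop,
    nonProfilingFeedOption: dsaApplies && isVlop,
  };
}

// DSA fines can reach 6% of global annual turnover:
const maxFineEur = (globalTurnoverEur: number): number => 0.06 * globalTurnoverEur;
console.log(maxFineEur(10_000_000_000)); // 600000000: EUR 600M for a EUR 10bn business
```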

The guidance below addresses companies that operate platforms hosting user-generated content.

KEY CONSIDERATIONS
Benefit from Section 230 immunity for user-generated content on your platform
Maintain good-faith content moderation policies to preserve immunity
Understand exceptions: federal criminal law, intellectual property, FOSTA-SESTA
Monitor legislative reform proposals that could narrow protections
YOUR FIRST STEP

Document your content moderation policies and practices to demonstrate good-faith enforcement in case of legal challenge
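
One way to operationalise that step is an append-only log of moderation decisions tied to a published policy. The sketch below shows the idea; the field names are hypothetical, not a mandated format.

```typescript
// Minimal sketch of a moderation audit log to evidence good-faith enforcement.
// Not legal advice; field names and values are illustrative.

interface ModerationRecord {
  contentId: string;
  policySection: string; // which published rule was applied, e.g. "harassment-3.2"
  action: "remove" | "label" | "restrict" | "no_action";
  reviewer: string;
  decidedAt: string;     // ISO 8601 timestamp
  rationale: string;     // short human-readable reason
}

const auditLog: ModerationRecord[] = [];

function logDecision(record: ModerationRecord): void {
  auditLog.push(record); // in practice: append-only, durable, access-controlled storage
}

logDecision({
  contentId: "post-42",
  policySection: "harassment-3.2",
  action: "remove",
  reviewer: "trust-safety-01",
  decidedAt: new Date().toISOString(),
  rationale: "Targeted harassment of a private individual.",
});
```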

01
Publisher immunity
Platforms are not treated as publishers of third-party content and cannot be held liable for user posts.
02
Good faith moderation
Platforms may moderate content in good faith without losing their immunity protections.
03
Federal vs. state law
Section 230 preempts inconsistent state laws; platforms must track evolving state-level carveouts and reform proposals.
04
Exclusions from immunity
Section 230 does not protect against federal criminal law, intellectual property claims, or FOSTA/SESTA violations.
05
Content policy documentation
Maintain and publish clear content moderation policies to support good-faith moderation defence.
REGULATIONS: 332 TRACKED
US FEDERAL: 332 · EU (RELATED): 0 · COURT RULINGS: 0
DATE · JUR. · TITLE · STATUS
Dec 5, 2025 · US · National Emission Standards for Hazardous Air Pollutants for Coke Ovens: Pushing, Quenching, and Battery Stacks, and Coke Oven Batteries; Rescission of Extension of Compliance Deadlines for Coke Oven Facilities · In Force
Dec 4, 2025 · US · Protecting Against National Security Threats to the Communications Supply Chain Through the Equipment Authorization Program · Proposed
Nov 25, 2025 · US · Protecting Against National Security Threats to the Communications Supply Chain Through the Equipment Authorization Program · In Force
Aug 29, 2025 · US · Delete, Delete, Delete; Targeting and Eliminating Unlawful Text Messages; Rules and Regulations Implementing the Telephone Consumer Protection Act of 1991; Advanced Methods To Target and Eliminate Unlawful Robocalls · In Force
Aug 22, 2025 · US · Technical Correction: Extension of Deadlines: Standards of Performance for New, Reconstructed, and Modified Sources and Emissions Guidelines for Existing Sources: Oil and Natural Gas Sector Climate Review Interim Final Rule; National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel Manufacturing Facilities; National Emission Standards for Hazardous Air Pollutants for Coke Ovens: Pushing, Quenching, and Battery Stacks, and Coke Oven Batteries · In Force
Aug 15, 2025 · US · Extension of Deadlines: Standards of Performance for New, Reconstructed, and Modified Sources and Emissions Guidelines for Existing Sources: Oil and Natural Gas Sector Climate Review Interim Final Rule; National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel Manufacturing Facilities; National Emission Standards for Hazardous Air Pollutants for Coke Ovens: Pushing, Quenching, and Battery Stacks, and Coke Oven Batteries · In Force
Aug 8, 2025 · US · Delete, Delete, Delete; Safeguarding and Securing the Open Internet; Restoring Internet Freedom; Implementation of the Local Competition Provisions in the Telecommunications Act of 1996; Interconnection Between Local Exchange Carriers and Commercial Mobile Radio Service Providers · In Force
Jul 8, 2025 · US · National Emission Standards for Hazardous Air Pollutants for Coke Ovens: Pushing, Quenching, and Battery Stacks, and Coke Oven Batteries; Residual Risk and Technology Review, and Periodic Technology Review · In Force
Jul 7, 2025 · US · Wireline Competition Bureau Seeks To Refresh Record on Telephone Access Charges · Proposed
May 16, 2025 · US · Energy Conservation Program: Rescinding the Efficiency Standards for Battery Chargers · Proposed
Apr 1, 2025 · US · Protect Victims of Digital Exploitation and Manipulation Act of 2025 · Proposed
Mar 24, 2025 · US · Reducing Barriers for Broadband on Federal Lands Act of 2025 · Proposed
Mar 6, 2025 · US · Emerging Digital Identity Ecosystem Report Act of 2025 · Proposed
Mar 4, 2025 · US · Broadband Internet for Small Ports Act · Proposed
Feb 21, 2025 · US · Accessibility of User Interfaces, and Video Programming Guides and Menus · In Force