FAQs: A Primer for Section 230

What is Section 230?

In 1996, Congress passed the Communications Decency Act. The Communications Decency Act’s Section 230 included the “26 words that created the internet”:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

These 26 words protect internet platforms from liability for the content users post on their sites. Importantly, they have also helped the internet become nearly ubiquitous, making our lives more convenient, more productive, and more prosperous.

Section 230 was created in response to a court case that made internet companies responsible for everything their users posted. It was designed to choose between two evolutionary forks for the internet: one where user-generated content created liability for internet companies, and thus would be suppressed, and one where only users were responsible for their content, enabling internet companies to let user content proliferate. And it worked!

Section 230 created a new communication network where anyone could post, speak, or create, and in turn anyone could access the speech, posts, and creations of anyone else. This is possible because the internet can’t be sued for what you say or do – only you can – and you can’t be sued for what someone else says or does. Historically, anyone who amplified your speech by publishing it in a book or broadcasting it was held accountable alongside you, provided they knew what they were spreading. Their participation increased the harm by carrying the speech more widely, and because they knew its content, the law treated them as if they had said it themselves.

So it protects free speech?

To a degree. Because platforms aren’t held liable for content users post, they enable free expression and speech. However, platforms also have the right to take down harmful, misleading, or otherwise dangerous content. The First Amendment is the foundation upon which platforms’ own rights – to exclude or allow user speech – rest. And Section 230 protects users as well, for example by limiting the risk of being sued for retweeting, liking, or critiquing a product or service.

Nothing in Section 230 lifts liability from internet companies, users, or anyone else for what they themselves say, post, or create – it only insulates them from the speech of others. It also enables anyone on the internet to police third-party speech on their sites without being sued for making the effort. It protects individuals when they tweet, retweet, post, or comment, just as much as it protects websites – from the local church to Facebook.

How does Section 230 relate to the First Amendment?

Section 230 is focused on third-party speech online. It insulates users, websites, apps, and platforms from being sued for what others post. The First Amendment ensures free speech by restricting how the government can regulate your ability to express your thoughts and ideas. This includes a newspaper’s or platform’s right to choose who and what is said through their outlet, because the First Amendment protects the speech of companies and individuals alike.

Section 230 has no impact on a site’s ability to block or remove content; that right is guaranteed by the First Amendment. What Section 230 does is make those decisions cheaper to defend in court by reducing the legal risk of carrying third-party content. Without Section 230, the cost of litigation goes up considerably – enough that small platforms and users would be open to trolls using the threat of costly litigation to force sites to carry their speech or to force the censorship of others. Without it, third-party speech would likely be too expensive, in terms of litigation risk, to carry.

How does it help app developers?

In a word: certainty. Because of Section 230, developers know the rules of the road when it comes to hosting content on their platforms. This certainty ensures developers can do what they do best – create, innovate, and build. No developer signed up to serve as the thought police when they chose this line of work. 

There is no universally agreed standard for good or bad content, but rather a continuum unique to every individual and different across cultures and demographics. Our Constitution warns Congress against setting such a standard. Amending Section 230 is the wrong vehicle for policing online speech, as the unanticipated consequences of SESTA/FOSTA demonstrate. Those measures forced many platform operators to block or take down entire sections of their sites because policing them would be too onerous.

What does Washington want to do to change it?

It’s a mixed bag. Democrats and Republicans both want to go after Big Tech and Section 230, albeit with vastly different approaches. Democrats generally want to reform Section 230 to ensure platforms are held liable for harmful and misleading content they host. In December 2020, then-President-elect Biden went as far as saying, “Washington would be better off throwing out Section 230 and starting over.”

Republicans harbor similar sentiments, but with an added layer. Some in the GOP say Section 230 is only used as a shield to de-platform or censor conservative voices. In January, the Supreme Court delayed a decision on whether to hear two critical cases challenging social media laws in Florida and Texas. Those laws would prohibit platforms from taking down even the most dangerous or hateful content, violating the platforms’ First Amendment rights. Either way, the result is more First Amendment litigation and, as a consequence, less freedom of speech online.

How might the Supreme Court impact Section 230?

In February 2023, the Supreme Court heard two cases that could reshape the internet. In Gonzalez v. Google and Twitter v. Taamneh, the high court was asked to determine whether online recommendations are covered by Section 230. Should the court rule that Section 230 must be reinterpreted to exclude online recommendations, we will likely see the floodgates fly open as Congress rushes to limit these liability protections even further.

While the courts have a role to play in defending Section 230, there is simply no replacement for Congress’ part in crafting legislation around the issue. Members of Congress, not judges, are sent to Washington to find solutions to especially thorny issues like this one.

Update: The Supreme Court rejected the lawsuits seeking to hold Google and Twitter liable for content hosted on their platforms. In short, the high court punted, declining to disturb Section 230. This was the first time the Supreme Court dealt with cases related to Section 230, and the issue will now head back to Congress. Given that the Supreme Court declined to act, expect efforts in Congress to reform the provision to ramp up in the coming months.

What does the app industry want to see done?

Section 230 is the bedrock of the internet. If policymakers determine that the best path forward is to reform Section 230, we remain hopeful that those reforms will be smart and measured. Policymakers should consult with developers of all stripes before any changes are made. A complete overhaul would turn the internet on its head and mean that even the smallest mom-and-pop app startups could face huge legal bills – as would individual users.

Smart and measured changes that take into account small and medium-sized firms will ensure the United States remains the global leader in innovation. A complete overhaul, by contrast, would inevitably create a landscape in which only the largest multinational companies with legions of attorneys on staff can flourish – stifling innovation, job creation, and economic growth. Frankly, the future of the internet is at stake.

What about content creators?

Largely absent from any reform discussion are content creators. Content creators must be held accountable for the things they post and say online. Policymakers are quick to determine that platforms should be held accountable, but there are very few instances when these same policymakers hold content creators accountable.

This is a particularly challenging issue under the U.S. Constitution. While other countries regulate things like hate speech online, Congress is limited in what expression it can restrict. The tension between free speech, laws restricting speech, and laws restricting how internet companies must manage speech is real and challenging.

By Geoff Lane

Policy Counsel & Head of US Policy Geoff Lane serves as the Developer Alliance’s head of U.S. policy. In this role he oversees the organization’s federal legislative and regulatory agenda as well as state-level efforts. Prior to joining the Developers Alliance in 2022, Geoff worked with senior Democratic leadership in the House of Representatives. Since his time on Capitol Hill, he has held senior roles at various technology trade associations (including a previous stint at the Developers Alliance). At each stop he led efforts at the intersection of innovation and policy. He has worked on critical policy issues including privacy, encryption, patent reform, workforce development, corporate tax, tax nexus, and research and development. Geoff holds a B.A. from Miami University in Oxford, Ohio. When he is not working, you can find him booing all of his favorite Philadelphia sports teams. Geoff is based in Washington, D.C.


©2023 Developers Alliance All Rights Reserved.