March 2023 US Policy Update
The Shield and Sword of Online Speech: Debating the future of Section 230
On March 23, the Developers Alliance, along with allies from Tripadvisor and the Franklin Square Group, held a webinar to discuss Section 230 of the Communications Decency Act – the law that created the internet. The panel covered the law’s background, what it means for devs and internet users, the prospects for reform in Congress, and even whether the Supreme Court will upend it. The panelists agreed that the law is a bedrock of the modern-day internet and is unlikely to be altered in Congress, due in large part to a deep partisan divide on the issue. See highlights of the panel discussion here.
FAQs: A Primer for Section 230
Section 230 can be a confusing and tangled issue to understand. The Developers Alliance pulled together a list of frequently asked questions that devs may have on the topic. The best part? The answers are written in plain English – none of the legalese that makes so much of what’s written in Washington unreadable. These FAQs are designed to help devs and their users understand the ins and outs of Section 230 and the landscape for reforms in Washington. Learn more here.
Senate Panel Holds Content Moderation Hearing, Alliance Weighs In
You could argue that some of the most important words in society today are found in a relatively obscure law crafted in 1996. That was the year Congress passed the Communications Decency Act (you’ll be forgiven if you were otherwise consumed with the Macarena, cloning, or Michael Johnson). The Communications Decency Act included what are known as the “26 words that created the internet.”
Fast forward to 2023, and we find a Congress laser-focused on reforming the law. Congress’s interest in reform centers primarily on two ideas: that platforms should be prohibited from 1) hosting dangerous content online, and 2) censoring viewpoints they disagree with. Whether or not lawmakers can find agreement to pass Section 230 reforms this Congress, committees are full steam ahead in holding hearings to examine the topic.
The Senate Judiciary Subcommittee on Privacy, Technology, and the Law held a hearing titled “Platform Accountability: Gonzalez and Reform” in which Senators explored the real-world effects of reforming Section 230. The hearing was especially troubling in that just one of the five witnesses had any tangible experience with what changing Section 230 would mean for the internet. Andrew Sullivan, President and CEO of the Internet Society, was steadfast in his defense of the law, correctly pointing out that an outright repeal of Section 230 would greatly restrict the flow of free speech, as platforms would limit what internet users can and cannot see for fear of landing on the wrong side of a frivolous lawsuit. He also noted that scrapping Section 230 would do irreparable harm to small- and medium-sized firms, as only the largest companies, with legions of attorneys on staff, would be able to navigate the new landscape. What was left unsaid, notably by the witness hoping to do away with Section 230, is the idea that content creators should be held responsible for what they create.
For our part, the Developers Alliance joined a letter signed by more than three dozen other industry associations, civil society organizations, and internet experts. The letter detailed our belief in the internet as a tool to build communities, lift up marginalized groups, hold the powerful accountable, and to connect and learn. Our letter was sent to subcommittee members ahead of the hearing and was entered into the hearing record.
House Panel Holds Hearing on Big Tech Censorship
The House Communications and Technology Subcommittee held a hearing titled “Preserving Free Speech and Reining in Big Tech Censorship” on March 28. The hearing was an opportunity for the other half of Congress to discuss Section 230, and it largely fell along party lines – Republicans on the Subcommittee claimed platforms are engaging in censorship of conservative voices, while Democrats said Section 230 is not doing enough to curb hate speech, misinformation, disinformation, and other dangerous content. Nearly every member of the Subcommittee agreed on two points: 1) Section 230 needs to be updated to reflect today’s digital environment; and 2) the courts’ interpretation of Section 230 as giving blanket liability protections to platforms is wrong. Whether Congress finds any path forward to address these issues remains to be seen. Given that members of Congress could not even agree during this hearing on what problems Section 230 is or isn’t creating, it’s unlikely at this point that any action will be taken. The Supreme Court will hand down its decisions in the Taamneh and Gonzalez cases this summer. You can read the Alliance’s brief on the Gonzalez case here and the Taamneh case here.
Federal Trade Commission Issues Guidance on Generative AI
On March 20, the FTC published a blog post making clear that, in its view, the FTC Act could apply to generative AI if the technology is designed to deceive consumers, even if deception isn’t its intended use. The FTC outlined four questions innovators should ask themselves before creating generative AI:
- Should you even be making or selling it?
- Are you effectively mitigating the risks?
- Are you over-relying on post-release detection?
- Are you misleading people about what they’re seeing, hearing, or reading?
Expect more thoughts from the FTC and other regulators as these technologies are deployed more widely.