Oh, and algorithms are people too.
For some time now, two cases have been winding their way toward the U.S. Supreme Court. Both involve a social media site’s right to moderate and prioritize content in keeping with its published rules and policies. In both, the government has decided that what can and can’t be moderated is a State decision. Both are wrong.
Texas House Bill 20 prohibits select social media sites from implementing moderation policies that implicate “viewpoint.” Florida Statute 2022, in turn, prohibits sites from banning or moderating content originating from political or journalistic sources. In both cases, sites must preserve and present content that might otherwise violate their policies. They must carry speech the government condones, or violate state law.
We are not a partisan organization, and none of our positions are politically based – though we do favor free markets and limited government intervention in developer businesses, and we do encourage policies that provide resources developers can benefit from. More often than not, our role is to help policymakers understand how technology works and how developer businesses operate, or to explain policy, and the policymakers behind it, to the developer community. On rare occasions, we perform these services for the courts, so that lawyers and judges can rule wisely and from a solid technical foundation.
This week, we filed an amicus brief – our third to the U.S. Supreme Court – to ensure that the Justices understand that coding is a creative process, not a mechanical one. Our brief falls atop a deep pile of briefs about the First Amendment and government constraints on private speech. Our purpose is to head off a potential misinterpretation of what developers do: labeling the algorithms and routines that filter, sort, and manage online content as devoid of human intelligence and decision making. If it takes creativity to craft and code a content moderation algorithm, which it most surely does, then algorithms are as protected under the First Amendment as a sonnet or The Federalist Papers. Algorithms are how developers capture and structure the logical processing of a human moderator, so that human decision making can be implemented at scale. And a website’s decisions as to what content it filters and what it promotes are “speech” under the First Amendment. Algorithms, it seems, are people too.
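To make that point concrete, here is a minimal, entirely hypothetical sketch – not drawn from our brief or from any real site’s systems – of how a human moderator’s editorial judgment might be written down as code so the same decisions can be applied to millions of posts. Every rule, term, and threshold below is illustrative only:

```python
# Hypothetical illustration: a human moderator's editorial choices,
# captured once as code and then executed at scale.
# All rules and terms below are invented for illustration.

BLOCKED_TERMS = {"spam-link.example", "buy followers"}

def moderate(post: str) -> str:
    """Return 'remove', 'demote', or 'allow' for a post,
    mirroring the choices a human moderator would make."""
    text = post.lower()
    if any(term in text for term in BLOCKED_TERMS):
        return "remove"   # a human editor would delete this outright
    if post.isupper() or post.count("!") > 3:
        return "demote"   # shown lower in feeds, but not banned
    return "allow"

print(moderate("BUY FOLLOWERS NOW!!!!"))  # prints "remove"
```

Each branch encodes an editorial judgment a person made once; the program simply repeats that judgment consistently, at a volume no human team could match.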
It’s worth taking a moment to consider what might happen if the Court doesn’t accept our view – if it agrees with Texas that algorithms are machines, and that whatever they do or produce is not creative and never speech. In that extreme case, assuming the Court also agrees that website moderation is protected editorial speech (like newspaper editing), moderation would be protected only if done by hand, in real time. Even though an algorithm might look at identical inputs and make identical decisions learned from human moderators, putting code between human and machine would squeeze the creative protection from the act. Moderation at scale disappears, and developers become mere unskilled labor. That simply can’t be.
You can read our brief on the SCOTUS website, and view the briefs of the parties and many other amici here. The case itself will be heard at oral argument in the spring of 2024.