Devs work hard to make the internet a safer place. They have long operated under the premise that consumer trust is paramount, since users won’t pay for services they distrust. This mindset protects users and helps create a more robust internet where devs can design and deploy new products and services that make our lives healthier, more convenient, and more prosperous. Make no mistake: the entire app ecosystem believes those behind child exploitation should be prosecuted to the fullest extent of the law. The ecosystem also believes that efforts by Congress to curb this exploitation should target the perpetrators. That’s why the Senate Judiciary Committee’s consideration of the EARN IT Act and STOP CSAM Act is so troubling.
The EARN IT Act would put much of the onus on platforms to curtail child exploitation. The bill would create the National Commission on Online Child Sexual Exploitation Prevention, which would establish voluntary best practices for platforms to prevent, reduce, and respond to exploitation. This is a positive step. However, the bill would also chip away at the privacy and security features we’ve all come to rely on. In short, it seeks to weaken encryption by mandating a “good guys only” access point, which is simply not technically possible: any access point built for the good guys can be exploited by bad actors as well. The bill would also amend Section 230 of the Communications Decency Act by removing liability protections for platforms. Removing these protections would create an internet Wild West where hate speech, misinformation, and other harmful content run rampant, while devs would be forced either to serve as third-party content moderators or to accept the risk that hosting user content creates. Devs aren’t equipped, nor did they sign up, to be the content moderation police.
Similarly, the STOP CSAM Act would lean on the app ecosystem in an attempt to solve a larger societal problem. The legislation would allow victims to bring cases against platforms and app stores that failed to prevent child exploitation, presuming that the platform or app store somehow facilitated these horrendous acts. Nothing could be further from the truth. Devs and the companies that employ them are actively working to end exploitation and keep the internet a safe place for all users, and they have been identifying and removing this content for years. Enabling such litigation would only hinder efforts to stem exploitation, as devs and their teams turn their attention to the courts instead of protecting users. Punishing those who are helping isn’t just unfair; it will make it even harder to tackle this critically important issue.
If the Senate Judiciary Committee passes these bills, they will head to the Senate floor for consideration. Taken together, they place a significant burden on the app ecosystem that will hamper our industry’s ability to protect children. Users have rightly called for enhanced security features to ensure their identities, online habits, personal information, and more remain safe and secure. Devs have responded by employing end-to-end encryption in their products to protect users and maintain consumer trust. The government can’t have it both ways: it can’t prod devs to make their products secure while also demanding an open back door. These concerns, along with the removal of Section 230 liability safeguards (the bedrock of the internet), will have serious consequences for the internet and the users who depend on it.
Child exploitation is an abhorrent crime, and Congress is right to take it on. However, our nation’s lawmakers must focus on the perpetrators of the crime, not the internet and the devs who power it. Making all users less safe is not an acceptable way to address such an important societal issue. We urge Senators to carefully consider the consequences of both bills before passing them.