“Encryption With A Back Door” And Other Oxymorons

Encryption has garnered a lot of attention lately on Capitol Hill — not all of it is welcome.  



Developers and members of Congress have something in common. We both recognize that there is quite a lot of unsavory content on the internet, and we want to see children protected from exploitation online. We both want those who promote bad content to be held accountable and face legal consequences. What developers largely disagree with, however, is threatening encryption to make that possible.

At the Developers Alliance, we have long maintained that encrypted systems are safer systems. Chances are that you, reader, have been a victim of at least one major cyber breach in the last few years. Yahoo!? Equifax? Target? Take your pick. Congress has called for measures to ensure that companies holding data store it securely and protect it vigorously. That protection is provided by encryption. We cannot say we want stronger systems built on more secure, cutting-edge technologies to prevent cyber breaches and attacks while simultaneously calling for the end of encryption.

‘Encryption with a back door’ just means weak encryption. Weak encryption can — and will — be hacked, especially when criminals know that companies are required to keep that back door accessible. Weak encryption puts, and will continue to put, millions of people at risk of cyberattack. And once one country demands a back door, every other country will too, creating a ring of keys allotted by jurisdiction. Even the regulation-hungry European Union has called end-to-end encryption a “necessity.”
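To make that concrete, here is a minimal sketch, using the open-source Python “cryptography” package, of the point above: encrypted data is only protected so long as the key stays with its owner, and a mandated back door is simply another copy of that key. The package choice and the names in the snippet are our own illustration, not a description of any particular company’s system or any bill’s text.

```python
# A minimal illustration (using the open-source "cryptography" package) of why
# encryption is only as strong as control over its keys. This is a sketch for
# discussion, not a description of any specific product's design.
from cryptography.fernet import Fernet, InvalidToken

# The user's key: generated and held by the data owner.
user_key = Fernet.generate_key()
ciphertext = Fernet(user_key).encrypt(b"private medical and financial records")

# Without the right key, the ciphertext is useless, even to the company storing it.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("Wrong key: the data stays protected.")

# A mandated "back door" means a second copy of the key (or an escrowed one)
# that also decrypts everything. Whoever obtains it, whether law enforcement,
# a rogue insider, or a foreign attacker, gets the same access as the owner.
escrowed_copy_of_user_key = user_key  # the back door is just another key
print(Fernet(escrowed_copy_of_user_key).decrypt(ciphertext))
```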

“But, but, it’s just law enforcement who needs the key!” we hear encryption’s detractors say as they read this article. Let’s set current debates on law enforcement and possible reforms aside. If TechCompany X is required to use weak encryption (a.k.a. ‘a back door for law enforcement’), that mandated technology will have a major impact, contrary to the assertions of its proponents. All of the company’s products and services will be affected, across every jurisdiction it operates in, with no exceptions. Hackers in a foreign country, or state actors trying to cause trouble, will be able to break these systems — our systems — with greater ease. Congress has essentially asked companies to make their products more secure and more susceptible to cyberattacks and surveillance at the same time. Consumer safety aside (and we are all consumers of these products), this is also yet another instance of regulatory confusion when it comes to digital goods.

The biggest threat to encryption in the U.S. right now is the EARN IT Act. The bill seeks to create a back door for law enforcement while leaving technology companies liable for any child sexual abuse material (CSAM) shared on their platforms. Unsurprisingly, the tech community has largely expressed its discontent with the bill’s objectives regarding mandates on encrypted systems.

An alternative proposal aimed at the same problems is the Invest in Child Safety Act. The bill takes an aggressive stance on stopping online predators and CSAM violations in their tracks without attacking encryption, including a requirement that tech companies “increase the time that they hold evidence of CSAM, in a secure database, to enable law enforcement agencies to prosecute older cases.” In promoting the alternative bill, sponsor Sen. Wyden added that “If you weaken strong encryption, all that filth would just move to dark web platforms and you’d make it easier for really bad guys to harm children.”

Both the EARN IT Act and the Invest in Child Safety Act have been supported by child advocacy groups, indicating that those groups believe either bill would help address the problem. Children deserve protection from heinous crimes, and platforms are prepared to do all they can to ensure this conduct does not happen on their watch. Is it not better, though, to proactively enlist the tech companies’ help in solving the problem? After all, the people who deal with encryption for a living are probably far better equipped than Congress to understand what’s possible.

Tech companies and developers want to help solve these problems. They want people to feel safe on their platforms and to have faith in their business practices. Help us help you — and the children — by listening when we say that harming encryption will only create further security problems for everyone and set back efforts to police harmful content.


By Sarah Richard

Developers Alliance Policy Counsel & Head of US Policy

