Not Every Social Problem Has a Tech Solution

Tech can accomplish many things, but it can’t do that.

Facts first. It is impossible to have both secure communications and a back door for law enforcement. Not "hard" or "eventually" but impossible, full stop. Anyone who says otherwise is lying or naive, or perhaps both. Second, using technology to monitor communications is surveillance, and there is no way for the technology to do it without simultaneously enabling the people who control that technology to do it. Again, this is binary: surveillance or no surveillance. Putting technology in the mix changes nothing. Finally, to verifiably authenticate someone online, someone else must be able and willing to vouch for that authentication. For any meaningful system under government oversight, that chain of third parties will always end with the government. There is no magical technology that can verify your identity, age, or credentials without a supporting chain of authentication that leads to a trusted third party.

Across the world, in the United States, Europe and the United Kingdom, Australia, and elsewhere, laws are being written and implemented that require technology companies, their staff, and software developers to secretly surveil their customers on the government's behalf. This is most frequently justified as part of the battle against child sexual abuse material (CSAM), something ethical societies universally abhor. As a developer advocacy organization, it's not our role to take sides on broad social issues that don't uniquely impact the community we represent. Balancing the loss of consumer privacy and security against improvements in online CSAM detection isn't a developer issue but an all-of-society issue, and everyone should be free to weigh that balance for themselves. But when the argument is distorted by deliberate mischaracterization of what is technically possible, we need to speak up.

There is not, and never will be, a technology that can inspect and detect CSAM in online systems while also ensuring user privacy and data security. In end-to-end encryption (E2EE), the sender has a key and the receiver has a key. If anyone else holds a key, there can never be confidence that a message is authentic or secure. It's as simple as that.
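To make the point concrete, here is a minimal sketch of an E2EE exchange using Python's widely used cryptography package. The protocol details are illustrative, not any real messenger's design; what matters is that exactly two private keys exist, and any "exceptional access" key is mathematically just a third participant who can read everything.

```python
# A minimal sketch of end-to-end encryption using the "cryptography"
# package (pip install cryptography). Illustrative only.

import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair; the private halves never leave the device.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

def derive_shared_key(my_private, their_public) -> bytes:
    # Diffie-Hellman agreement, then a KDF to turn the shared secret
    # into a symmetric message key.
    secret = my_private.exchange(their_public)
    return HKDF(
        algorithm=hashes.SHA256(), length=32, salt=None, info=b"e2ee-sketch"
    ).derive(secret)

alice_key = derive_shared_key(alice_private, bob_private.public_key())
bob_key = derive_shared_key(bob_private, alice_private.public_key())
assert alice_key == bob_key  # both ends hold the same key; no one else does

# Alice encrypts; only Bob can decrypt and authenticate the message.
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(alice_key).encrypt(nonce, b"hello, Bob", None)
assert ChaCha20Poly1305(bob_key).decrypt(nonce, ciphertext, None) == b"hello, Bob"

# A "lawful access" key would be a third private key folded into this
# exchange. Whoever holds it can decrypt every message, and because the
# same key authenticates messages, can forge them too. There is no
# configuration in which it exists and the channel stays secure.
```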

Likewise, replacing a name and a face with a nameless and faceless technology or corporate logo doesn't make surveillance any less intrusive, or any more accurate. Software is written by people, and it implements the logic people have chosen. Even AI is defined by its training data, which is made available, selected, or created by people. There is no real difference between a system that surveils for CSAM and one that surveils for women's health violations, gender identity signals, attitudes toward firearms, or support for the communist party. A database of hashes that represents known CSAM could just as easily be a database of hashes that identifies anti-government propaganda. People control the databases, and if security is simultaneously weakened by the loss of strong encryption, chaos will follow.
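A short sketch makes the content-neutrality obvious. Production systems use perceptual hashes such as PhotoDNA rather than the plain cryptographic hash below, but the structure is the same: the scanner flags whatever its database tells it to flag, and nothing in the code knows or cares what the hashes represent.

```python
# A minimal sketch of hash-based content scanning. Illustrative only.

import hashlib

# Hypothetical database. Whoever controls this set controls what the
# system hunts for; the code below cannot tell CSAM hashes from hashes
# of banned pamphlets. (The entry shown is just the SHA-256 of the empty
# string, so the demo has something to match.)
FLAGGED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan(message: bytes) -> bool:
    """Return True if the message matches the flagged-content database."""
    return hashlib.sha256(message).hexdigest() in FLAGGED_HASHES

print(scan(b""))       # True: matches an entry in the database
print(scan(b"hello"))  # False: not in the database
```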

Finally, protecting children through age verification requires de facto verification of the age and identity of ALL users. The root of that verification rests with some trusted third party willing to absolutely vouch for the authenticity of whatever credential anchors the system. Just as human experts cannot pinpoint someone's birth date by observing them, sampling their DNA, or putting them through a medical scanner, online systems cannot reliably determine someone's actual age. Good regulation must acknowledge this. Rather than wave a technology wand over the problem, laws should reference robust and repeatable scientific tests that exist today and make allowances for error, or better still, accept that online anonymity and online child age verification are tradeoffs to be balanced.
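The shape of that trust chain fits in a few lines. The sketch below uses Ed25519 signatures from the cryptography package; the credential format and the "over18" claim are invented for illustration, and real schemes differ in detail. But every age-verification proposal reduces to the same step: a website accepts the claim only because a root authority's signature checks out.

```python
# A minimal sketch of an age-verification trust chain. Illustrative only:
# the credential format and claim are hypothetical.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The root of trust. In any government-overseen scheme this key belongs,
# directly or through intermediaries, to the government.
authority_private = Ed25519PrivateKey.generate()
AUTHORITY_PUBLIC = authority_private.public_key()

# The authority attests to the user's age. The user cannot mint this
# credential alone; that is the whole point, and the whole problem.
credential = b"subject=alice;over18=true"
signature = authority_private.sign(credential)

def relying_party_accepts(credential: bytes, signature: bytes) -> bool:
    """A website verifies the claim only by trusting the authority's key."""
    try:
        AUTHORITY_PUBLIC.verify(signature, credential)
        return True
    except InvalidSignature:
        return False

print(relying_party_accepts(credential, signature))     # True: authority vouched
print(relying_party_accepts(credential, b"\x00" * 64))  # False: forged signature
```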

A sufficiently advanced technology may be indistinguishable from magic, but that doesn’t make magic a good basis for a legal system. If elected officials don’t trust their constituents enough to honestly present the tradeoffs involved in online child protection, they insult the people who elected them. These issues are important, and we need solutions that are both workable and specific. In the meantime, we will endeavor to correct the record whenever technology and magic are used interchangeably.

By Bruce Gustafson

Bruce is the President and CEO of the Developers Alliance, the leading advocate for the global developer workforce and the companies that depend on them. Bruce is also the founder of the Loquitur Group, a DC consulting firm, and the former VP and head of the DC Policy office of Ericsson, a global information and communications technology company, where he focused on IPR, privacy, IoT, spectrum, cybersecurity, and the impact of technology and the digital economy. He has previously held senior leadership positions in marketing and communications at both Ericsson and Nortel, as well as senior roles in strategy and product management across wireless, optical, and enterprise communication product portfolios.

Related Content

Developers Alliance files Amicus Brief to Argue that Algorithms are Protected by the First Amendment

A Busy Regulatory End of the Year in Europe

Alliance Files in the U.S. Supreme Court. Again.
