What’s Next? Congress Holds Hearing on Companies’ Use of Consumer Data

When there’s gridlock in Washington, like we’ve seen for most of 2017, Congress reacts by holding hearings both to appear busy and to lay the foundation for policy priorities it will act upon once movement resumes. Wednesday, the House Energy and Commerce Committee held a hearing focused on data, “Algorithms: How Companies’ Decisions about Data and Content Impact Consumers.” Data drives the digital economy, and our members use it daily to reach and serve users. Changes to how data is regulated will impact how developers across the globe build apps, work with platforms, and collect and use consumer information. If you’re a developer or rely on developers to operate, you should be on alert and ready to ensure your ability to conduct business and innovate isn’t significantly hindered when Congress inevitably takes action on data.

The witnesses at Wednesday’s hearing included six academics with diverse backgrounds in privacy, technology, and computer and information sciences. Although recent events like the Equifax breach and the net neutrality repeal were peppered throughout the discussion, the heart of the hearing was consumer data — how it’s collected and used, how companies utilize algorithms, how the ecosystem is regulated, and societal expectations. A good portion of the discussion was spent clarifying what content is collected by ISPs (e.g., Comcast, AT&T) versus edge providers (e.g., Google, Facebook), but a few other topics stood out as ones we’ll likely hear more about from Congress in the near future.

The witnesses shared their insights and research experience in balancing regulation with innovation. Dr. Tucker of the MIT Sloan School of Management, who has research experience in the EU, warned against overly restrictive regulations and the negative impact on innovation she has seen in Europe. She also recommended giving a sense of control back to consumers to help protect their privacy without impeding entrepreneurship. Both Mr. Pasquale, a law professor at the University of Maryland, and Ms. Klonick from Yale Law School argued that laws curtailing self-regulation are not in the public’s best interest; companies are capable of regulating themselves, with outside oversight needed only to verify compliance and curtail abuse.

Several committee members inquired about the Federal Trade Commission (FTC) and its role in policing bad actors and serving as a resource for small businesses and consumers trying to compete with large entities. Ms. Moy from the Georgetown Law Center on Privacy and Technology advised that although agencies can act on enforcement faster than Congress, policymakers should refrain from addressing complex challenges, like data and privacy, with a one-size-fits-all approach.

Dr. Kearns from the University of Pennsylvania touched on the role machine learning and artificial intelligence (AI) are playing in data analysis, though not without human oversight. He indicated AI researchers are finding ways to address discriminatory behavior by algorithms. Yet the scale of data collected is massive, and both regulatory and human monitoring will struggle to keep pace. He warned that it’s impossible to audit algorithms without compromising their proprietary nature, so any policy must balance protecting companies’ IP with addressing privacy and consumer concerns.

In terms of current privacy protocols, Dr. Shahar of the University of Chicago Law School made the argument that users are disinclined to read the lengthy privacy agreements we’ve all seen, even though they’re mandated by law. Representative Schakowsky (D-IL) added that as a consumer, she often doesn’t read the privacy terms, and that if you decline them, you can’t reach your end goal on the platform — a sentiment most people share. In terms of monitoring content, Ms. Klonick indicated that platforms follow a global set of rules that rely on human content moderators in addition to algorithms that place posts in feeds and review content for abuse. Mr. Pasquale added that it’s difficult for anyone to know what takes place behind the scenes of a platform, and that Congress could make this more transparent by requiring openness about algorithms and how consumer data is used.

Needless to say, consumer data is a complex issue that impacts everyone. It’s promising to see Congress making efforts to understand its nuances — what consumers share and where, which information is encrypted, how companies use it, steps taken to protect individual privacy, where policy falls short or overreaches, and how to keep pace with fast-moving technology. It’s the Alliance’s charge to help those outside of the developer world understand data and its many complexities. We look forward to working with our members and policymakers on this crucial issue, and we advise those in the developer world to “get smart” on their data practices, as Congress will likely be knocking on our community’s door very soon.


RACHEL EMEIS
DIRECTOR, US INNOVATORS POLICY COUNCIL



©2017 Developers Alliance All Rights Reserved.