Developers Alliance Response: Privacy Working Group Request For Information

April 7, 2025


Chairman Guthrie and Vice Chairman Joyce,


The Developers Alliance, the leading advocacy organization representing application
developers, the companies they lead, and the industries that depend on them, applauds the
U.S. House Committee on Energy and Commerce Privacy Working Group for requesting
information regarding a national data privacy and security framework.


It has been nearly 25 years since the Federal Trade Commission recommended Congress pass
a national online data privacy law. Since then, 20 states have adopted varying online data
privacy rules, and bills have been introduced in another 16 states. Apps, by their very nature,
function across state lines and attract customers globally. Moreover, data is critically important
in apps’ development, testing, marketing, and growth.


It has never been more important for Congress to pass a single national privacy law. The costs
alone of complying with 50 different privacy laws will crush innovation. However, policymakers
must find common ground that protects consumers’ digital privacy and data without inhibiting
innovation, overregulating data collection and use, or creating new legal risks for app
developers.


In response to the Working Group’s Request for Information, the Developers Alliance has
answered key questions supporting the need for a balanced national data privacy law.

I. Roles and Responsibilities


Data is a foundational aspect of the app ecosystem that cuts across the various roles occupied
by large and small companies. Regulations need to recognize that data plays a critical role in
the app ecosystem, and app developers often straddle the lines of many of these roles.

For example, Apple and Google collect various data points through the App Store and Google
Play, respectively, not just for themselves but also on behalf of developers. Is the store a
controller, a processor, or both? Developers might have little say in what data their store and
platform partners collect, but they benefit tremendously from the insights, metrics, and
dashboards those platforms provide. Similarly, developers collect data directly through their
apps. Their data collection may be integrated with other service providers that add value
through additional analytics and insights, including ones powered by AI.

Privacy regulations must accurately and objectively identify threats to consumer privacy so that they protect consumers from actual harms in the marketplace without regulating data practices that cause no real-world harm. No one's sensitive data, such as financial or banking information, Social Security numbers or other government identifiers, passwords, and health information, should be collected, used, or sold without express consent, and then only for a specific, disclosed, and agreed-to purpose. By contrast, the threats posed by collecting anonymous or pseudonymous data for advertising, reporting, insights, analytics, and other purposes are minimal, and such data should be regulated differently from sensitive data.
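To illustrate the distinction, here is a minimal sketch, in TypeScript with hypothetical names (the salt variable and event shape are not drawn from any real platform API), of how an app might pseudonymize an identifier before using it for analytics:

```typescript
import { createHmac } from "node:crypto";

// Hypothetical app-managed secret; illustrative only.
const SECRET_SALT = process.env.ANALYTICS_SALT ?? "rotate-this-secret";

// Derive a stable pseudonym from a raw user ID. A keyed hash (HMAC-SHA256)
// supports aggregate counting and reporting, but the raw ID cannot be
// recovered without the secret salt.
function pseudonymize(userId: string): string {
  return createHmac("sha256", SECRET_SALT).update(userId).digest("hex");
}

// Analytics events carry only the pseudonym, never the raw identifier.
const event = {
  user: pseudonymize("user-12345"),
  action: "screen_view",
  screen: "checkout",
};
console.log(event);
```

Data handled this way still powers reporting and insights, but it does not expose the identities that sensitive-data rules exist to protect.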


Regardless of the roles or size of the various actors, privacy laws must not overregulate data collection and use, or the consequences will be severe for small developers. For example, some privacy laws try to "carve out" small businesses, but lawmakers fail to understand that even the smallest businesses often collect more data than the law's minimum threshold and therefore must comply. Many smaller entities, like small app developers with only 100,000 users, rely heavily on digital partners like Google and Apple for much of their data collection and processing needs. If privacy laws overly restrict data or conflate the regulation of sensitive personal data with more basic data, small developers will be the ones hurt most. In this sense, carve-outs will not spare smaller businesses from the repercussions of a poorly constructed privacy law.


II. Personal Information, Transparency, and Consumer Rights


As stated above, defining the different tiers and types of data is critical. Many states have taken a thoughtful approach to clarifying different types of data. Virginia, for example, defines personal data as any information linked or reasonably linkable to an identified or identifiable individual, excluding de-identified data and publicly available information. "Sensitive data" means a category of personal data that includes:

  1. Personal data revealing racial or ethnic origin, religious beliefs, mental or physical
    health diagnosis, sexual orientation, or citizenship or immigration status;
  2. The processing of genetic or biometric data for the purpose of uniquely identifying a
    natural person;
  3. The personal data collected from a known child; or
  4. Precise geolocation data.

Generally, data controllers, defined as individuals or entities that determine the purpose and means of processing personal data, must:

  1. Disclose the purposes for which they collect and process data;
  2. Limit the collection of personal data to what is relevant and necessary for the
    disclosed purpose;
  3. Not process data beyond what was disclosed without the consumer's consent;
  4. Not process personal data for targeted advertising or sell personal data without
    consent; and
  5. Not process sensitive data without consent obtained through a carefully crafted and
    easily understood opt-in protocol.
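As a purely illustrative sketch (the types and helper names below are hypothetical, not drawn from any statute), the opt-in logic above reduces to a purpose-bound consent check:

```typescript
// All types and helpers here are hypothetical, for illustration only.
type Purpose = "core_service" | "analytics" | "targeted_advertising" | "sale";

interface ConsentRecord {
  purposes: Set<Purpose>; // purposes the consumer expressly opted in to
  grantedAt: Date;
}

// Decide whether a given processing activity is permitted. Non-sensitive
// data used to deliver the disclosed core service needs disclosure, not
// opt-in; sensitive data, targeted advertising, and sale all require an
// explicit, purpose-specific opt-in.
function mayProcess(
  consent: ConsentRecord | null,
  purpose: Purpose,
  isSensitive: boolean
): boolean {
  if (!isSensitive && purpose === "core_service") return true;
  return consent !== null && consent.purposes.has(purpose);
}

// Example: a consumer who opted in to analytics only.
const consent: ConsentRecord = {
  purposes: new Set<Purpose>(["analytics"]),
  grantedAt: new Date(),
};
console.log(mayProcess(consent, "analytics", true));             // true
console.log(mayProcess(consent, "targeted_advertising", false)); // false
```

The default-deny shape mirrors the framework: disclosure covers ordinary processing, while sensitive data, targeted advertising, and sale each require an explicit opt-in.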

This framework is a tried-and-true approach to consumer data privacy, with significant precedent in many U.S. states, including California, as well as in Europe. It requires controllers to disclose to consumers the purpose of data collection, limits processing to the disclosed purposes, and requires opt-ins for certain activities. This allows platforms to collect and process de-identified data for advertising, analytics, and other business purposes, gives consumers a way to block the collection of their personal data for targeted advertising, and sets a higher threshold for the sale of personal data and the collection of sensitive data.


This contrasts with more radical privacy frameworks, such as the one Maryland passed last year. These approaches go much further and limit data collection to the minimum amount of data reasonably necessary to deliver a requested product or service. Maryland's law severely limits app platforms' ability to collect data beyond a user's device type, operating system version, and possibly their language. Without this data, advertising, analytics, and other services would be impossible or prohibitively expensive to offer to app developers.


Regarding enforcement, it should remain with state and federal regulators and stay within the scope of the legislation that is passed. Too often, regulators expand their powers beyond the original legislation. In addition, private rights of action create too much temptation for overzealous trial lawyers to serially file suits against small businesses for non-material violations or violations that caused no harm. The precedents set by frivolous lawsuits under the Americans with Disabilities Act's private right of action and by serial patent litigators, both deeply familiar to app developers, have cost developers millions of dollars in legal fees and settlements.


III. Existing Privacy Frameworks & Protections


The existing privacy framework in the United States is untenable for developers. Apps are, by their very nature, national or global businesses. The specter of 50 different privacy laws, each with different requirements, carve-outs, and definitions, represents a compliance nightmare for developers. The cost of determining which laws they must comply with and which they need not can be crippling.


Moreover, some states are taking more radical approaches to data privacy laws that threaten
the app ecosystem.


Developers need a single, national data privacy law that preempts all state laws and unifies
compliance.


IV. Data Security


Developers take data security incredibly seriously. Data security is a cornerstone of any successful app in today's digital age. Privacy laws should require all data stewards to establish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data, appropriate to the volume and nature of the personal data collected.

Such practices should include limiting access to consumer data to only the employees or third-
party service providers that need access to such data to provide or improve a product or
service. Data is only as secure as the entities that hold it, and unnecessary data sharing can
lead to leaks, breaches, or hacks.
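As an illustration only (the role and dataset names below are hypothetical), this need-to-know principle often reduces in practice to a default-deny access map like the following sketch:

```typescript
// Role and dataset names are hypothetical, for illustration only.
type Role = "support_agent" | "analytics_vendor" | "marketing" | "engineer";

// Map each dataset to the roles with a product need for it.
const allowedRoles: Record<string, Set<Role>> = {
  purchase_history: new Set<Role>(["support_agent", "engineer"]),
  crash_reports: new Set<Role>(["engineer", "analytics_vendor"]),
};

// Default-deny: any role/dataset pair not explicitly granted is refused,
// so data is never shared more widely than the product requires.
function canAccess(role: Role, dataset: string): boolean {
  return allowedRoles[dataset]?.has(role) ?? false;
}

console.log(canAccess("marketing", "purchase_history")); // false
console.log(canAccess("engineer", "crash_reports"));     // true
```

Every entry added to such a map is another party that must be trusted to secure the data, which is why unnecessary grants should be refused by default.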

This also applies to government-mandated data sharing. Leading platforms like Google and
Apple invest significantly in data security. In recent years, legislative proposals and lawsuits
have tried to require these platforms to share data with competitors. These forced data-sharing
requirements go against common-sense privacy and security standards; the fewer entities that
hold your data, the more secure it is.

V. Artificial Intelligence

Data privacy and security rules should apply equally to AI and non-AI uses. A unified national
data privacy law is just as important to AI developers as to those who do not use AI in their
apps.

Some states, such as Colorado, have passed automated decision-making regulations that would significantly burden small AI developers. These regulations require companies that develop or deploy automated decision-making tools to conduct costly impact assessments to prove that their algorithms are not causing disparate impacts or furthering discrimination.

While no one should ever be unfairly discriminated against, discrimination is already illegal, and
this regulatory approach creates a costly compliance burden for small AI developers.

As with privacy, America needs a single regulatory framework for AI: one that takes a risk-based approach to the most powerful AI models, accounts for existing regulatory regimes such as anti-discrimination law and financial data controls like Gramm-Leach-Bliley, and regulates on an industry-specific basis. This ensures a balanced approach that protects against misuse of AI without overregulating the technology itself.

VI. Additional Information


In conclusion, developers need a national data privacy law that balances consumer protection with data's critically important role in the app and broader digital ecosystem. Radical data minimization provisions will hurt small developers, as will private rights of action that invite frivolous lawsuits against small businesses. Lastly, data is critical to the development of AI, and policymakers need to take a risk-based, industry-specific approach to regulating the emerging technology.

On behalf of the developers and the companies that employ them, the Developers Alliance
thanks Chairman Guthrie and Vice Chairman Joyce for the opportunity to respond to their
Privacy Working Group Request For Information.
