The February 2022 Developers Alliance US Policy Update
The Crisis In Ukraine
Developer hacktivists have taken the defense of Ukraine into their own hands, with the hacker collective Anonymous declaring cyberwar against the Russian government this week. Their efforts have already had a major impact within Russia, including seizing control of multiple websites belonging to state-run businesses and state-controlled news agencies.
Multiple social media platforms in the US and abroad have ramped up containment of disinformation as pro-Putin propaganda has been (and is expected to remain) in overdrive as the conflict intensifies. Companies across the tech space, as well as governments worldwide, are expected to add cybersecurity measures in response to the Russian government's escalating cyber attacks. While the Section 230 implications of this month are expected to be debated for years to come, the immediate concerns for the tech space revolve around safety measures to protect and aid fleeing and displaced Ukrainians.
Starlink is getting one of its first major public test runs, with founder Elon Musk heeding a call, made via tweet by Ukrainian Vice Prime Minister Mykhailo Fedorov, to activate service for the country in light of the ongoing war.
President Joe Biden announced his pick to replace Justice Stephen Breyer, who recently announced his retirement from the Supreme Court. The president nominated Ketanji Brown Jackson, who would be the first African American woman to serve on the nation’s highest court. The Court will weigh in on several tech-focused issues in the coming years, including data privacy, competition, and intellectual property. While Judge Jackson has weighed in on a number of tech cases in the past, the nuances of those rulings make it difficult to determine at this point how she may rule on future cases impacting the industry. She faces Senate confirmation in the coming weeks, when we look forward to hearing more about her judicial philosophy regarding rulings that may impact developers.
Congress has rapidly increased the number of legislative proposals targeting the digital economy and the app ecosystem. While several are well-intentioned, many of the bills demonstrate a fundamental lack of understanding of how the ecosystem works and how it could be harmed through indiscriminate changes.
The House Energy & Commerce Committee held a hearing entitled “Holding Big Tech Accountable: Legislation to Protect Online Users” addressing a series of bills aimed at policing tech companies. The discussion included the following bills:
H.R. 6416, the “Banning Surveillance Advertising Act of 2022”
Reps. Anna Eshoo (D-CA) and Jan Schakowsky (D-IL) introduced the “Banning Surveillance Advertising Act of 2022”. The bill seeks to “(prohibit) advertising networks and facilitators from using personal data to target advertisements, with the exception of broad location targeting to a recognized place, such as a municipality. The bill also prohibits advertisers from targeting ads based on protected class information, such as race, gender, and religion, and personal data purchased from data brokers.” While the bill allows apps to use contextual advertising methods, it raises significant concerns about how, and to what extent, developers would be able to monetize their apps.
H.R. 6580, the “Algorithmic Accountability Act of 2022”
Rep. Yvette Clarke (D-NY), in conjunction with Sens. Ron Wyden (D-OR) and Cory Booker (D-NJ), introduced the Algorithmic Accountability Act of 2022. The legislation was proposed in an effort to require transparency and accountability in the oversight of software, algorithms, and other automated systems. Proposed out of concern over how marginalized groups may be impacted by algorithms, the legislation “requires companies to conduct impact assessments for bias, effectiveness and other factors, when using automated decision systems to make critical decisions. It also creates…a public repository at the FTC of these systems, and adds 75 staff to the commission to enforce the law.” Developers Alliance agrees that poorly crafted algorithms can cause harm and that there should be standards and principles for the community. However, Congress mandating how developers write their code leaves much room for even greater problems and red tape that negatively impact developer-led companies and could result in even more harm to consumers.
H.R. 6755, the “Cooperation Among Police, Tech, and Users to Resist Exploitation Act”
Rep. Gus Bilirakis (R-FL) introduced the “Cooperation Among Police, Tech, and Users to Resist Exploitation Act”, or “CAPTURE Act”, to require reporting on, and assessment of, how social media companies currently communicate, consult, and coordinate with law enforcement to address illegal content and activity online and to reduce harms faced on platforms.
H.R. 6786, the “Increasing Consumers’ Education on Law Enforcement Resources Act”
Rep. Markwayne Mullin (R-OK) introduced the “Increasing Consumers’ Education on Law Enforcement Resources Act”. The legislation would require that the Federal Trade Commission conduct an education campaign to inform the public about the resources available when their safety and security have been violated online.
H.R. 6796, the “Digital Services Oversight and Safety Act of 2022”
Reps. Trahan, Schiff & Casten introduced the Digital Services Oversight and Safety Act, which intends to serve as “comprehensive transparency legislation” guaranteeing the government “the authority and resources necessary to hold powerful online companies accountable for the promises they make to users, parents, advertisers, and enforcers.” The legislation seeks to do this by establishing a Bureau of Digital Services Oversight and Safety at the Federal Trade Commission (FTC).
On the Senate side, Sens. Blumenthal (D-CT) and Blackburn (R-TN) introduced the Kids Online Safety Act. The bill, which was drafted in response to whistleblower Frances Haugen’s testimony, would:
“Require that social media platforms provide minors with options to protect their information, disable addictive product features, and opt-out of algorithmic recommendations. Platforms would be required to enable the strongest settings by default.
Give parents new controls to help support their children and identify harmful behaviors, and provides parents and children with a dedicated channel to report harm to kids to the platform.
Create a responsibility for social media platforms to prevent and mitigate harms to minors, such as promotion of self-harm, suicide, eating disorders, substance abuse, sexual exploitation, and unlawful products for minors (e.g. gambling and alcohol).
Require social media platforms to perform an annual independent audit that assesses the risks to minors, their compliance with this legislation, and whether the platform is taking meaningful steps to prevent those harms.
Provide academic and public interest organizations with access to critical datasets from social media platforms to foster research regarding harms to the safety and well-being of minors.”
The EARN IT Act of 2022 (S.3538) and Open App Markets Act (S. 2710) moved forward out of the Senate Judiciary Committee this month. The EARN IT Act revises Section 230 of the Communications Decency Act, the law which shields developers from liability for third-party content. If enacted, EARN IT would allow civil and criminal lawsuits against internet companies that advertise, promote, present, distribute or solicit any material depicting the sexual abuse of children. While the most recent version of EARN IT allows for companies to have end-to-end encryption for protecting the privacy and security of content, the bill would allow for information about how and whether companies offer encryption to be used as evidence against developer-led companies in court, thus incentivizing platforms to avoid encrypted products due to liability. Further Developers Alliance commentary on the bill can be found here.
The Open App Markets Act sets prohibitions on app store use and payments. While it is being sold to other elected officials as a bill meant to promote competition within the app ecosystem, the nondiscrimination provisions in the proposed legislation could, in reality, destroy the third-party app model. You can read more about Developers Alliance’s take on the Open App Markets Act here.
At the state level, versions of the Open App Markets Act are popping up in a variety of jurisdictions, including Illinois, Arizona, Minnesota, and Rhode Island. While the federal Open App Markets Act would preempt any state legislation, Developers Alliance will remain active in fighting legislation harmful to developers at both the state and federal levels.
Senators Cynthia Lummis (R-WY) and Amy Klobuchar (D-MN) introduced the Nudging Users to Drive Good Experiences on Social Media Act (the “NUDGE Act”), a bill designed to “establish studies to examine and recommend interventions to reduce addiction and the amplification of harmful content on social media platforms”. The legislation would direct the National Science Foundation to conduct a study identifying content-neutral interventions (e.g., asking users if they want to read an article before sharing it) in an effort to reduce social media addiction and the spread of harmful content. The FTC would then use these studies to conduct a rulemaking on how to apply the findings to platforms, and they would serve as the basis of new legislation against platforms deemed to be using unfair or deceptive practices.