The May 2022 EU & UK Policy Update.
The European Commission Proposes Controversial CSAM Regulation
The European Commission has proposed a regulation laying down rules to prevent and combat child sexual abuse. The proposed rules include the following obligations for providers of hosting services and interpersonal communication services:
- Mandatory risk assessment and risk mitigation measures against the dissemination of child sexual abuse material (CSAM) or the solicitation of children (grooming).
- Targeted detection obligations, based on a detection order issued by a court or an independent national authority.
- Content detection using indicators of child sexual abuse verified and provided by a specialized EU Centre which will be set up based on the regulation. “Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible”. One of the recitals provides the following explanation with regard to compliance with the obligations: “In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivizing the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children.”
- Removal of content following removal orders, including disabling “access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.”
- Obligations for app stores to ensure that children cannot download apps that may expose them to grooming.
- Reporting obligations.
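The indicator-based detection the proposal describes can be pictured, very roughly, as matching content against a list of verified indicators distributed by the proposed EU Centre. The sketch below is purely illustrative: the regulation does not prescribe any technology, the indicator values are made up, and real systems typically use perceptual rather than exact cryptographic hashes.

```python
import hashlib

# Hypothetical indicator set: hex digests that a central authority (the
# proposed EU Centre) would distribute to providers. The sample value is
# invented for this example.
INDICATORS = {
    hashlib.sha256(b"known-abusive-sample").hexdigest(),
}

def matches_indicator(content: bytes) -> bool:
    """Exact-match check of content against the distributed indicator set."""
    return hashlib.sha256(content).hexdigest() in INDICATORS

print(matches_indicator(b"known-abusive-sample"))   # True
print(matches_indicator(b"ordinary holiday photo")) # False
```

Exact hashing only flags byte-identical copies; the "least privacy-intrusive, lowest false-positive" requirement in the text is precisely about how far beyond such simple matching (e.g., into perceptual hashing or classifiers) providers would need to go.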
The first reactions to the proposal were extremely critical. Germany’s federal digital minister issued a press release stating that “the general control of chat histories and bypassing encryption go too far”. Will Cathcart, Head of WhatsApp, tweeted about his disappointment “to see a proposed EU regulation on the internet fail to protect end-to-end encryption”. Privacy activists have also warned against the high risks of mandatory chat control, arguing that the law would require all providers of email, messenger and chat services to engage in general monitoring and would undermine secure end-to-end encryption.
The proposal was published together with a new EU Strategy for a better internet for kids (BIK+) and a Compendium of EU formal texts concerning children in the digital world. The European Parliament and the Council will now have to agree on the regulation. Stakeholders are invited to provide their feedback in a consultation open until Jul 26, 2022.
EU Ramps Up Diplomacy with the US
A European Parliament delegation has visited Silicon Valley to meet with tech companies, local authorities and academia. MEPs from the Internal Market and Consumer Protection Committee (IMCO) delegation traveled to the U.S. from 23 to 27 May, and met representatives of Google, Meta, Apple, Airbnb, eBay, Paypal, Uber, Salesforce, the non-profit Electronic Frontier Foundation, Cloudflare, AT&T, Stanford University’s Center for Internet and Society, game developers, HP Inc and Argo AI. The EU delegation discussed the latest regulatory developments, including the recently agreed EU rules on online platforms – the Digital Services Act (DSA) and Digital Markets Act (DMA). The MEPs told the US companies that “EU’s comprehensive re-structuring of the digital economy was sorely needed” and expressed the hope that it “will set an example and create a demand in other jurisdictions worldwide.”
The EU will soon set up an office in Silicon Valley, as confirmed during a hearing in the IMCO Committee by the future Commission’s representative, Gerard de Graaf (currently Director in DG CONNECT, overseeing the main digital policies).
Another European Parliament delegation has visited Washington, DC, to discuss issues related to data protection, AI and security. The Civil Liberties Committee delegation held meetings with representatives of the U.S. Congress and the U.S. Administration, including the Departments of State, Justice, Homeland Security and Commerce, the Federal Trade Commission (FTC), the Federal Bureau of Investigation (FBI), and the Privacy and Civil Liberties Oversight Board (PCLOB). They also met with various think-tank and NGO representatives.
The Trade and Technology Council (TTC) has held its second ministerial meeting in Saclay, France. With regard to digital regulation, the EU and U.S. agreed to strengthen cooperation on key aspects of platform governance and to develop:
- a Strategic Standardization Information (SSI) mechanism to promote and defend common interests in international standardization activities. Both sides will work to foster the development of aligned and interoperable technical standards in areas of shared strategic interest such as AI, additive manufacturing, recycling of materials, and the Internet of Things;
- a joint roadmap on evaluation and measurement tools for trustworthy AI and risk management.
One of the thematic TTC working groups has published Cybersecurity High-Level Guidelines for SMEs.
Data protection, privacy & security
The UK Government is seeking feedback from the tech industry on a voluntary Code of Practice for app store operators and developers. The code would be the first such measure in the world, setting out baseline security and privacy requirements for developers and operators of app stores for smartphones, game consoles, TVs and other smart devices.
The UK’s National Cyber Security Centre (NCSC) has published a Threat Report on application stores, describing the risks users face “because of fraudulent apps containing malicious malware created by cyber criminals or poorly developed apps which can be compromised by hackers exploiting weaknesses in software”. The proposed code of practice also follows a government review of the app store ecosystem, conducted from December 2020 to March 2022, “which found some developers are not following best practice in developing apps, while well-known app stores do not share clear security requirements with developers”.
The Spanish Data Protection Agency (AEPD) has fined Google €10 million for wrongfully transferring user data to third parties and hindering users’ content-removal rights. The AEPD found that Google, when sending data to Lumen (a database that collects requests to remove material from the web), had not given users the option to opt out of sharing their data with Lumen, and therefore did not obtain the appropriate consent before transferring the data. The data included requests made by citizens, their identification details, email addresses, the alleged reasons for the requests and the URLs claimed.
Member of the European Parliament and digital freedom fighter Patrick Breyer (Pirate Party) filed an action for an injunction against Meta at a German district court. As a user of “Facebook Messenger”, Breyer is suing “against the suspicionless automated search of private chat histories and photos”. He also noted in the announcement “that while the automated searches of personal messages and chats is so far only practiced by major US providers, the EU Commission is to propose tomorrow to make this mandatory for all providers of e-mail, messenger and chat services,” with reference to the proposal for a regulation on combating CSAM.
A research service within the French Government has published a report on age verification. It notes that “virtually no online service has a satisfactory process for verifying the age of its users,” and that, despite the variety of methods available, few are at once easy to implement, minimally restrictive, respectful of user privacy, efficient, and robust against fraud attempts. The report proposes the development of “an experimental solution for transmission of proof of age by 3rd parties and interoperable with several verification methods.”
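The third-party transmission model the report proposes can be sketched minimally: a trusted verification provider issues a signed attestation carrying only an age claim, and the online service checks the signature without learning the user’s identity. Everything below is an illustrative assumption, not the report’s design; a real deployment would use asymmetric keys and PKI rather than the shared HMAC secret used here for brevity.

```python
import base64
import hashlib
import hmac
import json

# Hypothetical shared key between the age-verification provider and the
# relying service; in practice this would be an asymmetric key pair.
PROVIDER_KEY = b"demo-shared-secret"

def issue_attestation(over_18: bool) -> str:
    """Provider side: sign a claim that carries only the age flag."""
    claim = json.dumps({"over_18": over_18}).encode()
    sig = hmac.new(PROVIDER_KEY, claim, hashlib.sha256).hexdigest()
    return base64.b64encode(claim).decode() + "." + sig

def verify_attestation(token: str) -> bool:
    """Service side: accept only an untampered token with a true age claim."""
    payload, sig = token.rsplit(".", 1)
    claim = base64.b64decode(payload)
    expected = hmac.new(PROVIDER_KEY, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or altered token
    return json.loads(claim)["over_18"]

print(verify_attestation(issue_attestation(True)))  # True
```

The interoperability goal in the report maps to the token format: several verification methods (document check, bank ID, etc.) could all issue the same minimal signed claim.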
Consumer protection
The European Commission has published a report on dark patterns (“Behavioural study on unfair commercial practices in the digital environment: Dark patterns and manipulative personalisation”). According to the research conducted for this study, dark patterns are prevalent and increasingly used by traders of all sizes, not only large platforms. A mystery shopping exercise has revealed that 97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern and the most prevalent were hidden information/false hierarchy, preselection, nagging, difficult cancellations, and forced registration. The prevalence of dark patterns varies between different types of websites and apps (e.g. countdown timers or limited time messages are quite prevalent on e-commerce platforms, while the use of nagging is more customary in health and fitness websites/apps).
The Norwegian Consumer Council (NCC), together with more than 20 consumer groups from 18 European countries, has launched a coordinated action asking authorities to ban loot boxes in games. The consumer groups call on the national and EU authorities to prioritize regulatory investigations and interventions, as “despite being a major industry, the video game sector has largely evaded regulatory scrutiny.” They suggest a number of measures, including a ban on deceptive design, extra protections for minors, and transactional transparency. NCC has published the report “Insert Coin: How the gaming industry exploits consumers using loot boxes,” which presents problematic practices, such as exploiting cognitive biases and vulnerabilities through deceptive design and marketing, using layers of virtual currencies to mask or distort real-world monetary costs, and targeting loot boxes and manipulative practices towards minors.
The European Commission has started a “digital fairness fitness check” on consumer law. It is assessing whether additional measures are needed “to ensure an equal level of fairness online and offline.” The evaluation targets the following pieces of legislation: the Unfair Commercial Practices Directive 2005/29/EC, the Consumer Rights Directive 2011/83/EU and the Unfair Contract Terms Directive 93/13/EEC. Stakeholders can provide feedback until Jun 14, 2022.
Competition
The UK Competition and Markets Authority (CMA) has opened an investigation over Google’s potential abuse of dominance in the ad tech market. This is the second investigation into Google’s ad tech activities, following a launch of a probe into Google and Meta’s ‘Jedi Blue’ agreement in March. The European Commission launched a similar investigation in June last year. The CMA is also monitoring compliance with commitments Google made in relation to its Privacy Sandbox proposals to remove third-party cookies and other functionality from Google’s Chrome browser.
The German Competition Authority (Bundeskartellamt) has designated Meta as a company with “paramount significance for competition across markets”. The decision is based on the updated competition law (Section 19a of the German Competition Act) and enables the Bundeskartellamt to investigate and prohibit practices of large digital companies in a swift procedure. Alphabet/Google was the first company designated with this status, at the beginning of this year.
The Dutch Authority for Consumers and Markets (ACM) has started drafting guidelines for the application of the Platform-to-Business Regulation (P2B Regulation), following the conclusions of a market study. ACM aims to publish the guidelines for public consultation this fall. These will detail how ACM, as the regulator enforcing the P2B Regulation, will interpret what platforms need to do, and will ensure “that their business customers know what their rights are.”
Content regulation
Developers Alliance has joined other associations in Brussels representing a wide range of sectors impacted by the Digital Services Act (DSA) in expressing concerns about the reappearance of a ‘stay-down’ provision in the latest compromise text. Initially aimed at clarifying the concept of a ban on a general monitoring obligation, the provision can be interpreted as imposing constant monitoring of online activities. This would have significant consequences not only for companies and online ecosystems, but for society at large. The changes were made during technical drafting, after the political agreement reached last month. The final text of the DSA is yet to be submitted for formal adoption.
The French regulator for audiovisual and digital communications (ARCOM) has opened a public consultation on access to data from online platforms for research purposes (also available in English). The results are expected to contribute to a framework for accessing data from online platforms for research purposes and in relation to the issues over which the ARCOM has jurisdiction: combating information manipulation and online hate. The consultation is also targeting the implementation of the relevant rules of the DSA.
Artificial intelligence
The UK Office for Product Safety and Standards (OPSS) has published a study on the impact of Artificial Intelligence on product safety. Developers Alliance was one of the stakeholders invited to contribute to this important research, which will inform policymakers’ work on the implementation of the UK’s National AI Strategy.
AI developers operating in the EU market who are interested in seeing how the upcoming EU AI Act might affect their work should check Meta’s Open Loop program. This policy prototyping program invites startups to test parts of the EU AI Act first-hand, helping to assess its clarity and technical feasibility. Participating companies will get to:
- learn about policy experimentation by implementing it in practice;
- influence and shape the policy debate, making a concrete impact on the regulation;
- attend exclusive guest talks and Q&A sessions with Meta’s PyTorch and AI Research teams on the latest AI/ML developments and innovations;
- become part of the vibrant Open Loop community, with startups from all over the world;
- play an active role in workshops and have their contribution acknowledged publicly in the reports.
On top of that, selected companies that will take part in a second phase of the program will receive ads credits on Meta’s platforms.
Miscellaneous
Belgium intends to implement a tax shelter for the gaming industry as soon as next year, De Tijd reports. The measure responds to the need for more investment in this sector and will essentially extend the current support scheme for the audiovisual sector. The expansion into games, which aims to boost the production of ‘video games with a cultural dimension’, was enacted in 2019, but the European Commission raised objections. To address the Commission’s concerns, the law will require companies to demonstrate that more than 50% of their revenues result from the production and operation of video games and that they fulfill the cultural-dimension criteria, as verified by the competent authority (the so-called “cultural test”). In line with the Commission’s observations, the qualifying revenues are not limited to the Belgian territory.
Apple announced that more than 300 French developers have participated in the App Store Foundations Program. The program, launched four years ago, offers selected developers additional support and training to build better apps, including a tailor-made professional development plan. Since the beginning of this year, Apple has been extending the program across Europe, targeting 29 countries.