Position On The Opinion Of The German Data Ethics Commission (DEK)

Note: (as yet) this position is based on the Executive Summary (English translation). It focuses mainly on the approach and on those proposals which target an EU-level intervention.

How should we look at AI? Is a worldview that divides everything into either human or technology still fit for purpose, when AI is a little of both?

All the DEK’s recommendations and arguments start from this credo, inspired by the German Constitution: that “technology should serve humans rather than humans being subservient to technology” can be taken as incontrovertible fact.

The principle is perfectly fit for the current stage of technological development, including machine learning and specialized AI. It remains to be assessed whether its appropriateness will change as we get closer to generalized AI. The proposals do not appear to be intended as future-proof.

As many scientists agree and numerous studies show, AI is different from human intelligence; it is not supposed to replace human intelligence, but to augment it, and in a considerable way. Considering the exceptional economic and social benefits of AI, humans should embrace it. In the future, both types of intelligence could co-exist and work collaboratively, without an (artificial) subordination being imposed.

General Remarks  

Our analysis will follow the structure of the Executive Summary. 
Firstly, here are some general remarks on its main premises.

A complicated legal approach. Data as separate and independent of AI.

The approach is mainly legalistic, as DEK’s mission is to provide ethical and legal guidelines. The DEK Opinion takes two perspectives: data and algorithmic systems (in a broad sense), apparently with the intention of addressing different issues. The two separate approaches could thus be seen as two streams of regulatory action, adding layers of laws and different sorts of rules that in practice affect the same products and services.

A horizontal scope – not limited to AI, but covering all software.

DEK considers that “AI is merely one among many possible variants of an algorithmic system, and has much in common with other such systems in terms of the ethical and legal questions it raises.” According to the description at the beginning of Section 3, algorithmic decision systems are envisaged, but this does not exclude the recommendations (especially the regulatory ones) from being applied in a general manner.

It can also be assumed that the recommendations cover the whole software spectrum, without distinguishing between the specifics of different types of software.

Innovation vs. Precautionary principle.

The DEK approach is also clearly oriented toward the precautionary principle, even though the text contains the following statement, which reflects the innovation principle: “(…) regulation must not unduly inhibit technological and social innovation and dynamic market growth. Overly rigid laws that attempt to regulate every last detail of a situation may place a stranglehold on progress and increase red tape (…)”

The concrete recommendations contradict this stated vision.

A risk-based approach is salutary, but precaution should go hand in hand with innovation; one does not exclude the other. In this way technological progress can be guaranteed, together with proper protection of citizens, society and the economy.

Comprehensive framework vs. adapting existing laws.

Many aims can be achieved in various ways, not only through a centralized, horizontal and prescriptive regulatory approach. Working together with the industry is essential, especially when it comes to advanced technological solutions such as AI. It could prove difficult to ensure compliant behaviour and to attain the proposed objectives when the rules are not fit for purpose and proportionate.

Although a “synergistic use of various governance instruments at different levels (multi-level governance)” is seen as “vital in view of the complexity and dynamism of data ecosystems”, this approach is not reflected in the concrete recommendations.

In a scenario where a comprehensive legal framework is adopted (including at EU level) following these recommendations, it is unclear how this would provide a proper environment for European businesses to develop new technologies, especially AI. Creating a prohibitive environment will lead to a situation where European citizens are deprived of the benefits of technological progress. Before initiating any legislative proposal, German and EU policymakers should focus on in-depth impact assessments.

A protectionist approach.

One of the objectives, “strengthening the digital sovereignty of both Germany and Europe”, which is consistently pursued throughout the report, indicates the protectionist nature of the proposed measures. Due to the global nature of technology businesses, this obviously represents a flawed strategy.

Protection of democratic values, rights and freedoms could be ensured through an “updated” approach which embraces technology and is thus fit for the dawn of the age of artificial intelligence. Progress is entirely transforming society, in all its aspects. Technology is the driving force behind this transformation and the tool that enhances humans. A narrow and protectionist vision will slow down the EU’s progress.

Main Problematic Aspects Of The Section On Data

The scope of the definitions.

Even though the DEK Opinion takes a legalistic approach, it does not provide a definition of the notion of ‘data’ to which the recommendations refer. This is important, as certain suggested legal amendments could have a significant impact on businesses.

Determining the area of application of the proposed measures is essential, especially when it comes to non-personal data. Certain recommendations on access to non-personal data, for consumers but also for companies, are problematic not only from this perspective, but also when it comes to placing these new rights and the corresponding obligations within the current framework of civil and commercial law.

The absence of such a definition raises numerous questions regarding certain proposed measures.

For example, recommendation no. 28 refers to “the data accumulated in existing value creation systems”. The aim of this recommendation is to promote model contracts for data sharing. Without a proper delineation between personal and non-personal data, data and information, and sensitive and non-sensitive data, it is impossible to establish the appropriate legal requirements and limitations applicable to such commercial transactions (such as trade secrets).

The perishable character of data does not seem to have been considered in DEK’s analysis. This important aspect is missing from the proposed general standards for data governance and is a major flaw in the argumentation. Taking this characteristic of data into consideration changes the perspective on the importance of data in certain situations.

Ownership vs. Rights.

Recommendation no. 5 is highly welcome, as it accurately identifies the nature of data and advises against the notion of ‘data ownership’ and any other copyright-like rights.

Nevertheless, DEK argues that data should not be referred to as ‘counter-performance’ provided in exchange for a service. Recommendation no. 6 refers to offering consumers alternatives to releasing their data for commercial use (such as pay options). The recommendation is firm and ignores any future legal clarification of this issue by the Court of Justice. Given the EU legal system, such an approach is inadvisable, as all Member States will have to follow the interpretation of the Court of Justice should it clearly consider as unacceptable under the GDPR the “bundling” of consent with acceptance of terms or conditions, or the “tying” of the provision of a contract or a service to a request for consent to process personal data that are not necessary for the performance of that contract or service.

DEK considers that “the right to digital self-determination in the data society also applies to companies and legal entities and – at least to some extent – to groups of persons (collectives).” This represents a complex legal institution which deserves ample debate before being enacted.

The real objective of certain recommendations.

Many recommendations propose amendments to existing legislation, at national and EU level. Recommendation no. 2 provides a clear view of the underlying logic. While it recognises the obvious fact that “data protection law as well as other branches of the legal system (including general private law and unfair commercial practices law) already provide for a range of instruments that can be used to prevent such ethically indefensible uses of data”, the lack of enforcement of these instruments is deplored, especially against “market giants”.

When laws are not enforced, an assessment is usually carried out to find the reasons for the implementation failure, and then the necessary measures are adopted (from issuing guidelines and giving the responsible authorities the right tools, to adjusting the rules to reality). New rules are (or should be) a last resort. Adopting additional rules in such a situation, especially rules with economic impact, can only be classified as an unwise approach by the respective government.

But the recommendations reveal another aim, linked to one of the guiding objectives of the document (“the digital sovereignty of both Germany and Europe”). It can be understood as a rent-seeking use of horizontal regulation: not to achieve common-sense objectives (such as protecting the general public interest), but to intervene unfairly in the market and to use rules that apply to all players in the market, big or small, as a “protectionist tool”.

Pursuing a protectionist approach may bring, in the short term, certain benefits (usually economic ones) for certain countries or parts of their economies. But in the medium and long term it affects (sometimes irreversibly!) the market players that were not the “target” of such measures, a kind of “boomerang effect”. Many of the recommendations that follow this approach convey the impression of an unfriendly business environment in the EU. Of course, potential political decisions in this direction are worrisome for the developer community.

Data sharing vs. privacy and competition policy rules.

Along the same lines, recommendations no. 27 and 29, on data sharing schemes, try to find innovative solutions for enabling companies to share and jointly use data, pushing the limits of competition law and altering the national civil law framework. Such B2B data sharing initiatives fall under the scrutiny of antitrust rules, so as to avoid any risk of collusion. Strictly from the EU perspective, the principles enshrined in Article 101 TFEU oblige policymakers to be extremely cautious when considering these recommendations.

Regarding business-to-government (B2G) relations, DEK seems more circumspect, but keeps the protectionist stance and does not exclude statutory data access rights, recommending that the supporting impact assessments take into consideration the distribution of market power and the strategic interests of German and European companies compared to those of companies in third countries.

Recommendation no. 22, on data portability, proposes a prudent approach to expanding the scope of Article 20 GDPR, based on a surprisingly judicious attitude which, unfortunately, is not reflected in a large part of the document.

The next recommendation, no. 23, proposes interoperability or interconnectivity obligations for certain sectors, such as messenger services and social networks, “designed on an asymmetric basis, i.e. the stringency of the regulation should increase in step with the company’s market share”. Such a measure could be difficult to implement due to technical limitations and requires a careful assessment, including of its effects on the targeted markets.

A failed test: proportionality and future-proofing.

Certain proposed measures ignore the need for a careful assessment of the technological possibilities and future developments, as well as the potential impact of irreversible measures that could preclude essential objectives such as the protection of the public interest and public security. An example is the proposed prohibition of de-anonymising anonymised data (recommendation no. 20), a strict measure (backed by criminal penalties) intended to provide absolute protection of data.

Of great concern is also the outlook behind recommendation no. 21 (regarding data management and data trust schemes), which suggests an intervention at EU level. While the general objective of such regulatory intervention seems reasonable, the suggested course raises questions about imposing stringent solutions which have the potential to limit economic freedom/freedom of contract and consumer choice. Also, technology-neutral regulation, which is key to protecting and promoting innovation, should always be the legislator’s first choice.

The necessity of performing ex-ante impact assessments seems to be completely disregarded. Weighing the cost of the regulatory burden on economic operators against the regulatory objectives is essential for a sound political decision; it allows policymakers to explore alternative ways of achieving the proposed objectives and thus to avoid red tape. Recommendation no. 13, on the introduction of standardised icons for consumer devices, is an illustrative example of a measure which undoubtedly requires a rigorous ex-ante impact assessment in order to determine its pertinence.

Main Problematic Aspects Of The Section On Algorithmic Systems

Speculation vs. observation and measurement.

DEK’s perspective on algorithmic systems focuses (narrowly) on output and human involvement. The risk-adapted regulatory approach is a clear translation of the precautionary principle, which is applied exclusively throughout.

Horizontal vs. vertical regulation.

The one-size-fits-all approach is strikingly ignorant of the complexity of the subject matter, notwithstanding DEK’s undoubted technical knowledge (as substantiated by the classifications and descriptions used in the document).

DEK recommends a general five-layered control system, implemented by a mandatory labelling scheme for all algorithmic systems, which should be imposed by an EU Regulation (Regulation on Algorithmic Systems, EU-ASR). 

It is envisaged as a horizontal EU legal framework, containing prescriptive rules on the design and use of algorithmic systems, transparency, users’ rights and obligations, and supervisory institutions and structures. Recommendations no. 36-45 are the most relevant in this sense.

Recommendation no. 43 also indicates that the horizontal framework is foreseen to be accompanied by sectoral instruments, at both EU and Member State level.

Additionally, further measures are foreseen to complement the horizontal and sectoral legislation, such as an Algorithmic Accountability Code (recommendation no. 59) and standardization measures (a “quality seal”, recommendation no. 60; recommendation no. 63). A specific binding normative framework for media is also suggested, establishing a licensing procedure for online platforms (recommendations no. 65-66). These recommendations, too, envisage action at EU level.

A clash with existing laws and legal uncertainty.

DEK considers the correlation (which could be seen more as a collision) of the proposed Regulation with the GDPR, proposing significant amendments to the GDPR (e.g. Article 22). Recommendation no. 53 urges an expansion of the scope of anti-discrimination legislation.

Algorithmic systems are currently subject to various legal regimes. Given the ample scope of this new horizontal framework and its intricate impact, a comprehensive assessment of its effects on existing EU legislation is imperative, but DEK’s Opinion does not address this prerequisite. The existing legal norms governing the products, services and issues falling under the scope of the proposed Regulation already provide the necessary regulatory response for many of the envisaged hazardous situations. Many challenges around safety are already addressed by the legal norms and standards applicable to the systems engineering field. Of course, as technology and society advance over time, legislators should keep the law up to date where current norms have become obsolete or are no longer fit to attain their objectives. But lawmakers’ interventions should always be fit for purpose and coherent.

Even if the horizontal Regulation is foreseen as ‘lex specialis’, prevailing over any other relevant provisions, there is a high risk of general legal unclarity and of undermining the legal certainty currently offered by the existing legal framework, affecting all economic sectors.

The proposed approach, covering the whole spectrum of algorithmic systems (from the simplest to the most advanced AI) used across every part of the economy and society and at all levels, entails an indisputable risk of legal disarray.

A more judicious approach.

The Expert Group on Liability and New Technologies – New Technologies Formation, set up by the European Commission, recently presented a report entitled “Liability for Artificial Intelligence and other emerging digital technologies”, which compares various aspects of existing liability regimes in the EU, based on an analysis of the relevant national laws and specific use cases.

One of the main findings of the report is the following: 

“It is therefore necessary to consider adaptations and amendments to existing liability regimes, bearing in mind that, given the diversity of emerging digital technologies and the correspondingly diverse range of risks these may pose, it is impossible to come up with a single solution suitable for the entire spectrum of risks.”

Bitter Conclusion 

In light of the above, it does not seem worthwhile to explore the recommendations on algorithmic systems any further.

Should this be the ‘European Path’? 

What is the purpose of this work?

Protect EU citizens from imagined harm?

DEK assumes that any technology based on algorithmic systems is potentially harmful and that strict rules should therefore be enforced to prevent any possible harm that may occur, be it more or less evident or merely presumed. How are these measures going to ensure that European citizens, and European society in general, benefit from all the technological solutions developed at global level? How would they be able to enjoy progress in the Age of AI?

Promote EU business success globally?

It is not clear how these measures are going to help European digital (and not only digital!) companies grow and be competitive in a global market. Also, the objective of ensuring and fostering digital innovation in Europe, and in particular AI development, seems to be missed, as the approach is driven to the extreme by the precautionary principle.

Protectionist/create “EU champions”?

This objective is clearly stated.

Establish ethics for industry inside and outside the EU?

Any technological solution is embedded with the values of its creator. Software developers are a diverse workforce and their work is global. These two essential aspects are overlooked by DEK. How is this mix-up going to help the EU stay competitive at a global level? It is self-evident that some parts of the world will continue to develop technologies according to their own vision and will never “adopt” the European one. And certain technological solutions which it would not be possible to develop in the EU will be available on other markets. In such a scenario, the negative impact on the European developer community will be significant. But the ultimate repercussion will be on European consumers.
