This book celebrates the 40th anniversary of the creation of the CRID and the 10th anniversary of its successor, the CRIDS. It gathers twenty-one very high quality contributions on extremely interesting and topical aspects of data protection. The authors come from Europe as well as from the United States of America and Canada. Their contributions have been grouped as follows:
1° ICT Governance;
2° Commodification & Competition;
3° Secret surveillance;
4° Whistleblowing;
5° Social Media, Web Archiving & Journalism;
6° Automated individual decision-making;
7° Data Security;
8° Privacy by design;
9° Health, AI, Scientific Research & Post-Mortem Privacy.
This book is intended for all academics, researchers, students and practitioners who have an interest in privacy and data protection.
You can read this e-book in Legimi apps or in any app that supports the following format:
Page count: 864
Year of publication: 2022
For information on our catalogue and on new titles in your field of specialisation, please visit our websites at www.larcier.com.
© Lefebvre Sarrut Belgium SA, 2021
Éditions Larcier
Rue Haute, 139/6 – 1000 Bruxelles
All rights reserved for all countries.
Except with the prior written consent of the publisher, it is prohibited to reproduce this work, in whole or in part (including by photocopying), to store it in a database or to communicate it to the public, in any form and by any means whatsoever.
ISBN 9782807933477
The Collection du CRIDS (formerly the « Cahiers du CRID ») aims to disseminate scientific studies and research in the field of information technology law (IT contracts, electronic commerce, intellectual property, protection of personal data and fundamental freedoms, regulation of electronic communications, etc.).
Each volume deals with a research theme whose theoretical and practical aspects are developed by one or more specialists in the field. The CRIDS thus hopes to offer both researchers and practitioners in search of analysis and information a clear and comprehensive reference work in the field of law and digital technology.
Scientific committee:
Herbert Burkert (Professor, University of St. Gallen, Switzerland) – Santiago Cavanillas (Professor, University of the Balearic Islands) – Jos Dumortier (Professor, K.U. Leuven) – Yves Poullet (Professor, UNamur) – André Prüm (Professor, University of Luxembourg) – Pierre Trudel (Professor, Université de Montréal) – Michel Vivant (Professor, Institut d’Etudes Politiques de Paris).
Editorial board:
Alexandre de Streel (Professor at UNamur, Director of the CRIDS) – Cécile de Terwangne (Professor at UNamur, CRIDS) – Florence George (Lecturer at UNamur, CRIDS, attorney) – Benoît Michaux (Lecturer at UNamur, CRIDS, attorney) – Robert Queck (Senior Lecturer at UNamur, CRIDS).
Series editor:
Hervé Jacquemin
Professor at UNamur (CRIDS)
Member of the Brussels Bar
Deputy series editor:
Jean-Marc Van Gyseghem
Director of Research at the CRIDS
Member of the Brussels Bar
Secretariat:
Stéphanie Henry
Faculty of Law – UNamur
Rempart de la Vierge, 5 – 5000 Namur
Tel.: (32) 81 72 47 93 – Fax: (32) 81 72 52 02
Introduction
CHAPTER 1. ICT Governance
The European Data Protection Regulation and Information Governance
Herbert Burkert
The European Group on Ethics in Science and New Technologies and Data Protection in the EU
Herman Nys
CHAPTER 2. Commodification and Competition
Paying with Personal Data: Between Consumer and Data Protection Law
Antoine Delforge
The GDPR: A Shield to a Competition Authority’s Data Sharing Remedy?
Thomas Tombal
CHAPTER 3. Secret Surveillance
The Half-Way Revolution of the European Court of Human Rights or the ‘Minimum’ Requirements of ‘Law’
Bart Van Der Sloot
CHAPTER 4. Whistleblowing
Whistleblowing: Threat or Safeguard for Data Protection in the Digital Era?
Amélie Lachapelle
CHAPTER 5. Social Media, Web Archiving & Journalism
To Scrape or Not to Scrape? The Lawfulness of Social Media Crawling under the GDPR
Catherine Altobelli, Nikolaus Forgó, Emily Johnson & Antoni Napieralski
Web Archiving in the Public Interest from a Data Protection Perspective
Alejandra Michel
Processing of Personal Data for “Journalistic Purposes”
Cécile de Terwangne & Alejandra Michel
CHAPTER 6. Automated Individual Decision-Making
The GDPR and Automated Individual Decision-Making: Fair Processing v. Fair Result
Manon Knockaert
CHAPTER 7. Data Security
Risk as the Cornerstone of Information Security and Data Protection
Jean-Noël Colin
How to Deal with the Human Factor in Information Security?
Charlotte Durieux, Alain Ejzyn & Anne Rousseau
“Technical and Organisational Measures” – A Systematic Analysis of Required Data Protection Measures in the GDPR
Dag Wiese Schartum
CHAPTER 8. Privacy by Design
Privacy-by-Design in Intelligent Infrastructures
Manon Knockaert, Maryline Laurent, Lukas Malina, Raimundas Matulevičius, Marinella Petrocchi, Mari Seeba, Qiang Tang, Aimilia Tasidou & Jake Tom
CHAPTER 9. Health, AI, Scientific Research & Post-Mortem Privacy
Health Care Data in the U.S., the GDPR Exemplar and the Challenge of AI
Nicolas Terry
Artificial Intelligence and Discrimination Based on Prediction of Future Illness
Sharona Hoffman & Mariah Dick
Artificial Intelligence in Healthcare and the Impact of COVID-19
Stefaan Callens & Guillaume Pomes
The Processing of Personal Data for Scientific Research Purposes in Medicine. Some Aspects of the General Data Protection Regulation: Between Law and Ethics
Carla Barbosa
Invigorating the Principles of Consent and Data Privacy in the Medical Field through Gamification and Genome Donation
Hortense Gallois, Yann Joly & Vincent Gautrais
What About Post-Mortem Digital Privacy and Personal Health Data Protection?
Gauthier Chassang
Prospective View
Data Protection or Privacy?
Yves Poullet
1979. In September of that year, the CRID (Centre de Recherche Informatique et Droit – Research Centre on Computer & Law), founded by Yves Poullet, organised its first conference, entitled “Data Banks, Enterprises and Privacy”. The decision to fully embrace this new field was clearly an excellent one, since more than forty years later, the issue of data protection is not only still an important and topical political issue, but is also at the heart of numerous research projects in all countries, involving a large number of disciplines that need to interact with each other.
2009. That year the CRID formally joined forces with the CITA (Cellule interdisciplinaire de Technology Assessment – Interdisciplinary Unit on Technology Assessment) and the GRICI (Groupe interdisciplinaire en Communication et Internet – Interdisciplinary Unit on Communication & Internet), two other entities of the University of Namur, to form the CRIDS (Centre de Recherche Information, Droit et Société – Research Centre on Information, Law & Society), a member of the NaDI (Namur Digital Institute), a Research Institute of the University of Namur which is devoted to the study of information and communication technologies.
2019. Within the framework of the celebration of the 40th anniversary of the creation of the CRID and the 10th anniversary of the creation of the CRIDS, the idea naturally arose to launch a call for contributions aimed at offering authors the opportunity to highlight an aspect of data protection that they felt deserved an in-depth analysis.
We received twenty-one very high quality contributions on extremely interesting aspects of data protection, which have been brought together in this book. The authors come from Europe as well as from the United States of America and Canada. Their contributions have been grouped as follows, on the understanding that all choices are always arbitrary and subject to debate:
1) ICT Governance;
2) Commodification & Competition;
3) Secret surveillance;
4) Whistleblowing;
5) Social Media, Web Archiving & Journalism;
6) Automated individual decision-making;
7) Data Security;
8) Privacy by design;
9) Health, AI, Scientific Research & Post-Mortem Privacy.
I wish you all a very fruitful read!
1 University of Namur, Faculty of Law, CRIDS/NaDI; Member of the Brussels Bar.
In view of the importance of information in our societies, the European Data Protection Regulation2 (henceforth, in short, the Regulation) deserves to be put in a broader context. We may no longer live in a time when comprehensive regulations of a whole area, like the Code Napoléon, seem feasible or even desirable. What we see instead is an issue-by-issue approach in which legislatures pick up what is deemed to be politically relevant at a given moment. Discovering inconsistencies is – as with today’s software – left to feedback from the users, that is, from the citizens and eventually from the courts. If that feedback is then deemed politically relevant (or digestible), legislatures may provide an update, unless the intervening court has the power to intervene more forcefully. While this may seem to be the current approach to regulations, we – who can lean back and criticize – should nevertheless attempt now and then to gain at least a more comprehensive view of such regulations. The context I choose for this purpose is Information Governance, its tools and policies.
There are many meanings of governance. Information Governance here is seen as the functional whole of tools and policies by mainly public sector actors to manage information flows in society ensuring a dynamic balance of information production, distribution, and consumption while maintaining an equally balanced informational power distribution.
To better understand this broader context, it is useful to first look back in the history of data protection laws.
The first data protection law ever, the data protection act of the German province of Hesse3, not only addressed the privacy of citizens faced with governmental power. It had taken a more comprehensive view of the changes that had slowly become visible with technological change in the governance structure of the state.
We have to remember that, at that time, computer technology had been synonymous with big computer centers with centralized data processing that guided programming and communication. The government of the province – as in other provinces of Germany – aimed to keep applications and procedures in its hands. Local communities, on the other hand, felt that their power to make political decisions would continue to erode even further. It was they, they argued, who were closer to the citizens and who would understand better what was needed politically. They pointed to the constitutionally enshrined principle of subsidiarity, a principle adopted in Germany to avoid the implications of too much central power experienced between 1933 and 1945. So not only was the privacy of citizens at stake but also the constitutionally guaranteed role of local communities.4
Technological change also showed its impact on the relationship between the government, the governing party or parties and the opposition: by governing, the government creates information. Those who govern in parliament are privileged to share this information. The opposition has to use special and limited devices to get access to such information, like Parliamentary Questions or investigative commissions. The new technology, in addition, provided new policy making possibilities: simulation with administratively collected data. This new opportunity of informational power for governments and their parties needed a counterbalance. So, the Hesse “data protection law” also enabled opposition parties to have access to these opportunities.5
At the federal level, the first Constitutional Court Decision on data protection handed down by the German Constitutional Court in 1983 had stated: “If individuals cannot, with sufficient certainty, determine what kind of personal information is known to their environment, and if it is difficult to ascertain what kind of information potential communication partners are privy to, this may seriously impair the freedom to exercise self-determination. In the context of modern data processing, the free development of one’s personality therefore requires that the individual is protected against the unlimited collection, storage, use and sharing of personal data.”6
The Court thus established the famous right to “informational self-determination.” In doing so, it addressed the public side of the private right, and the political impact of privacy protection on the democratic functioning of society.
From its very beginning, the French data protection law of 1978 had recognized such political implications, giving the act the title of a law relating to information technology, data files and liberties.7
All these elements – the Hesse Act with its consideration of federalism and parliamentary power, the German Constitutional Court highlighting the importance of political participation, and the French opening of the privacy act to civil liberties in general – had shown early on that privacy is not a solitary issue but is embedded in a political system that faced broad challenges from technological change. Facing this change required a reconsideration of how we are governed, of how political power generated from information could be adequately balanced under these new conditions. Privacy was already then seen to be embedded in a broader context of governing information power.8 Information Power is a potentiality created by an informational advantage that can be used at the right moment to accomplish an envisaged aim; advantage meaning to have the information earlier (time advantage), to have more information (advantage of quantity), to have better information (advantage of quality), to have more adequate information (advantage of pertinence), to be in a position to use it (advantage of opportunity), to arrive at a better interpretation (advantage of contextual knowledge) and to use it adequately (advantage of choosing the right mode). Not all of these advantages need to be present to possess and exercise information power. However, the more of these advantages come together, the more difficult it is to rein in informational power. To observe and balance such developments is the task of what I have referred to as Information Governance.9
Information Governance manages information flows and their effects on power distribution in society. In this context, Information Governance is public governance (in contrast to, for example, corporate governance), exercised mainly by public sector actors (either alone or in various cooperative forms with the private sector, hybrid actors, the general public, interest groups, etc.). The term is used descriptively as well as normatively. In its normative form, it is referred to as Good Governance.10 Good Information Governance is expected to be consistent, predictable, comprehensive and effective. To meet these expectations, Information Governance makes use of other governance systems. The main – but not the exclusive – tool for Information Governance is the legal system and its instruments. In other contexts, colleagues and I have referred to this body of legal instruments used for Information Governance as Information Law.11 Other instruments may be encouraging or discouraging policies, the steering of investments and other financial or economic instruments, educational encouragements, etc.
When borrowing from law, Information Governance borrows both the advantages and the disadvantages of the legal system. I will comment on some of these aspects of the Regulation in part 2.1.
Using the legal system with global intentions may lead to unforeseen political consequences. I will comment on some of these policy implications of the Regulation in part 2.2.
And finally, the Regulation is not a (legal) solitaire in the arsenal of Information Governance. I will also comment on the need for a more comprehensive approach to make Information Governance useful (part 2.3).
The Regulation has replaced the Directive. This replacement was intended to provide a more comprehensive set of rules for the Member States of the Union by limiting derogations in national legislations.
Whether it has reached this goal remains doubtful in view of the many opening clauses that still give Member States regulatory leeway.12
Although the contents of the Regulation have been extended and made more detailed, like any legal instrument the Regulation still needs interpretation. As before with the Directive, the final interpretation lies with the European Court of Justice. The relationship between the Member States’ courts, in particular their highest courts, and the European Court of Justice is highly complex. It thus remains time-consuming to arrive at final interpretations. This delay in interpretation is not the only idiosyncrasy that the Regulation as Information Governance has inherited from European law.
In addition, the Regulation inherits from the Directive a legal construct central to its philosophy, a construct which in turn had been inherited from the early national data protection laws: the Consent Model. Consent is an internal occurrence that manifests itself through external indicators. The model is based on the assumption that equals meet each other and decide autonomously on their transactions. When the first privacy protection laws had to be drafted, information flow was conceived as an exchange process. Thus, the appropriate legal instrument used for such exchange processes was consensus, a willful decision to undertake an exchange. With consensus, data protection not only borrowed a basic concept of Civil Law, it also installed a key “legitimizer” for using personal information. The legal system is well aware that in the real world the pure appearance of such a model is scarce. Consequently, the legal system has built complex systems around the assumption of consent to compensate for deficiencies. Consent, for example, has to be informed consent. A construction like this is like a string in a sugar solution: the modifications of the basic assumption of “equal meets equal” – qualifying legislations like consumer protection or rent laws – all form ever more complex crystals around this string to protect the basic fiction. The development of consent from the national data protection laws, to the Directive and finally to the Regulation is a history of such complexities. While the Regulation may have raised the barriers for assuming freely given and informed consent, the daily practice of data processing continues to minimize consent to a click, or, at most, a series of clicks. The core of this problem is not consent as such; it is, as pointed out, a common pattern in law.
The core of the problem is that consent – like anonymization which creates similar problems – is the great facilitator for the handling of personal information and invites exploitation. Consequently, many legislations have excluded consent as a legitimation, at least from their public sector data protection laws.
Those are just two examples of the problems the Regulation faces or rather continues to face as a legal instrument within the context of Information Governance.13 Wherever Information Governance borrows from the legal system, it also borrows its problems, and in the case of the Regulation this may well affect its efficiency as an Information Governance tool.
A legal instrument addressing such a fundamental issue as flows of personal information and originating from an institution that is involved in global exchanges cannot remain just a legal instrument; it is also a political instrument. Political instruments are potent instruments of Information Governance.
The Regulation is conscious of being an instrument of information policy. It has developed into a condition for international commerce and services. Whoever wants to remain involved with the European Union and its citizens has to consider the Regulation as a model. And it seems to be working. Many countries seem to follow this model; whether this is a substantive effort or merely political mimicry would need a deeper analysis.14 The Regulation has maintained the power of the Commission to decide on the adequacy of data protection in a third country. The Regulation does provide for a set of possibilities to have information transferred even when there is no such decision. Still, an adequacy decision according to Art. 44 and 45 of the Regulation substantially facilitates exchanges. Aspiring to such a decision is the preferred way of entry to European personal data. Adequacy operates as a facilitator like consent and anonymization, and it is therefore equally attractive. Being covered by such a decision becomes a highly desired aim in trade policy. And since the decision lies with the Commission, it provides the Commission with bargaining power. The Regulation (as did the Directive) provides a set of criteria for arriving at such a decision. Those making such a decision are also well aware of the implications for global trade. Weighing privacy against trade creates conflicts which do not always lead to the most sustainable decisions. The Safe Harbor solution found for the Directive in relation to the USA15 did not prove to be sustainable.
Even more so, it maintained the possibility that supervisory authorities can still make their own assessment of the adequacy of the data protection in the recipient country, although it is up to the European Court to make a final decision.16 The Safe Harbor construct had been replaced by the EU-US Privacy Shield decision, which was a bit more explicit than the previous construct by, for example, examining the national security regulations in the US more closely.17 The Privacy Shield, however, did not survive either.18 While this is not the place to discuss the Commission’s decisions or the European Court judgments, one has to remember that the Privacy Shield decision had still been based on the Directive, and that it had already been severely criticized by the European Data Protection Board.19 The reference to EU-US relations is just one example showing that the Regulation cannot escape political discussions. This has been reconfirmed when the Commission issued an adequacy decision for Japan20, again criticized by the European Data Protection Board.21 As one of the many interest groups that observe such decisions, the “Global Alliance for Genomics and Health”, has observed very pointedly: “The EU-Japan Economic Partnership Agreement, which entered into force in February 2019, likely influenced the outcome of the decision. Although the EDPB and the EC were critical of some aspects of the Japanese privacy framework, free data flow between Japan and the EU facilitates the execution of the bilateral trade agreement, which may explain the flexible process. [...] Conversely, the WP29’s 2014 Quebec Opinion did not benefit from such flexibility to clarify the relationship between the provincial and federal data protection law and thus achieve adequacy. This suggests that the EC prioritized trade over a GDPR-like data protection scheme.
[…] However, the EC should be open and transparent about its flexibility and avoid unfettered discretion, which could create an impression it bases its decision on extraneous considerations and predetermined outcomes.”22
The Regulation cannot escape being seen as an Information Policy instrument, and as such it does not necessarily provide the kind of consistency and predictability as would be required for an Information Governance instrument.
Obvious political flexibility in using the Regulation has other consequences as well. It may erode the value of the Regulation as a governance instrument and invite countries, while not necessarily aiming at adequacy, to use the Regulation selectively to legitimize their own information policies.23 We already see the Regulation being welcomed as a reference in arguing for the establishment and defense of national cloud systems24, as well as for undertakings to strive for national informational autarchy, at least as far as network structures are concerned.25 Similarly, the emphasis on data security in the Regulation serves as a justification for developing defensive as well as preemptive cyber security measures.26 Finally, while (by legal necessity) the Regulation excludes processing for national security purposes from its scope of application, this apparent blank is being exploited to legitimize the exclusion of national security processing on a far broader scale internationally.27
These references to the Regulation cannot be criticized as a deficiency of the Regulation. Conscious Information Governance, however, should be realistic and aware of such “re-uses” of severed policy elements and take a pro-active stand in international discussions.
Finally, the Regulation emphasizes restrictive tendencies in Information Governance approaches: its regulatory ideal aims at typologies for information handling in the various economic sectors that are deemed to be acceptable, operating under the supervision of safeguarding institutions. Moving outside these typologies creates interpretative risks that have to be borne by the responsible organizations first, until final court decisions may provide clarity.
Decision-making criteria established by the Regulation do provide inroads for other values such as freedom of expression.28 However, in the interpretation of the European Court of Justice, such references act as an exemption from a default rule, rather than as an equal counterweight in a weighing process that gives each of the fundamental rights equal opportunities.29
The Regulation also contains various transparency instruments to strengthen the position of the data subjects. Again, however, they serve as a means to the aim of “privacy protection”; transparency is not acknowledged by the Regulation as a societal value in its own right.
Several national legislators have long taken a different approach: In Canada for example on the national30 as well as on the provincial level31, privacy protection and access to government information have been implemented as coordinated legislative acts to take a balanced approach to information flows in society.
Establishing a comparable system in EU law has met with resistance arguing a lack of legal competence. In selected areas such competence for access to information (or freedom of information) was eventually recognized, such as for access to environmental data32, access to documents of EU institutions33 or access for the reuse of government information34. But there is no all-encompassing regulation for access to government information falling within the scope of Union law, comparable to the approach taken by the Regulation in its Art. 2 (a). This might be called a problem of EU competence. But it should be remembered that the then Commission of the European Communities, as late as March 1981, considered data protection a competence issue and merely recommended that its Member States ratify the Council of Europe Convention 108 (Commission Recommendation 81/679/EEC).35 In EU policies, competence to regulate has always been an issue, but an issue with great flexibility.
There are even more challenging developments ahead: in the traditional model, access is granted to government information (with certain limitations). Information of the private sector in this model is accessible when it is located in the public sector (again with limitations); in other words, only where the public sector is in possession of private sector information is there a possibility to access this information. Otherwise, private sector information is directly accessible only when there is a special relationship between the requester and the private sector entity that is recognized by law. But this situation has been changing for quite some time. Based on the Constitution of South Africa, the Promotion of Access to Information Act of 2000 provides a direct right of access to private sector information.36
In the system of Information Governance, privacy law is developing as a useful instrument to counterbalance increasing informational power in the private sector insofar as personal data is concerned. Access to government information legislation and access rights in public sector data protection laws provide useful instruments to counterbalance increasing informational power in the public sector and, provided the public sector is in possession of private sector information, also to further counterbalance private sector informational power. It is direct access to private sector information that will provide the much-needed comprehensiveness of Information Governance.
The Regulation in the context of Information Governance by definition has to carry the burdens of EU law and classical legal models. This might be called an inherited deficiency of the Regulation.
The Regulation has a strong information policy component. This is an integral component of Information Governance. Nevertheless, what is needed is a conscious handling of these policy effects and an effort to ensure predictability and consistency.
The Regulation, by its very purpose, emphasizes the restrictive mode of Information Governance. However, information flows in societies require a comprehensive approach that is also recognizable as such. Within the EU framework, as in most of its Member States, instruments favoring information flows in ways that counterbalance informational power are fragmented and need to be completed and conceptually consolidated. Looking back at the beginnings of data protection, this is still a promise to be fulfilled.
But how then should we imagine the future of Information Governance?
The outside observer finds it easy to criticize the moment-driven approach of information policies that pick up current debates to maximize political gain. And, not without irony, in the case of data protection in Europe picking up a “current debate” had taken decades. Still, in the spirit of a “separation of labor” – and repeating my initial observation – those who have the privilege of taking a more detached view have the obligation to come back, again and again if necessary, to the need for a more comprehensive view. Such a view could help make information policies more proactive; it could help to arrive at better balancing solutions for inevitable conflicts of values and for addressing power asymmetries; it could help optimize information flows in our societies and provide for a more stable and yet dynamic environment for information power. Eventually, it could also help provide better practical solutions for complex interactions of different information interests, such as the ongoing conflicts around public sector information, where commercialization interests of both the public and the private sector meet with privacy concerns, copyright demands, the public’s right to know and interests in furthering the information economy to the benefit of a country or a region.
Helpful steps for improving Information Governance would be:
Mapping information flows is a technique that today seems to be used predominantly for forensic purposes, for example in fighting money laundering or in detecting misuses of private information.37 It is also a tool used for national security purposes.38 Nevertheless, a broader use would, for example, help to visualize critical areas where information congregates and would allow the consequences of legislative interventions to be simulated.
What are, for example, the tools that are available to distribute information in society, to block and open channels, and what are the framework conditions for implementing them, be they political, legal, economic or cultural? It seems clear that law makers resort to modular elements of law making that have already proved themselves useful in other contexts: installing information obligations, determining those obliged to provide information, installing an institution to collect this information, providing this institution with rights and obligations, and addressing noncompliance form one such set of elements, used in areas ranging from food and drug safety to the oversight of social security obligations and money laundering. However, I have not yet come across a systematic overview of such inventories that could be made useful for Information Governance as a societal tool.
It is the area of information and communication that has seen the most dramatic changes, shaped or at least influenced by technological innovations. Placing the Regulation, with its specific references to technological solutions,39 in the context of Information Governance is already an example of such technology awareness: to master and indeed govern changes initiated, modified or magnified by technological progress, Information Governance has to be generally technology aware. At the same time, it has to master the tensions, conflicts and contradictions between socially and technologically driven change.
Other elements of improved Information Governance are not specifically tailored to the activities of Information Governance, but are acknowledged as elements of any Good Governance.
Governance mechanisms implemented need to be monitored as to their effectiveness and efficiency; monitoring requires the installation and use of feedback systems. In short, Information Governance needs to be designed and operated as a learning system, able to digest the (meta-)information that is being produced by Information Governance itself.
Internal feedback mechanisms need an evaluative framework to be able to judge the feedback created. Information Governance, like any governance system, needs evaluative criteria for such judgements. Normative expectations have already been pointed out above when introducing the concept of Good Governance. Information Governance is expected to be consistent, predictable, comprehensive and effective. While those criteria may be qualified as formal, there are also more substantive criteria to be considered, as developed by legal systems that integrate elements from governance systems based on belief systems with secular or religious roots.
The arrival of cross-national information technologies like the telegraph and the telephone had created visions of not only global communication, but global understanding. Environmental governance is increasingly perceived and handled as a necessarily globally operating steering system. Space technology has turned cosmic philosophical reflections into needs for cosmic policies. This cosmic element in Governance reflections will gain further importance and may change Information Governance from an instrument that may temporarily improve a competitive position, like the Regulation, into a mechanism that aims at a sustainable qualitative improvement of information distribution on a global scale.40
Putting the Regulation in the broader context of Information Governance has not only revealed a number of deficiencies, such as those inherited from the legal system in which the Regulation has to operate. While this sketch could not go into the full details of Information Governance structure, tools and procedures, this contextualization has already revealed the need for an information policy that is aware of the mid- and long-range consequences of such instruments, and has shown the need to keep a delicate balance between the intrinsic tendency of information to flow and its controls. Such a quest for comprehensiveness, however, has also made it clear how much more needs to be done for the “context” in which the Regulation has been put to make it a true context: Information Governance needs continuous work on its instruments and orientation from evolving Good Governance practices, perhaps the most important of which is an understanding of its global responsibility that goes beyond reaching an optimal position in global competition.
1 University of St. Gallen, Research Center for Information Law. All electronic resources quoted have last been verified on 29 February 2020.
2 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) OJ L 119, 4.5.2016, pp. 1-88.
3 Hessisches Datenschutzgesetz vom 7.10.1970. Gesetz- und Verordnungsblatt 1970 I, 625ff. (HDSG-1970).
4 See § 6 (2) HDSG-1970.
5 § 6 (1) HDSG-1970.
6https://www.bundesverfassungsgericht.de/SharedDocs/Entscheidungen/EN/1983/12/rs19831215_1bvr020983en.html
7 Loi n° 78-17 du 6 janvier 1978 relative à l’informatique, aux fichiers, et aux libertés.
8 Expressed, for example, in the need to address informational asymmetries: Tiziana Croce: Big data? A question of balance between privacy, security and information power. In: Annali della Facoltà Giuridica dell’Università di Camerino – n. 7/2018, 39 (https://pubblicazioni.unicam.it/retrieve/handle/11581/406039/70636/Big%20data.pdf).
For Information Power as a geopolitical phenomenon see David J. Lonsdale, Information power: Strategy, geopolitics, and the fifth dimension, The Journal of Strategic Studies, 22:2-3 (1999), 137-157, 144f.
About Information Power being generated by the “digitally networked environment” so to speak behind the back and eventually outside the control of its users, see Jack M. Balkin: Information Power: The Information Society From An Antihumanist Perspective, 2010, p. 12. (https://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=5620&context=fss_papers).
9 In a business context, Information Governance refers to maximizing the value of information while at the same time ensuring compliance with relevant regulatory requirements. See in detail: Robert F. Smallwood: Information Governance: Concepts, Strategies, and Best Practices, Second Edition 2020.
10 In detail: G. Henk Addink: Good Governance, concept and context. Oxford University Press 2019.
11 Urs Gasser, Herbert Burkert: Regulating Technological Innovation: An Information and a Business Law Perspective. In: Rechtliche Rahmenbedingungen des Wirtschaftsstandortes Schweiz: Festschrift 25 Jahre juristische Abschlüsse an der Universität St. Gallen (HSG). Dike, Zürich, 2007. 503-523. Herbert Burkert: Information Law: From Discipline to Method Berkman Center Research Publication No. 2014-5, U. of St. Gallen Law & Economics Working Paper No. 2014-02 (https://ssrn.com/abstract=2402866).
12 Emilia Miscenic; Anna-Lena Hoffmann: The Role of Opening Clauses in Harmonization of EU Law: Example of the EU’s General Data Protection Regulation (GDPR). In: EU and Comparative Law Issues and Challenges Series 4 (2020): 44-61, 50f.
13 For a more in-depth analysis: Urs Gasser: Futuring Digital Privacy: Reimagining the Law/Tech-Interplay. In: Big Data and Global Trade Law (Cambridge University Press, forthcoming 2020).
14 See for example: Graham Greenleaf. Japan and Korea, Different paths to EU adequacy. (December 10, 2018). (2018) 156 Privacy Laws & Business International Report, 9-11 (https://ssrn.com/abstract=3323980).
15 Commission Decision 2000/520/EC of 26 July 2000.
16 ECJ (C- 362/14).
17 Commission Decision (EU) 2016/1250 of 12 July 2016.
18 Judgement of the Court (Grand Chamber) of 16 July 2020 in Case C-311/18 – ECLI:EU:C:2020:559.
19 European Data Protection Board: EU – U.S. Privacy Shield – Second Annual Joint review. Adopted on 22 January 2019.
20 Commission Implementing Decision (EU) 2019/419 of 23 January 2019 pursuant to Regulation (EU) 2016/679 of the European Parliament and of the Council on the adequate protection of personal data by Japan under the Act on the Protection of Personal Information.
21 European Data Protection Board, Opinion 28/2018 regarding the European Commission Draft Implementing Decision on the adequate protection of personal data in Japan, Adopted on 5 December 2018.
22https://www.ga4gh.org/news/gdpr-brief-japan-obtains-the-first-adequacy-agreement-under-the-gdpr/.
23 Nigel Cory: EU digital trade policy proposal opens a loophole for data protectionism | View [https://www.euronews.com/2018/07/16/eu-digital-trade-policy-proposal-opens-a-loophole-for-data-protectionism-view].
24 See for example the US Cloud Act: European Data Protection Board: ANNEX. Initial legal assessment of the impact of the US CLOUD Act on the EU legal framework for the protection of personal data and the negotiations of an EU-US Agreement on cross-border access to electronic evidence [https://edps.europa.eu/sites/edp/files/publication/19-07-10_edpb_edps_cloudact_annex_en.pdf] and US Department of Justice: Promoting Public Safety, Privacy, and the Rule of Law Around the World: The Purpose and Impact of the CLOUD Act. White Paper April 2019 [https://www.justice.gov/opa/press-release/file/1153446/download]; The International Association of Privacy Professionals (IAPP): How to comply with both the GDPR and the CLOUD Act [https://iapp.org/news/a/questions-to-ask-for-compliance-with-the-eu-gdpr-and-the-u-s-cloud-act/]. For the European perspective, see more recently: Dossier: Le cloud par-delà les nuages. Le Monde, 18 février 2020, 18f.
25 See e.g.: Alena Epifanova: Deciphering Russia’s “Sovereign Internet Law” Tightening Control and Accelerating the Splinternet. German Council on Foreign Relations. DGAP Analysis No.2 January 2020 (https://dgap.org/sites/default/files/article_pdfs/dgap-analyse_2-2020_epifanova_0.pdf).
26 See e.g.: Annegret Bendek; Martin Schallbruch: Europe’s Third Way in Cyberspace. What Part Does the New EU Cybersecurity Act Play? SWP Comment 2019/C 52, December 2019 (https://www.swp-berlin.org/fileadmin/contents/products/comments/2019C52_bdk_schallbruch.pdf).
27 See e.g.: Ioanna Tourkochoriti: The Transatlantic Flow of Data and the National Security Exception in the European Data Privacy Regulation: In Search for Legal Protection against Surveillance. In: University of Pennsylvania Journal of International Law 36:2 (2014), 459-524.
28 See e.g. Recitals 4, 65, 153 of the Regulation.
29 Commenting on the “privacy priority” interpretation of the European Court in contrast to the approach of the German Bundesverfassungsgericht: Thomas Hoeren: Anmerkung, MMR 2020, 111.
30 Access to Information Act (R.S.C., 1985, c. A-1) Last amended on August 28, 2019 (https://laws-lois.justice.gc.ca/PDF/A-1.pdf) and Privacy Act (R.S.C., 1985, c. P-21) Last amended on August 28, 2019 (https://laws-lois.justice.gc.ca/PDF/P-21.pdf).
31 Quebec: A-2.1 Act respecting Access to documents held by public bodies and the Protection of personal Information Updated to 31 December 2019 (http://legisquebec.gouv.qc.ca/en/ShowDoc/cs/A-2.1).
32 Directive 2003/4/EC of the European Parliament and of the Council of 28 January 2003 on public access to environmental information and repealing Council Directive 90/313/EEC (OJ L 41, 14.2.2003, pp. 26-32).
33 Regulation (EC) No 1049/2001 of the European Parliament and of the Council of 30 May 2001 regarding public access to European Parliament, Council and Commission documents (OJ L 145, 31.5.2001, pp. 43-48).
34 Directive 2013/37/EU of the European Parliament and of the Council of 26 June 2013 amending Directive 2003/98/EC on the re-use of public sector information (OJ L 175, 27.6.2013, pp. 1-8).
35 For this development, see Kolb, Marina: The European Union and the Council of Europe, Basingstoke: Palgrave Macmillan 2013, 66ff.
36 For more details, see Richard J. Peltz-Steele: Access to Information in the Private Sector: African Inspiration for the US FOIA Reform. In: Villanova Law Review 63 (2018) 906-957.
37 See e.g.: G. Sobreira Leite; A. Bessa Albuquerque; P. Rogério Pinheiro: Application of Technological Solutions in the Fight Against Money Laundering – A Systematic Literature Review. Appl. Sci. 2019, 9, 4800.
38 See e.g.: Florian Roth, Jennifer Giroux, and Michel Herzog: Crisis Mapping in Switzerland: A Stakeholder Analysis. Special Report, Risk and Resilience Research Team. Commissioned by the Federal Office for Civil Protection (FOCP) Center for Security Studies (CSS), ETH Zürich 2013.
39 See Urs Gasser 2020 (FN 13 above).
40 As to the need for improved global policies in the era of globalization, see for example: Tingyang Zhao: Tianxia: tout sous un même ciel. Les Editions du Cerf, Paris 2018.
The European Group on Ethics in Science and New Technologies (hereafter EGE) started in 1991 as a modest ad hoc advisory body, the so-called Group of Advisers to the European Commission on the Ethical Implications of Biotechnology (GAEIB).2 A Commission Communication established the EGE in December 1997.3 The EGE’s remit was expanded to include information and communication technology as well. Article 7 of the so-called Patent Directive explicitly refers to the EGE: “The Commission’s European Group on Ethics in Science and New Technologies evaluates all ethical aspects of biotechnology”.4
To date, the EGE has issued 30 Opinions. In this contribution, the Opinions of the EGE will be analysed in so far as they are relevant for data protection law in the EU.
Opinion n° 13 was the first Opinion that explicitly dealt with data protection. Earlier Opinions had already referred to data protection, although not in those literal terms. For instance, in Opinion n° 11 of 21 July 1998 on ethical aspects of human tissue banking, the EGE referred to Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data5 and called “respect for private life and medical confidentiality a fundamental right”.6
Opinion n° 13 started with a presentation of information and communication technology in health care such as the electronic health record (EHR), networking and telemedicine, the electronic health card, decision support technologies, medical databases and the Internet. It then considered the wider societal implications of ICT in health care such as changes in the practitioner / patient relationship, security and reliability in ICT systems, the citizen and standardisation, the citizen as a stakeholder, the secondary uses of personal health data, and the citizen and ownership of personal health data. This was followed by an analysis of the legal aspects.
The Opinion considered that “no specific binding legislation on personal health data and ICT exists” at EU level and that “legal standards for the protection of the citizen in health care differ from country to country, since they reflect the diversity in long-standing cultural traditions, of medical secrecy, ownership of medical data, patient autonomy, professional liability, etc.”.
This was followed by a discussion of the ethical aspects: public concerns, such as the difficulty of respecting privacy and confidentiality when third parties may have a strong interest in gaining access to electronically recorded and stored personal health data, and the value conflicts between effectiveness and confidentiality, privacy and the collective good, quality assurance and professional autonomy, and efficiency and beneficence. In addition to legal regulation, certain ethical principles may be used to address these value conflicts: human dignity, serving as a basis for requirements of privacy, confidentiality and medical secrecy; autonomy, serving as a basis for requirements of self-determination and participation; justice, serving as a basis for requirements of equitable distribution of limited resources; beneficence and non-maleficence, serving as a basis for attempts to weigh anticipated benefits against foreseeable risks; and solidarity, serving as a basis of the right of everyone to the protection of healthcare, with a special concern for vulnerable groups in society.
Then the EGE submitted its Opinion, ending with two so-called “actions to be undertaken”. First, “a Directive on medical data protection is desirable within the framework of the current Data Protection Directive to address the particular issues arising from the use of health data in Information Society”. And second, a “European Patient’s Charter possibly by means of a Recommendation, should be adopted”.
One may wonder why the EGE explicitly pleaded for such a Charter, because already in 1984 the European Parliament had approved a Resolution on a European Charter on the Rights of the Patient.7 In this Resolution, the European Parliament invited the European Commission “to submit as soon as possible a proposal for a European Charter on the Rights of Patients”. Paragraph 3 of the Resolution contained 15 patients’ rights, among which the patient’s right of access to his own medical data and the right to medical confidentiality. Contrary to what had been envisaged by the European Parliament, it was not the European Commission that took the initiative for a European Charter of Patients’ Rights but the Active Citizenship Network, which was encouraged by the European Commission to draft such a Charter. Article 6 of the Charter proclaims the right to privacy and confidentiality: “Every individual has the right to the confidentiality of personal information, including information regarding his or her health and potential diagnostic or therapeutic procedures, as well as the protection of his or her privacy during the performance of diagnostic exams, specialist’s visits, and medical/surgical treatments in general”.8
In its Opinion n° 20 on the ethical aspects of ICT implants in the human body, the EGE discussed the legal background that should be derived from general principles underlying national legislation and international instruments. Such general principles can provide the guidance required to outline the legal standards necessary for the regulation of a technology that modifies the body and its relationship with the environment and thereby impacts deeply on personal identity and life.9
More specifically with regard to privacy and data protection, this Opinion contained an original point of view: “The view that data subjects are not free to make whatever use of their own bodies they wish is confirmed, albeit indirectly, by Article 8(2) of EC Directive 95/46 on personal data protection. Here, it is stated that States can provide that the data subject’s express consent is not enough to allow others to use his/her “sensitive data” – concerning sex life, opinions, health, ethnic origin – without an ad hoc authorisation issued, for instance, by a supervisory authority (see Section 26 of the Italian Personal Data Protection Code). This is meant to protect the most sensitive portion of the “electronic body” by preventing data subjects themselves from making available parts of their electronic bodies in such a manner as to jeopardise their integrity”.10
And the Opinion went on further as follows: “From a more general standpoint, the Charter of Fundamental Rights of the EU has drawn distinctions between the protection of private and family life (Article 7), and the protection of personal data (Article 8), which consequently has become an autonomous individual right. Thus, one has to deal with a kind of protection that is opposed to any relevant intrusion into one’s private sphere and, on the other hand, confers the right of informational self-determination on each individual – including the right to remain master of the data concerning him or her”.11
According to the Opinion, specific importance should also be attached to the principles of data minimisation, purpose specification, proportionality, and relevance. “The data minimisation principle is expressly referred to, for instance, in Article 16(2) of the French Civil Code, where it is provided that ‘il ne peut être porté atteinte à l’intégrité du corps humain qu’en cas de nécessité pour la personne’ (the integrity of the human body may be interfered with only in case of necessity for the person). Objectively, this principle means that one should only avail oneself of a given tool if the relevant target cannot be achieved by means of less “body-intrusive” tools.
This is basically the “minimisation” principle set out in several privacy laws, such as Section 3(a) of the German Bundesdatenschutzgesetz and Section 3 of the Italian data protection code. Subjectively, the data minimisation principle postulates the existence of a personal condition that cannot be coped with unless by using a specific tool, which proves indispensable. The purpose specification principle entails the need for selecting the targets to be achieved. For instance, the Convention on Human Rights and Biomedicine provides that tests predictive of genetic diseases “may be performed only for health purposes or for scientific research linked to health purposes” (Article 12). Basically, a relationship is established between specific circumstances, available tools, and reference values. Only those tools that, within a given context, pass the consistency test with such values may be used lawfully. The proportionality principle is also grounded on the relationship between tools to be used and purposes sought. However, here emphasis is not put on the nature of the purposes in question, but on the proportionality of the tools that are used, i.e., even if the purpose as such is legitimate, it may not be pursued by using disproportionate tools. As for the relevance principle, which is expressly laid down in Article 6 of Directive 95/46, it can be taken into consideration with regard to ICT implants as well. Indeed, a given technology may be lawfully applied if it is closely and unambiguously relevant to the circumstances. This is meant to prevent excessive and/or inappropriate applications of the available tools. Ultimately, all these principles supplement one another. After identifying a legitimate purpose for using an ICT implant, one should establish whether this is actually necessary as well as whether the tools (to be) used are relevant and proportionate”.12
On 21 March 2011, the then President of the European Commission, José Manuel Barroso, asked the EGE to draft an Opinion on the ethical issues arising from the rapid expansion of information and communication technologies (ICT). President Barroso indicated that the Opinion could “offer a reference point to the Commission to promote a responsible use of the Digital Agenda for Europe and facilitate the societal acceptance of such an important policy item.”13 The Opinion dealt extensively with the then existing regulatory framework for personal data protection in the EU. But even more important was that this framework was not considered as “given” but that concerns regarding the current legal protection of personal data were also expressed.
The Opinion referred to a Eurobarometer (IP/11/742) according to which 70 % of Europeans were concerned that their personal data may be misused. In the context of ICT development there was, therefore, a widespread public perception of significant ethical risks and legal uncertainty associated notably with online activity. This is why it was time to build a stronger and more coherent data protection framework in the EU, backed by strong enforcement that would allow the digital economy to develop across the internal market, put individuals in control of their own data and reinforce legal and practical certainty for economic operators and public authorities.
A consequence of the broad and flexible concept of “personal data” was that there were numerous cases in which it was not always clear whether individuals enjoyed data protection rights and whether data controllers had to comply with the obligations imposed by the Directive. There are situations involving the processing of specific information which would require additional measures under EU law, e.g. key-coded data, location data and “data mining”. “Profiles”, when they are attributed to a data subject, even make it possible to generate new personal data which are not those which the data subject had communicated to the controller. This future development of “new data” (through data mining and profiling) should be taken into account when revising the Directive.14
The EGE recommended, among other things, that the EU secure and promote the right of access to the Internet. The European Charter of Fundamental Rights requires that everyone have the opportunity to contribute to shaping European society, which of course includes the use of ICT. The protection of the principle of equality is therefore relevant in several domains of an individual’s life, such as education, work, commerce and health. The EGE welcomed actions by the European Commission in the ICT sector and invited the EU to actively participate in and promote access to ICT in European societies, while safeguarding access to basic societal services for citizens unwilling to use ICT tools or unable to use them for technical, educational or socio-economic reasons.15
With regard to the right to privacy and data protection, the EGE asked for clarification concerning the conditions for the data subject’s consent, in order to always guarantee informed consent and ensure that the individual is fully aware that he or she is consenting to data processing and what it entails.16
The EGE also welcomed and supported the proposed revision of the EU data protection regulatory framework adopted by the Commission in January 2012. The Group underlined that during the inter-institutional debate on the proposed regulatory framework the following recommendations should be taken into account:
– that the characteristics that qualify data as personal data be clarified, and its relevance to different ICT uses (such as IP addresses, unique RFID-numbers, geo-location data), as well as the development of new data types;
– that in the light of technological and other societal developments the existing provisions on sensitive data be reconsidered to examine whether other categories of data should be added and to further clarify the conditions for their processing;
– that individuals should be well and clearly informed, in a simple and transparent way, by data controllers about how and by whom their data are collected and processed, for what reasons, for how long and what their rights are if they want to access, rectify or delete their data;
– that in order for processing of personal data to be lawful, personal data should be processed on the basis of the explicit consent of the person concerned (including withdrawal provisions) or some other legitimate basis; that consent should be given by any appropriate method enabling a freely given, specific, informed and unambiguous indication of the data subject’s wishes, ensuring that individuals are fully aware that they give their consent, including the ticking of a box when visiting an Internet website, and that silence or inactivity should therefore not constitute consent;
– that consent may always be withdrawn without negative consequences for the data subject and that data subjects should have the right to require that their personal data be erased and that there will be no further processing of the data, that in principle data previously analysed must be deleted unless retention can be justified and that informed consent procedures should clarify the conditions when withdrawal is not feasible;
– that children and vulnerable adults deserve specific protection of their personal data, as they may be less aware of risks, consequences, safeguards and their rights in relation to the processing of personal data;
– that the right to deletion of personal data should be extended in such a way that any publicly available copies or replications should be deleted, and
– that the processing of personal data of subjects residing in the EU by a controller not established in the EU/EEA is subject to the EU normative framework on data protection.17
In this Opinion, the EGE considered that the concept of data protection is of far more recent vintage than privacy, essentially finding its genesis in the increasing collection of personal data about individuals by government. The advent of computers and then of the Internet greatly spurred on the development of the concept of data protection. The core concept behind data protection is that individuals have a right to control the collection and use of data through which they may be identified (personal data). Like privacy, data protection is subject to certain constraints, of which an obvious one is police investigations into crime. Data protection may be contrasted with privacy inasmuch as the core notions underpinning it are fairly clear and garner wide consensus, albeit with some important variations. While the European Court of Human Rights has dealt with the protection of personal data as an integral part of the right to privacy, at EU level the right to data protection is seen as an autonomous right. Personal data are protected by the law even if the right to privacy is not at stake. Article 8 of the Charter of Fundamental Rights unambiguously states that “everyone has the right to the protection of their personal data”. Data protection is both broader and more specific than the right to privacy since it does not only aim at concretising the protection of privacy, but simply applies every time personal data are processed.18
The EGE signalled loopholes in the existing regulatory framework primarily in the field of implementing privacy, balancing privacy against security and introducing governance schemes in the area of surveillance, including drones. As regulation in the area of surveillance was scarce – also in an EU context – it should be considered whether more regulation or other forms of governance would be appropriate. The legal systems of the time (in Europe and elsewhere) were not designed for contemporary techniques of surveillance. As a consequence, the regulation was not coherent and a number of problems remained unsolved. This was obvious when the global situation was taken into account, but it also applied to some extent to the EU situation. The national regulations seemed to have the same problem, not being geared to the new technologies and uses, and regional and global solutions were missing. The topic was, however, on the agenda in many countries, and thus it might have been timely to propose regulation in the area.19
In its Recommendations, the EGE affirmed that the purpose limitation principle as regards personal data should be the standard for both public and private organisations. Personal data should only be collected for a specific and legitimate purpose. As far as possible, data should be anonymised, and greater use should be made of encryption, which can serve to enhance both privacy and security. Data sharing by default is to be avoided, and users should be allowed to control (e.g. through access to privacy settings) and change information held about them by organisations. Profiling of individuals for commercial purposes should be subject to the individual’s explicit consent. Commercial organisations should make information available on what data are going to be collected, by whom, for what purpose, for how long and whether the data collected will be linked with other data sources.20
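The anonymisation and encryption measures the EGE recommends can be made concrete with a small, purely illustrative sketch: replacing a direct identifier with a keyed hash before further processing. The key name and record fields below are hypothetical; note that under the GDPR such keyed hashing is pseudonymisation, not anonymisation, since the data remain linkable while the key exists.

```python
import hashlib
import hmac

# Hypothetical secret held only by the data controller; deleting or
# rotating it weakens the link back to the individual.
PEPPER = b"controller-secret-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Pseudonymisation only: whoever holds PEPPER can re-derive the
    mapping, so the output still counts as personal data.
    """
    return hmac.new(PEPPER, identifier.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": "BE-12345", "diagnosis": "J45"}
# The direct identifier is dropped and replaced before any analysis.
safe_record = {
    "patient_ref": pseudonymise(record["patient_id"]),
    "diagnosis": record["diagnosis"],
}
print(safe_record)
```

The same identifier always maps to the same reference, so records can still be correlated for analysis without exposing the identifier itself, which is precisely the trade-off between utility and privacy the Opinion addresses.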
The EGE also noted the shift towards the collection and correlation of large datasets, so-called “big data”. While the EGE recognised the potential value of such datasets, it was concerned that, without proper attention, the principle of purpose limitation at the core of data protection would be undermined. Thus, the EGE urged public authorities and private organisations to engage in purposeful ethical inquiry to inform and align their actions with the shared European values of dignity, privacy and autonomy. The EGE recommended that the EU develop a code of conduct for big data analytics that would guide organisations through the process.21
The EGE was of the view that the protection of data enshrined in EU law is robust but needs to be enforced at the national level. Member States should therefore ensure that data protection authorities have sufficient legal powers, technical expertise and resources to ensure effective levels of enforcement across the European Union.22
According to the EGE, one of the most important new regulatory challenges relates to the large-scale collection of data or “big data”. The adequacy of traditional forms of consent and their applicability to the collection and use of big data have become the focus of attention. The EGE addressed this topic in its Opinion n° 26 on the ethics of Information and Communication Technologies. In Opinion n° 29, the EGE referred only to some specific emerging issues related to big health data. It would seem that only the broad concept of consent is applicable in the use of big data, which entails asking individuals transparently to consent not only to the immediate purpose for which their data has been collected, but also to unforeseen uses of their data (in so far as new possible uses really are unforeseen). One alternative solution is offered by so-called “enhanced consent”, which aims to enhance privacy, based on the awareness of the personal and social significance of anonymised (individual patient and personal) data for preventive and predictive purposes in healthcare, and for promoting “data donation”.
