The definitive guide for ensuring data privacy and GDPR compliance. Privacy regulation is increasingly rigorous around the world and has become a serious concern for senior management of companies regardless of industry, size, scope, and geographic area. The General Data Protection Regulation (GDPR) imposes complex, elaborate, and stringent requirements for any organization or individual conducting business in the European Union (EU) and the European Economic Area (EEA), while also addressing the export of personal data outside of the EU and EEA. This recently enacted law allows the imposition of fines of up to 4% of global annual revenue for privacy and data protection violations. Despite the massive potential for steep fines and regulatory penalties, there is a distressing lack of awareness of the GDPR within the business community. A recent survey conducted in the UK suggests that only 40% of firms are even aware of the new law and their responsibilities to maintain compliance. The Data Privacy and GDPR Handbook helps organizations strictly adhere to data privacy laws in the EU, the USA, and other jurisdictions around the world. This authoritative and comprehensive guide includes the history and foundation of data privacy, the framework for ensuring data privacy across major global jurisdictions, a detailed framework for complying with the GDPR, and perspectives on the future of data collection and privacy practices.

Comply with the latest data privacy regulations in the EU, EEA, US, and others

Avoid hefty fines, damage to your reputation, and losing your customers

Keep pace with the latest privacy policies, guidelines, and legislation

Understand the framework necessary to ensure data privacy today and gain insights on future privacy practices

The Data Privacy and GDPR Handbook is an indispensable resource for Chief Data Officers, Chief Technology Officers, legal counsel, C-Level Executives, regulators and legislators, data privacy consultants, compliance officers, and audit managers.




Data Privacy and GDPR Handbook

Sanjay Sharma, PhD

with research associate Pranav Menon

Cover design: Prudence Makhura

Cover image: © Starline / Freepik.com

Cover painting: Om Prakash (1932-2019)

Copyright © 2020 by Sanjay Sharma. All rights reserved.

Published by John Wiley & Sons, Inc., Hoboken, New Jersey.

Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 646-8600, or on the Web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at www.wiley.com/go/permissions.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993, or fax (317) 572-4002.

Wiley publishes in a variety of print and electronic formats and by print-on-demand. Some material included with standard print versions of this book may not be included in e-books or in print-on-demand. If this book refers to media such as a CD or DVD that is not included in the version you purchased, you may download this material at http://booksupport.wiley.com. For more information about Wiley products, visit www.wiley.com.

Library of Congress Cataloging-in-Publication Data:

ISBN 978-1-119-59424-6 (cloth)

ISBN 978-1-119-59425-3 (ePDF)

ISBN 978-1-119-59419-2 (ePub)

ISBN 978-1-119-59430-7 (obk)

To my family and friends

CONTENTS

Cover

1 Origins and Concepts of Data Privacy

1.1 Questions and Challenges of Data Privacy

1.2 The Conundrum of Voluntary Information

1.3 What Is Data Privacy?

1.4 Doctrine of Information Privacy

1.5 Notice-and-Choice versus Privacy-as-Trust

1.6 Notice-and-Choice in the US

1.7 Enforcement of Notice-and-Choice Privacy Laws

1.8 Privacy-as-Trust: An Alternative Model


1.9 Applying Privacy-as-Trust in Practice: The US Federal Trade Commission

1.10 Additional Challenges in the Era of Big Data and Social Robots

1.11 The General Data Protection Regulation (GDPR)

1.12 Chapter Overview

Notes

2 A Brief History of Data Privacy

2.1 Privacy as One’s Castle

2.2 Extending Beyond the “Castle”

2.3 Formation of Privacy Tort Laws

2.4 The Roots of Privacy in Europe and the Commonwealth

2.5 Privacy Encroachment in the Digital Age

2.6 The Gramm-Leach-Bliley Act Tilted the Dynamic against Privacy

2.7 Emergence of Economic Value of Individual Data for Digital Businesses

2.8 Legislative Initiatives to Protect Individuals’ Data Privacy

2.9 The EU Path

2.10 End of the Wild West?

2.11 Data as an Extension of Personal Privacy

2.12 Cambridge Analytica: A Step Too Far

2.13 The Context of Privacy in Law Enforcement

Summary

Notes

3 GDPR’s Scope of Application

3.1 When Does GDPR Apply?

3.2 The Key Players under GDPR

3.3 Territorial Scope of GDPR

3.4 Operation of Public International Law

Notes

4 Technical and Organizational Requirements under GDPR

4.1 Accountability

4.2 The Data Controller

4.3 Technical and Organizational Measures

4.4 Duty to Maintain Records of Processing Activities

4.5 Data Protection Impact Assessments

4.6 The Data Protection Officer

4.7 Data Protection by Design and Default

4.8 Data Security during Processing

4.9 Personal Data Breaches

4.10 Codes of Conduct and Certifications

4.11 The Data Processor

Notes

5 Material Requisites for Processing under GDPR

5.1 The Central Principles of Processing

5.2 Legal Grounds for Data Processing

5.3 International Data Transfers

5.4 Intragroup Processing Privileges

5.5 Cooperation Obligation on EU Bodies

5.6 Foreign Law in Conflict with GDPR

Notes

6 Data Subjects’ Rights

6.1 The Controller’s Duty of Transparency

6.2 The Digital Miranda Rights

6.3 The Right of Access

6.4 Right of Rectification

6.5 Right of Erasure

6.6 Right to Restriction

6.7 Right to Data Portability

6.8 Rights Relating to Automated Decision Making

6.9 Restrictions on Data Subject Rights

Notes

7 GDPR Enforcement

7.1 In-House Mechanisms

7.2 Data Subject Representation

7.3 The Supervisory Authorities

7.4 Judicial Remedies

7.5 Alternate Dispute Resolution

7.6 Forum Selection Clauses

7.7 Challenging the Existing Law

Notes

8 Remedies

8.1 Allocating Liability

8.2 Compensation

8.3 Administrative Fines

8.4 Processing Injunctions

8.5 Specific Performance

Notes

9 Governmental Use of Data

9.1 Member State Legislations

9.2 Processing in the “Public Interest”

9.3 Public Interest and the Rights of a Data Subject

9.4 Organizational Exemptions and Responsibilities

9.5 Public Documents and Data

9.6 Archiving

9.7 Handling Government Subpoenas

9.8 Public Interest Restrictions on GDPR

9.9 Processing and Freedom of Information and Expression

9.10 State Use of Encrypted Data

9.11 Employee Data Protection

Notes

10 Creating a GDPR Compliance Department

10.1 Step 1: Establish a “Point Person”

10.2 Step 2: Internal Data Audit

10.3 Step 3: Budgeting

10.4 Step 4: Levels of Compliance Needed

10.5 Step 5: Sizing Up the Compliance Department

10.6 Step 6: Curating the Department to Your Needs

10.7 Step 7: Bring Processor Partners into Compliance

10.8 Step 8: Bring Affiliates into Compliance

10.9 Step 9: The Security of Processing

10.10 Step 10: Revamping Confidentiality Procedures

10.11 Step 11: Record Keeping

10.12 Step 12: Educate Employees on New Protocols

10.13 Step 13: Privacy Policies and User Consent

10.14 Step 14: Get Certified

10.15 Step 15: Plan for the Worst Case Scenario

10.16 Conclusion

Notes

11 Facebook: A Perennial Abuser of Data Privacy

11.1 Social Networking as an Explosive Global Phenomenon

11.2 Facebook Is Being Disparaged for Its Data Privacy Practices

11.3 Facebook Has Consistently Been in Violation of GDPR Standards

11.4 The Charges against Facebook

11.5 What Is Facebook?

11.6 A Network within the Social Network

11.7 No Shortage of “Code of Conduct” Policies

11.8 Indisputable Ownership of Online Human Interaction

11.9 Social Networking as a Mission

11.10 Underlying Business Model

11.11 The Apex of Sharing and Customizability

11.12 Bundling of Privacy Policies

11.13 Covering All Privacy Policy Bases

11.14 Claims of Philanthropy

11.15 Mechanisms for Personal Data Collection

11.16 Advertising: The Big Revenue Kahuna

11.17 And Then There Is Direct Marketing

11.18 Our Big (Advertiser) Brother

11.19 A Method to Snooping on Our Clicks

11.20 What Do We Control (or Think We Do)?

11.21 Even Our Notifications Can Produce Revenue

11.22 Extent of Data Sharing

11.23 Unlike Celebrities, We Endorse without Compensation

11.24 Whatever Happened to Trust

11.25 And to Security of How We Live

11.26 Who Is Responsible for Security of Our Life Data?

11.27 And Then There Were More

11.28 Who Is Responsible for Content?

11.29 Why Should Content Be Moderated?

11.30 There Are Community Standards

11.31 Process for Content Moderation

11.32 Prospective Content Moderation “Supreme Court”

11.33 Working with Governmental Regimes

11.34 “Live” Censorship

11.35 Disinformation and “Fake” News

11.36 Conclusion

Notes

12 Facebook and GDPR

12.1 The Lead Supervisory Authority

12.2 Facebook nicht spricht Deutsch

12.3 Where Is the Beef? Fulfilling the Information Obligation

12.4 Data Processing Purpose Limitation

12.5 Legitimate Interests Commercial “Restraint” Needed

12.6 Privacy by Design?

12.7 Public Endorsement of Personalized Shopping

12.8 Customizing Data Protection

12.9 User Rights versus Facebook’s Obligations

12.10 A Digital Blueprint and a GDPR Loophole

12.11 Investigations Ahead

12.12 Future Projects

Notes

13 The Future of Data Privacy

13.1 Our Second Brain

13.2 Utopian or Dystopian?

13.3 Digital Empowerment: Leveling the Playing Field

Notes

Appendix Compendium of Data Breaches

2014

2015

2016

2017

2018

Notes

About the Authors

Index

End User License Agreement

List of Illustrations

Chapter 4

Figure 4.1 The Hierarchy of Data Protection

Chapter 5

Figure 5.1 Principles of Processing, Breakdown

Figure 5.2 Special Categories of Data

Chapter 6

Figure 6.1 Balancing the Right to Object

Chapter 9

Figure 9.1 Treatment of Data Subjects’ Rights in Processing Carried Out under Official Aut...

Figure 9.2 Hierarchy of GDPR Objectives


1 Origins and Concepts of Data Privacy

Privacy is not something that I’m merely entitled to, it’s an absolute prerequisite.

— Marlon Brando

We generate enormous amounts of personal data and give it away without caring about our privacy.

Before the wake-up alarm rings on our smartphone, our heartbeats and sleeping patterns have already been recorded through the night by the app embedded in our wristwatch. We turn on our customized morning playlist on Spotify, read the headlines tailored to our interests on Apple or Google news, retweet on Twitter, upvote on Quora, register likes on WhatsApp, post a snapshot of the snow outside our window, and look up what our friends are up to on Facebook. We then check the weather forecast and ask Alexa to order cereal from Amazon. We are ready to go to work.

Unimaginable convenience for us commoners without a royal butler feels splendid. The invisible cost is that we are under constant surveillance whenever we use these services. All our choices, actions, and activities are being recorded and stored by the seemingly free technology-driven conveniences.

When we take an Uber or Lyft to work, our location and destination are known to them from previous trips. Today’s journey is also recorded, including the name of the driver and how we behaved – spilling coffee may show up on our passenger rating if the driver notices it. A smile and thank-you wave to the driver are worth five rating stars. Our choice of coffee at Starbucks may already be programmed and ready based on our past preferences. Each swipe of our credit card is imprinted into our purchase habits.

As we exit the car, a scarcely visible street camera is recording our movements and storing those records for the local city police. The recording of our actions continues as we turn on our computer at work. We read and respond to e-mails, order lunch online, attend video conference calls, and check on family and friends again. Before noon, we have generated innumerable data points on our laptops, tablets, phones, and wearables – with or without our conscious cognition or permission.

Everything that we touch through the make-believe cocoon of our computer, tablet, or smartphone leaves a digital trail. Records of our actions are used as revenue sources by data-gobbling observers in the guise of learning and constant improvement. In a different era, this level of voluntary access into our daily lives would have thrilled secret service organizations.

Numerous questions are raised in this fast-evolving paradigm of convenience at no cost: Whose data is it? Who has the rights to sell it? What is the value of the information that we are generating? Can it be shared by the Data Collectors, and, if so, under what circumstances? Could it be used for surveillance, revenue generation, hacking into our accounts, or merely for eavesdropping on our conversations? And, most importantly, can it be used to influence our thinking, decisions, and buying behavior?

Concerns regarding the privacy of our data are growing with advances in technology, social networking frameworks, and societal norms. This book provides a discourse on questions surrounding individual rights and privacy of personal data. It is intended to contribute to the debate on the importance of privacy and protection of individuals’ information from commercialization, theft, public disclosure, and, most importantly, its subliminal and undue influence on our decisions.

This book is organized across three areas: we first introduce the concept of data privacy, situating its underlying assumptions and challenges within a historical context; we then describe the framework of the General Data Protection Regulation (GDPR) and provide a systematic guide to it for individual businesses and organizations, including a practical guide for practitioners and unresolved questions; the third area focuses on Facebook, its abuses of personal data, corrective actions, and compliance with GDPR.

1.1 Questions and Challenges of Data Privacy

We illustrate the questions and challenges surrounding individual rights and privacy of personal data by exploring online dating and relationship-seeking apps such as match.com, eHarmony, and OK Cupid. To search for compatible relationships through these apps, users create their profiles by voluntarily providing personal information, including their name, age, gender, and location, as well as other character traits such as religious beliefs, sexual orientation, etc. These apps deploy sophisticated matching algorithms on individuals’ profiles to find suitable matches for dating and compatible relationships.

Online dating apps and platforms are now a global industry with over $2 billion in revenue and an estimated 8,000 sites worldwide. These include 25 apps for mainstream users, while others cater to unique profiles, special interests, and geographic locations. The general acceptance of dating sites is significant – approximately 40% of the applicable US population use dating sites, and it is estimated that half of British singles do not ask someone for a date in person. The industry continues to evolve and grow, with around 1,000 apps and websites being launched every year in the US alone.

Most dating sites and apps do not charge a fee for creating user profiles, uploading photos, and searching for matches. The convenience of these apps to users is manifold. They can search through the universe of other relationship-seekers across numerous criteria without incurring the costs and time of the initial exchange of information through in-person meetings. More importantly, dating apps lower the likelihood of disappointment when a prospective date turns out not to be interested.

1.1.1 But Cupid Turned Out to Be Not OK

In May 2016, several Danish researchers caused an outrage by publishing data on 70,000 users of the matchmaking/dating site OK Cupid. Clearly, the researchers had violated OK Cupid’s terms of use. The researchers’ perspective was that this information was not private to begin with. Their justification for not anonymizing the data was that users had provided it voluntarily by answering numerous questions about themselves. By registering on the dating service, the users’ motivation was to be “discovered” as individuals through a selection process by application of the matching algorithm. The information was available to all other OK Cupid members. The researchers argued that it should have been apparent to the users that other relationship-seekers and thus the general public could access their information – with some effort, anyone could have guessed their identities from the OK Cupid database.

This case raises the following legal and ethical questions:

Were the researchers and OK Cupid within their rights to conduct research on data that would be considered private by the users?

Did the researchers have the obligation to seek the consent of OK Cupid users for the use of their personal information?

Was it the obligation of OK Cupid to prevent the release of data for purposes other than dating?

If a legal judgment were to be made in favor of the users, how could the monetary damages be estimated?

What should a legal construct look like to prevent the use of personal data for purposes different from that which is provided by the users?

If users’ information in the possession of and stored by OK Cupid was illegally obtained and sold or otherwise made public, who is liable?

1.2 The Conundrum of Voluntary Information

As humans, we have an innate desire to share information. At the same time, we also want to be left alone – or at least have the autonomy and control to choose when and with whom we want to share information. We may disrobe in front of medical professionals, but it would be unthinkable in any other professional situation. Similarly, we share our tax returns with our financial advisors but otherwise guard them with our lives. We share our private information personally and professionally in specific contexts and with a level of trust.

This phenomenon is not new but takes on a different dimension when our lives are inextricably intertwined with the internet, mobile phone connectivity, and social networks. With the ease of information dissemination through the internet, anyone with a computer or a mobile phone has become a virtual publisher – identifiable or anonymous. The internet provides near-complete autonomy of individual expression and effortless interactions with commercial services to bring tremendous convenience to our daily lives. At the same time, our expectations of control over our privacy have become increasingly overwhelmed by the power of commercial interests to collect our personal data, track our activities, and, most alarming, to subliminally influence our thoughts and actions. The growing power of commercial and other nefarious interests to impact our lives would have been considered dystopian not too long ago.

We generally understand that once we voluntarily share information with someone else, we lose control over how it can be used. However, two questions remain unanswered: Do we truly realize the extent to which our personal data is being monitored? What level of control and rights do we have over our personal information that is generated through our activities and involuntarily disclosed by us? As an example, mapping our driving routes to avoid traffic jams or ordering a taxicab to our location through an app on our mobile phones has become indispensable. This capability requires that our mobile phones act as monitoring devices and record our every movement with technological sophistication that would make conventional surveillance mechanisms look quaint. However, we would chafe at the notion of being asked to carry a monitoring device in the context of law enforcement, societal surveillance, or even as part of a research project.

The mechanisms for sharing information and their abuse are exponentially greater than in the days of print journalism and the school yearbook. Fast-evolving technology platforms are making our lives efficient and convenient, but these technologies require us to share personal information. Entities that receive and collect our data can use it to foster their commercial and sometimes nefarious interests. Our personal data can be abused through a multitude of ways that are becoming easier to execute – making it more profitable for commercial interests and more effective for law enforcement.

We need rigorous regulatory and legal mechanisms to govern how our information is used, regardless of whether it is provided voluntarily or otherwise. However, this is a very hard challenge because artificial intelligence and big data technology frameworks are constantly and rapidly evolving and can be easily mutated to circumvent regulations. Lawmakers are increasingly recognizing and adapting to these realities by laying the groundwork for legal frameworks to protect our privacy. Their challenge is that regulations for protecting individuals’ data privacy should foster technology-driven personal convenience and not stifle ethical commercial activities and interests.

1.3 What Is Data Privacy?

1.3.1 Physical Privacy

Data privacy as a concept did not exist until the late twentieth century, with the birth of the internet and its exponential rate of adoption through computers and mobile phones. Until that time, privacy largely applied to physical existence and information as it related to an individual,1 his home,2 documents,3 and personal life. The concept of privacy comes from a Western school of thought and blossomed through common law, having its first roots in defenses against state action and privacy torts. Conflicts in this construct had mainly arisen in matters relating to journalism and state encroachment into the private life of citizens.

But how would the right to be left alone doctrine fare in a world where people willingly share private information in the public domain? How would the privacy of correspondence apply when documents are intangible, and conversations can be observed by hundreds of our friends? Is data an extension of ourselves and our private lives, or is it a commodity to be exchanged in a contract?

1.3.2 Social Privacy Norms

The traditional concept of privacy is centered around shielding ourselves and our activities from outsiders. It has the notion of secrecy. We associate personal privacy with “get off my yard” or “closing the blinds of our homes” to prevent outsiders from looking in. In business settings, privacy is associated with discussions and decisions “behind closed doors.”

However, we readily disrobe behind a flimsy curtain in a clothing store without doubting if there is a secret camera. We hand over our suitcases for security inspection; we provide our Social Security numbers over the phone to our bank or insurance providers without asking for our rights to privacy. We may join a discussion group or a loyalty program and freely express our views. Concerns regarding our privacy hardly ever prevent us from providing our most intimate information to strangers.

In this construct, the roots and norms of privacy are based on social frameworks. The boundary of sharing information rests on who we have a relationship with (formal or informal) and who we trust. This implies a fiduciary responsibility from the individuals with whom we have shared the information; e.g. we trust that banks, security personnel, health insurers, etc., will not share our data with anyone without our explicit permission. Across all these situations, sharing is necessary and our trust in information receivers is inherent, but we provide it in specific contexts.

1.3.3 Privacy in a Technology-Driven Society

As technologies evolve, creating boundaries in the current societal environment is not an easy task by any means. We must think expansively to create a framework where the release, sharing, and use of our information is transparent, and discretion over it can be managed in our daily lives. It is relatively straightforward to create and enforce laws against premeditated and illegal use of our privacy or personal data, e.g. a hacker extracting confidential data through cyber intrusion – a clearly criminal activity akin to physical intrusion and theft.

This gets trickier when our private personal data may be used for public research (e.g. OK Cupid) or for targeted advertising. In addition, liability and the assessment of damages are uncharted territory for misuse when the underlying personal harm is nonmonetary and the attribution of liability is unclear. This also applies to the transfer and sale of our personal data collected by apps and internet service providers. This becomes more complicated when it concerns the mining and collection of data that we have provided inconspicuously through our browsing – what we view and buy, whom we are likely to vote for, and whom we may find to love. Abuses such as these have sparked the growth of the Doctrine of Information Privacy or “Data Privacy” in the modern age as an evolution of the traditional constructs of privacy in a “physical” or nondigital society.

1.4 Doctrine of Information Privacy

The use and mining of our personal data have existed from the time the first census was conducted. Researchers have used personal data for ages, but by and large without a commercial motive. With the advent of the internet and mobile technology, the pace and volume of personal data collection have grown exponentially. At the same time, it has become enormously valuable and is even traded in secondary markets like a commodity.

1.4.1 Information Sharing Empowers the Recipient

Through the disclosure and sharing of personal information, we intrinsically empower its recipients. This is most visible in doctor-patient (particularly for psychiatric conditions) and attorney-client information sharing. In journalism, it is a well-established and understood norm that “off the record” conversations are not attributed to the provider of information or commentary.

We understand this and exercise our contextual discretion by limiting the sharing of our professional compensation with our close family, supervisors, and human resources departments, and not always with our friends or work colleagues. We do not allow medical professionals to share our health information with our accountants or vice versa.

We have always cherished our rights and discretion privileges to limit the sharing of our personal information. Yet we continually provide information over the internet through mouse clicks and swipes and allow its unfettered usage.

1.4.2 Monetary Value of Individual Privacy

Across both our physical and digital existence, our right to our personal data and privacy is essential for our individuality and ownership of our thoughts and emotions. Historically, laws have considered health care, financial information, including tax filings, and other records to have enforceable rights to privacy. Ideally, these rights should extend to any form of data – even if it is seemingly innocuous. This includes data regarding our movements, events, and buying behavior.

The construct of intrusion of physical, property-based information has become the generally accepted construct of privacy. This is not entirely misplaced or ineffective. However, it can be argued that principles of privacy intrusion based on physical space can actually harm the right to privacy. This is because the decline of personal information as a property right raises the question: what is the monetary value of an individual’s or a group’s personal information? For instance, consider the case of US airline JetBlue Airways, wherein the company had shared some of its customers’ information with a third party; a federal court rejected a breach of contract claim. The customers’ case was that JetBlue had violated the obligations stated in its privacy policy. The court stated that even if it were assumed that a privacy policy could be interpreted as a contract, JetBlue’s customers could not identify the damages, and thus there was no support for the proposition that their personal information had any value. This can be significantly constraining in developing an effective legal framework to protect our data privacy.

1.4.3 “Digital Public Spaces”

The construct of intrusion of privacy in public spaces by traditional media – photographs and news stories citing individuals’ specific traits, behavior, or life events – does not always extend to cyberspace. In the absence of monetizability of damages, judicial systems and policy makers tend to consider data privacy less worthy of legal protection than similar intrusions of physical space. In the case of cyber harassment, online intrusions of privacy, blatant theft, and even attacks are viewed as eminently preventable ex ante or stopped after the fact by shutting down a personal account or a web service.

The most significant limitation of the construct of physical privacy is the implied definition of “digital public spaces.” Individuals’ rights to the privacy of their data should be applicable irrespective of the means of its acquisition or storage location. Privacy rights should not be conditioned by where individual data is stored. Privacy applies to the information and not where it resides or is derived from. This has direct implications for big data and machine learning techniques that isolate and predict our behavior based on collective data that, in the physical sense, is analogous to a public space.

Individuals provide data to retailers and other service providers as a necessity by virtue of their usage. This is unavoidable. The construct of privacy as seclusion from the public domain would imply two things – first, that the individual data provider has released the data into the public domain and has anonymized that data; second, that the distinction between public and private space in the digital domain cannot be well defined.

The perfect framework to regulate data privacy should enable us to control what, why, when, and with whom we share information, and how it will be used. This framework should allow us to revoke the continued usage of information through a collective choice or specifically for each entity. There should not be normative judgments regarding which data is important, or the context in which it is disclosed. The right to privacy is an individual choice including how, whether, and when anyone can use an individual’s information that may be voluntarily provided or extracted.

1.4.4 A Model Data Economy

We have to create an environment where information willingly provided by us or extracted through our activities is not exploited for commercial or nefarious purposes without our thorough understanding and express time-bound permission determined by us. In addition, information that we consider truly private should not be released, re-created, or deconstructed. Researchers, governmental bodies, and businesses – be they social networks, search engines, or online advertisers – should not be able to use individual data under the legal representation that it is voluntarily provided or can be (indiscriminately) accessed through public sources.

A societal and legal framework for privacy should not encourage individual withdrawal from making connections and interacting with others. Rather, it should be designed to enable us to govern our private and public existence and contextual disclosure and usage of our private information. It should prevent any framework or mechanism from manipulating us into disclosing more information than we intend to, and once disclosed, to prevent its use in ways that may not have been represented to us ex ante. This framework must be legally enforceable with penalties for knowingly or otherwise violating the law or guidelines in any form.

Creating a legal framework for protecting the privacy of our personal information is a daunting task. While we must share our information for technologies, businesses, and societies to flourish and governments to function, we should also be aware of the collection of our data and its usage. As new information is being revealed about how Facebook provided access to user data, it is becoming shockingly apparent how providers can abuse data, and the extent to which they can manipulate our thinking and decision making.

For our social structures to persist and global commerce to thrive, we must trust collectively created frameworks in which there are legal standards to prevent prohibited or cavalier use of our information and with associated liabilities for its abuse. At the very least, this would encourage trusting relationships between providers and users of our personal data. This is indeed a momentous task that requires thoughtful and comprehensive laws through the participation of legal and social scholars, legislatures, and governmental and regulatory bodies.

With fast-evolving technology and the internet of things (wherein our physical beings and surroundings are wired and connected with transmitters) around the corner, societies face a collective choice. We cannot let our rights to privacy be squandered away for the sake of convenience. A fine line has to be drawn between laws that are so onerous that they impede commerce and our own conveniences, and those that guard our privacy against the exploitation of our likes, habits, and thoughts.

1.5 Notice-and-Choice versus Privacy-as-Trust

Notice-and-choice is based on the legal doctrine that as long as a data-collecting entity provides notice and discloses the specificity of the data they collect from a subscriber of the service, and how it will be used, we as data providers have sufficient information and discretion ex ante to make our choice/consent as to whether or not to interact and provide our information. This construct is inadequate because in our day-to-day lives, our information sharing is selective and contextual, and applies differentially. In addition, it is impractical for us to study a long disclaimer and terms of engagement with the entity that is collecting our information every time we click “I agree.” There are several other reasons why this construct is inadequate.

The bottom line for our innate human trait to share information is that our actions to do so are contextual and based on trust. From a legal perspective, the paradigm of trust is based on a time-tested model of fiduciary law wherein the personal data-collecting entity is innately powerful once it has collected the data, making one party vulnerable to the other; the entity with more power or control is legally required to act in the vulnerable party’s best interest. Once again, the doctor-patient relationship is a classic example.

A construct of trust between providers and users of personal data could serve as a foundational component for design and enforcement of regulation. However, the concept of trust is hard to govern and enforce in practice. This is because our information has enormous economic value that would inevitably lead to its abuse by its collectors, intermediaries, and other agents in the process. The construct in which we have indelible trust in the data receiver and the aggregator will only be achieved when there is an “inform-and-consent” framework that is in place with strong deterrence for breach of trust.

1.6 Notice-and-Choice in the US

In the US, the notice-and-choice legal construct has a long history. The Fair Information Practice Principles (FIPPs), developed from a 1973 report by the US Department of Health, Education, and Welfare (HEW), are the foundation of notice-and-choice. Since such government agencies are privy to extensive personal data, HEW recommended that the agencies be required to make their data-use practices public, i.e. provide “notice.” Thus, in theory, individuals may or may not consent to those agencies using or sharing that data.

The Federal Trade Commission (FTC) brought its recommendation of notice to the US Congress, emphasizing its importance as a component of the FIPPs. Since then, notice has been the framework for how legal obligations are placed upon companies, particularly online. There is, however, no comprehensive federal law in place that codifies the recommendations of the 1973 FIPPs report. Laws vary across states and industry sectors and are thus frequently modified. In contrast, the EU and Canada have more comprehensive laws in existence.

One of the most important and widely enforced examples of sector-specific statutes is the Health Insurance Portability and Accountability Act (HIPAA), which protects users’ medical and healthcare information. The Gramm-Leach-Bliley Act is similar with respect to the financial sector. The statute that regulates activity specific to the internet is the Children’s Online Privacy Protection Act (COPPA), which, among other protections, prohibits the unauthorized use, collection, and dissemination of information about children under 13 years of age. Most if not all of these acts deal with notice as their basis, but not necessarily with the protection of individual data privacy.

In the US, state attorneys general have pressed for notice-and-choice along with the FTC. The California Online Privacy Protection Act (CalOPPA) was the first state law to require commercial websites to provide their users in the state with privacy disclosures. These disclosures include, generally, what information is collected, with whom it might be shared, and how users will be notified about the company’s data-use practices. Similarly, in 2003, California enacted the “Shine the Light” law, which allows residents to obtain information from companies regarding their personal information that has been shared with third parties, including agencies.

In New York, the Internet Security and Privacy Act also requires state agencies to provide the “what-when-how” of their own data-use policies. The provisions are essentially identical to California’s trailblazing laws except that they are applied to New York’s state agencies’ websites. Connecticut and Michigan have similar frameworks, but they apply to any person or entity that files a person’s Social Security number. In Utah, the Government Internet Information Privacy Act requires notice of the what-when-how as well. Some examples of these what-when-hows (varying across states) of notice requirements on commercial and government websites are listed below, followed by a structured sketch of how such a checklist might be recorded:

Statement(s) of any information the entity will collect

How the information is collected

The circumstances under which such collected information will be disclosed to the user

A description of the process by which the operator notifies of changes to the privacy policy to the user

Whether and what information will be retained

The procedures by which a user may gain access to the collected information
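For compliance teams that track these notice obligations programmatically, the what-when-how elements above can be captured as a simple structured record. The following sketch is purely illustrative: the class name, field names, and example values are our own assumptions mirroring the checklist in the text, not terms drawn from CalOPPA, New York’s Internet Security and Privacy Act, or any other statute.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PrivacyNoticeChecklist:
    """Hypothetical record mirroring the 'what-when-how' notice elements listed above."""
    information_collected: List[str]     # statement(s) of any information the entity will collect
    collection_methods: List[str]        # how the information is collected
    disclosure_circumstances: List[str]  # circumstances under which collected information is disclosed
    change_notification_process: str     # how the operator notifies users of privacy policy changes
    retention_policy: str                # whether and what information will be retained
    access_procedure: str                # how a user may gain access to the collected information

# A hypothetical entry for an imaginary website, for illustration only.
example = PrivacyNoticeChecklist(
    information_collected=["email address", "browsing history"],
    collection_methods=["registration form", "cookies"],
    disclosure_circumstances=["service providers", "legal requests"],
    change_notification_process="email notice sent before changes take effect",
    retention_policy="retained for 24 months after account closure",
    access_procedure="request through the account settings page",
)
print(example.information_collected)
```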

1.7 Enforcement of Notice-and-Choice Privacy Laws

The Federal Trade Commission has brought actions against entities that it contends did not comply with applicable privacy laws. In each of the following cases, the company allegedly failed to provide adequate notice of its data-use practices. Once again, the FTC and the corporate entities centered their complaints and settlements on the idea of notice. These can be referred to as “broken promises” actions.

1.7.1 Broken Trust and FTC Enforcement

In 2002 Eli Lilly and Company (Lilly) agreed, per the FTC website: “to settle FTC charges regarding the unauthorized disclosure of sensitive personal information collected from consumers through its Prozac.com website. As part of the settlement, Lilly will take appropriate security measures to protect consumers’ privacy.”4

Eli Lilly allowed users of Prozac.com to sign up for e-mail alerts reminding them to take and/or refill their prescriptions. The e-mails were personalized with data entered by each user. In 2001, a Lilly employee sent an e-mail to the service’s users alerting them that it would be discontinued. The “To:” line of that message included all 669 of the users’ e-mail addresses, thereby making the users’ medical information public. The FTC’s complaint alleges that “Lilly’s claim of privacy and confidentiality was deceptive because Lilly failed to maintain or implement internal measures appropriate … to protect sensitive consumer information.”5 Lilly settled and was ordered to provide proper notice of its data-use practices.

More examples include a case in which the FTC alleged that GeoCities – an internet web-based service – expressly violated its own privacy policy by selling its customers’ personal information. GeoCities settled and was required to comply with notice-and-choice guidelines. The FTC also took action against Frostwire, LLC, alleging that the company misled its customers into believing that certain files would not be accessible to the public when they actually were, and that Frostwire failed to explain how the software worked. Lastly, in a case against Sony BMG Entertainment, Sony did not notify its customers that software installed on certain CDs could transmit users’ music-listening data back to Sony. Once again, the company settled and was ordered to comply with notice-and-choice-style privacy practices in the future.

1.7.2 The Notice-and-Choice Model Falls Short

In theory, the notice-and-choice model assumes that if a data-collecting entity provides all the information required to inform users regarding the potential use of their personal data, they can freely make their own autonomous decisions regarding their privacy. It is based on the ideals of internet pioneers and cyberlibertarians (advocates for the use of technology as a means of promoting individual or decentralized initiatives, and less dependence on central governments).

However, notice-and-choice as a model for the law is inadequate because of several factors. First, the notion of autonomous decision making by an internet user has not turned out to be effective in practice. Second, the idea that users could remain fully anonymous has now been proven false. Most aspects of our lives are monitored; our activities are tracked and recorded. Our online experience is directed by artificial intelligence and complex algorithms in myriad ways.

Over the past two decades, notice-and-choice–based data privacy laws in the US have generally been pieced together as reactions to previous breaches of trust by companies and agencies against the “vulnerable” parties in the relationship. The laws themselves are based on somewhat arbitrary findings from the 1973 FIPPs report. This legal framework has led to administrative challenges, with companies having to navigate a maze of rules that vary across states, sectors, and the federal level.

Differing laws mandate that company websites follow privacy policy guidelines that fall short of creating fairness on both sides of the company/user relationship. Usually the policies are confusing, lengthy, full of legal jargon, and as a result are read infrequently. Studies have found that the average internet user would spend 244 hours per year reading them. There is a growing body of literature addressing the monetary value of our time as well. According to one study, the average worker’s time would cost more than $1,700 per year just to skim privacy policies.6
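The arithmetic behind such estimates is straightforward: the number of policies a user encounters, multiplied by the time each one takes, multiplied by the value of an hour of that user’s time. The sketch below only illustrates this structure; the policy count, reading time, and hourly wage are assumptions of ours, not figures taken from the studies cited above, so the dollar result will differ from the $1,700 skimming estimate.

```python
# Back-of-the-envelope estimate of the time and money cost of reading privacy policies.
# All inputs are illustrative assumptions, not figures from the studies cited in the text.

policies_per_year = 1460   # assumed: about four new or updated policies encountered per day
minutes_per_policy = 10    # assumed: average time to read a single policy
hourly_wage = 25.0         # assumed: value of one hour of the reader's time, in USD

hours_per_year = policies_per_year * minutes_per_policy / 60
annual_cost = hours_per_year * hourly_wage

print(f"Hours per year spent reading policies: {hours_per_year:.0f}")   # ~243 hours
print(f"Implied annual cost of that time: ${annual_cost:,.0f}")         # ~$6,083
```

Under these assumptions the time burden lands close to the 244 hours cited above; the dollar figure swings widely with the wage assumed and with whether policies are read in full or merely skimmed.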

Notice-and-choice puts the bulk of the responsibility for protecting privacy in the hands of the consumer rather than the powerful Data Collectors. Furthermore, once individuals agree to a privacy policy and disclose their data, they have little or no control over how it is used. Tech companies that have access to their users’ personal information should be legally required to handle that information with the highest level of trust. The current set of laws depends on the idea that if a company notifies its users of some of the what-when-how of its data-collection practices, users can then make educated decisions about whether or not to share their personal information. This model is flawed in myriad ways, from the very basis of the theory all the way through to the logistical implementation of the resultant laws.

1.8 Privacy-as-Trust: An Alternative Model7

Online social networks are rooted in trust. This ranges from run-of-the-mill daily interactions with family, relatives, and friends to sharing information with strangers who have reciprocated or may reciprocate with us. Trust is the expectation that receivers of our information will not share it for their own interests and uses, commercialize it, or share it for other nefarious ends. If it is used for commercialization, information providers should expect consideration in the form of revenue-sharing fees.

The presumption of trust is at the core of our decisions to share our personal information with others. In the technologically driven online framework, the uses of our personal information include:

National security and law enforcement

Storing our data to provide convenience services

Commercialization of our information – selling information for commerce

Intrusion or theft of data

Influencing our thinking and decisions

The notion of Big Brother knowing everything about us with nonblinking eyes has persisted with time and has become more pronounced with the explosion in online connections and communication. It was originally enforced by law to monitor and ascertain allegiance to the ruling regime. This instrument was not sugarcoated under the guise of free services that foster trust.

The governmental Big Brother and his watching mechanisms are paid for with public funds. The means to that end is to ensure law enforcement and, in oppressive regimes, to observe the allegiance and loyalty of the regime’s subjects. In contrast, Facebook, Amazon, Google, and other online Data Collectors do not have a Big Brother–like oppressive persona. In the guise of making life easier for their users, they provide seemingly cheap/free seamless services – be it finding a restaurant in our neighborhood at the very thought of hunger, selecting a movie to watch, or just finding our misplaced phone.

Privacy-as-trust is based on fiduciary law (fiducia is Latin for trust). Most agree that Data Collectors have “asymmetrical power” over the average consumer. Thus, according to common law fiduciary principles, Data Collectors should be held to higher standards when entrusted with our personal data. They should act based on common principles of trust. As opposed to contract law (the body of law that relates to making and enforcing agreements) or tort law (the area of law that protects people from harm from others), fiduciary law centers on a few special relationships wherein the fiduciary – the individual who holds the more powerful role in the relationship – has an obligation to act in the best interest of the other party. Examples of fiduciaries include investment advisors, estate managers, lawyers, and doctors. If a patient goes into surgery, their life is in the hands of the doctor.

Fiduciaries are entrusted with decisions about their clients’ lives and livelihoods. When we share our personal information, we should expect it to be handled equally responsibly. The implication is that a fiduciary relationship between data brokers and users would help fight the power imbalance that exists in online interactions and commerce, and that is growing exponentially.

In this construct, companies like Google, Facebook, and Uber should be considered fiduciaries because of internet users’ vulnerability to them. We depend on them, and they position themselves as experts in their fields and as presumably trustworthy. Our contention is that corporate privacy strategy should be about maintaining user trust. Privacy leaders within corporations would often prefer to position the company in terms of trust and responsibility, as opposed to creating policies that are designed to avoid lawsuits. They should go a step further and revisit their policies on a regular basis to keep up with clients’ fair and ever-changing expectations, which shift with clients’ understanding of the changing realities of internet privacy, or the lack thereof.

Many privacy policies on company websites are hard to read (generally in a light gray font), difficult to locate within their websites, confusing, and take too much time to review. Google continues to face criticism for its ability to track and record users’ locations with their Maps application, even when the “Location History” feature is turned off. At a Google Marketing Live summit in July 2018, Google touted a new feature called “local campaigns,” which helps retail stores track when Google ads drive foot traffic into their locations. They can also create targeted ads based on users’ location data. Once Google knows where you spend your time, nearby store locations can buy ads that target you directly. Even when users have turned off location history, Google can use mechanisms in their software to store your information. For speed and ease, most users allow Google to store their location history without serious consideration.8

Fiduciary companies should have further obligations to the individual data providers/customers than being limited to clarifying their privacy policies. They should agree to a set of fair information practices as well as security and privacy guarantees, and timely disclosure of breaches. Most importantly, they should be required to represent and “promise” that they will not leverage personal data to abuse the trust of end users. In addition, the companies should not be allowed to sell or distribute consumer information except to those who agreed to similar rules.

1.9 Applying Privacy-as-Trust in Practice: The US Federal Trade Commission

In this construct, US companies should not be allowed by the Federal Trade Commission (FTC) to induce individual data providers’ trust from the outset, market themselves as trustworthy, and then use that trust against us. As an illustration, Snapchat promoted their app as a way to send pictures to others that would only be available to the receiver for a preset time duration. However, there are ways for the viewer to save those pictures outside of Snapchat’s parameters, such as taking a screenshot. While the image is “ephemeral” within the Snapchat app, the company failed to mention that the image does not necessarily disappear forever. Under privacy-as-trust law, Snapchat would be in breach of their legal obligations as a trustee.

In the US, the FTC has substantial experience with “deceptive business practice” cases under Section 5 of the FTC Act of 1914, which prohibits unfair methods of competition and unfair or deceptive acts or practices that affect commerce. A good parallel to internet data collection could be drawn from the telemarketing industry. The Telemarketing Sales Rule states:

… requires telemarketers to make specific disclosures of material information; prohibits misrepresentations; … prohibits calls to a consumer who has asked not to be called again; and sets payment restrictions for the sale of certain goods and services.

– ftc.gov

In this rule, applying the clause “prohibiting misrepresentation” in general to digital data collection and commerce would be a profound change. Currently companies often use confusing language and navigation settings on their apps and websites to present their privacy policies. It can be argued that this is misrepresentation of their goods and services.

1.9.1 Facebook as an Example

The scope of Facebook’s role within the complex issues surrounding data sharing and privacy cannot be overstated. Learning from the failures of MySpace and Friendster, Facebook has clearly triumphed in the social media domain. This is partly due to the public relations prowess of Mark Zuckerberg, its founder and CEO, “especially in light of the maniacal focus on an advertising-dependent business model based on mining users’ data, content and actions.”9

According to the Pew Research Center on February 1, 2019, approximately 68% of US adults use Facebook and three-quarters of those users visit the site at least once per day. However, 51% of those users state that they are uncomfortable with “the fact that the company maintains a list of the users’ traits and interests.” In addition, 59% of users said that the advertising on their NewsFeeds accurately reflected their interests (Pew). Ads on Facebook are seamlessly mixed with and appear in exactly the same format as our friends’ posts, with the exception of the word “Sponsored” in a tiny, light gray font. If a friend “Likes” one of these sponsored posts, Facebook will alert us of that, and we are more likely to click on it, since we trust the friend. Once the algorithm has been proven to work, Facebook can charge more for their advertising real estate and continue to dominate the social media market. It is very likely that these types of strategies and constant honing of data analysis to target their users would violate privacy-as-trust.

There are no real alternatives for avoiding these formulas; other big social media sites like Instagram use similar tactics. As a counterpoint, a start-up company like Vero stores usage stats but only makes them available to the users themselves. However, Vero has only about a million users – it is unlikely that you will find your friends and family on it.

The FTC could intervene through several straightforward mechanisms. While eliminating third-party advertising altogether would be a heavy-handed and unlikely action, the FTC could push for design changes that make it easy for users to spot advertising. Facebook could simply be prohibited from using personal data to create targeted ads. The FTC’s deceptive practices actions have been broad, and there are legal precedents for this. Any website exploiting personal data against the interests of its users could fall under the FTC’s existing authority, helping to balance the power between users and Data Collectors.

1.10 Additional Challenges in the Era of Big Data and Social Robots

The growing use of “social robots” adds a significant challenge to the data privacy debate. Social robots use artificial intelligence to interact and communicate with humans and possibly with their brethren. They require massive amounts of data to be effective. They learn from us through our choices and actions on their platforms, e.g. Facebook. By using their platforms, we feed them our data and train them. In turn, they increasingly evolve their abilities to influence our thoughts and decisions. This develops into a vicious cycle.

This phenomenon does not stop with our clicks and swipes. Social robots can also utilize data from our physical appearances. For example, robotic shopping assistants in the form of algorithms have been designed to keep track of our past purchases and recommend future buying. When sellers program robots to suggest weight-loss or wrinkle cream products based on appearance, the possibility of data-based discrimination with respect to sex, age, and race will be unavoidable.

1.10.1 What Is a Social Robot?

In order to address this challenge from a legal perspective, the term “social robot” should be defined. There are numerous examples in fiction and popular culture – Rosie from The Jetsons, C3PO, Wall-E, and, going all the way back, mythological legends of bronze statues coming to life. These myths have become a near virtual reality – Rosie, the Jetsons’ memorable housekeeper, is the closest to existing social robots.

The current generation of social robots utilizes programmatic actions and has limited human-level autonomy, as opposed to C3PO, a more relatable, human-like character who, while possessing “robotic” vocal and mechanical qualities, also comes with his own set of human emotions.

The legal definition of social robots should be characterized by the following traits/capabilities:

Embodied (they have a physical form, not merely software)

Emergent (they learn and adapt to changing circumstances)

Social valence (they are thought of as more than an object and have the ability to elicit emotional and social responses from their users)

1.10.2 Trust and Privacy

Because of our innate need for socialization, we are predisposed to anthropomorphize even inanimate objects. Playing to this vulnerability, robots are designed to resemble humans in appearance, traits, and the aura of their movements. Researchers have provided examples of humans bonding with robots and experiencing feelings of love, with some even preferring the company of robots over that of human beings.

Social robots are programmed to be more responsive and predictable than humans. They trigger our predisposition to relate to them on a human level to the point that they gain our trust. We are likely to develop greater trust in social robots than in humans. Trust leads to dependency, with susceptible consumers willing to spend unreasonably large amounts of money to keep them “alive” and functioning.

As our reliance on social robots to conduct our daily lives grows, we allow them to share our data – playing into the inherent mission of companies that create and deploy them.

Traditional constructs of privacy are based on individual separation, autonomy, and choice. As we choose to interact with technology that induces us to provide increasing amounts of data to feed social robots, how do we remain separate from it? Through our trust in social robots we are thus made increasingly vulnerable to the companies that create these artificial intelligence/social robot technologies.

1.10.3 Legal Framework for Governing Social Robots

It is extremely challenging for privacy policies to protect users from big data’s algorithmic targeting and the predictive analytics that drive social robots. This is because intellectual property laws protect the companies’ algorithms, and thus consumers cannot be provided with sufficient information or notice to make informed choices about allowing a company or website to access, store, or share their data. Social robots are not humans; they are virtual machines driven by software to collect our data by inspiring trust and inducing us to drop our privacy guards.