Data Conscience

Brandeis Hill Marshall

Description

DATA CONSCIENCE: ALGORITHMIC SIEGE ON OUR HUMANITY. EXPLORE HOW DATA STRUCTURES CAN HELP OR HINDER SOCIAL EQUITY.

Data has enjoyed "bystander" status as we've attempted to digitize responsibility and morality in tech. In fact, data's importance should earn it a spot at the center of our thinking and strategy around building a better, more ethical world. Its use—and misuse—lies at the heart of many of the racist, gendered, classist, and otherwise oppressive practices of modern tech.

In Data Conscience: Algorithmic Siege on Our Humanity, computer science and data inclusivity thought leader Dr. Brandeis Hill Marshall delivers a call to action for rebel tech leaders who acknowledge and are prepared to address the current limitations of software development. In the book, Dr. Marshall discusses how the philosophy of "move fast and break things" is, itself, broken and requires change. You'll learn about the ways that discrimination rears its ugly head in the digital data space and how to address them with several known algorithms, including social network analysis and linear regression.

A can't-miss resource for junior-level to senior-level software developers who have gotten their hands dirty with at least a handful of significant software development projects, Data Conscience also provides readers with:

* Discussions of the importance of transparency

* Explorations of computational thinking in practice

* Strategies for encouraging accountability in tech

* Ways to avoid double-edged data visualization

* Schemes for governing data structures with law and algorithms


Page count: 565

Publication year: 2022




Table of Contents

Cover

Title Page

Foreword

Introduction

Part I: Transparency

Note

CHAPTER 1: Oppression By…

The Law

The Science

Summary

Notes

Recommended Reading

CHAPTER 2: Morality

Data Is All Around Us

Morality and Technology

Misconceptions of Data Ethics

Limits of Tech and Data Ethics

Summary

Notes

CHAPTER 3: Bias

Types of Bias

Before You Code

Bias Messaging

Summary

Notes

CHAPTER 4: Computational Thinking in Practice

Ready to Code

Algorithmic Justice Practice

Code Cloning

Summary

Notes

Part II: Accountability

Note

CHAPTER 5: Messy Gathering Grove

Ask the Why Question

Collection

Reformat

Summary

Notes

CHAPTER 6: Inconsistent Storage Sanctuary

Ask the “What” Question

Files, Sheets, and the Cloud

Modeling Content Associations

Manipulating with SQL

Summary

Notes

CHAPTER 7: Circus of Misguided Analysis

Ask the “How” Question

Misevaluating the “Cleaned” Dataset

Overautomating k, K, and Thresholds

Not Estimating Algorithmic Risk at Scale

Summary

Notes

CHAPTER 8: Double-Edged Visualization Sword

Ask the “When” Question

Critiquing Visual Construction

Pretty Picture Mirage

Summary

Notes

Part III: Governance

Note

CHAPTER 9: By the Law

Federal and State Legislation

International and Transatlantic Legislation

Regulating the Tech Sector

Summary

Notes

CHAPTER 10: By Algorithmic Influencers

Group (Re)Think

Flyaway Fairness

Moderation Modes

Summary

Notes

CHAPTER 11: By the Public

Freeing the Underestimated

Learning Data Civics

Condemning the Original Stain

Tech Safety in Numbers

Summary

Notes

APPENDIX A: Code for app.py

A

B

C

D

APPENDIX B: Code for screen.py

A

B

C

APPENDIX C: Code for search.py

A

B

C

D

APPENDIX D: Pseudocode for faceit.py

APPENDIX E: The Data Visualisation Catalogue's Visualization Types

APPENDIX F: Glossary

Index

Copyright

Dedication

About the Author

About the Technical Editor

Acknowledgments

End User License Agreement

List of Tables

Chapter 1

Table 1.1: Egyptian numeral system

Chapter 2

Table 2.1: Similarities between tech ethics and human ethics

Table 2.2: Similarities among data, tech, and human ethics

Chapter 3

Table 3.1: Data ethics principles and their motivating questions

Table 3.2: Synthetic (fake) employee applicant data from résumés

Table 3.3: A skills listing in order of greater to lesser experience

Table 3.4: A skills listing with no further indicators of proficiency (by de...

Chapter 4

Table 4.1: Word frequency count for Computational Thinking

Chapter 5

Table 5.1: Data sourcing questions

Table 5.2: Handling missing or unknown data

Chapter 6

Table 6.1: Data manipulation questions, part 1

Table 6.2: Synthetic (fake) résumé data from Chapter 3

Table 6.3: Hashtags, Blacktags, and keywords for the Black Twitter Project

Chapter 7

Table 7.1: Data manipulation questions

Table 7.2: Output after fullMovies.describe() code execution

Chapter 8

Table 8.1: Data interpretation questions

Chapter 10

Table 10.1: Rocchio's algorithm explained (simply?)

Chapter 11

Table 11.1: Data civics stages

Table 11.2: Actual data industry representation by ethnicity/gender demograp...

Table 11.3: U.S. population representation by ethnicity/gender demographic

Table 11.4: Troll subversion recommendations by social platforms

List of Illustrations

Chapter 1

Figure 1.1: A quoted tweet referencing the ‘Negro Women To Be Put To Work’ a...

Figure 1.2: (a) Single image of face from Turk and Pentland's 1991 facial re...

Figure 1.3: Mugshot of Alphonse Bertillon (1853–1914) with a reference card ...

Chapter 2

Figure 2.1: ACM 2020 Computer Science Curriculum Guide's recommendation on s...

Figure 2.2: ACM 2020 Computer Science Curriculum Guide's prioritization of s...

Figure 2.3: Examples of high-resolution images that are downscaled to low-re...

Chapter 3

Figure 3.1: Timeline of Amazon's AI hiring software product

Figure 3.2: Summary of Margaret Mitchell's “Bias in the Vision and Language ...

Figure 3.3: Parts of a Car Wheel

Figure 3.4: Illustration of how a wheel infrastructure transfers impact

Figure 3.5: Four potential bias wheels covering business, human, data and al...

Figure 3.6: Sample résumés returned from a Google image search

Figure 3.7: Snapshot of online résumé entity-relationship diagram

Chapter 4

Figure 4.1: Concept map of how I wash my locs

Figure 4.2: Sampling of computing services and platforms costs by max data c...

Figure 4.3: ML-based résumé screening system on GitHub

Figure 4.4: Skeleton structure of ML-based résumé screening system.

Chapter 5

Figure 5.1: Example of website cookie acceptance request (top) and cookie se...

Figure 5.2: Sample Google engine result of phrase “open source dataset”

Figure 5.3: PeopleFinders results for Brandeis Hill Marshall

Figure 5.4: Snippet of Data.gov dataset (uncleaned)

Figure 5.5: Snippet of retooled Data.gov's dataset (cleaned)

Figure 5.6: Snippet of Twitter data scraped using Tweepy API

Figure 5.7: Snippet of streamlined Twitter code in CSV format

Chapter 6

Figure 6.1: Snapshot of online résumé ERD

Figure 6.2: Twitter data usage for research purposes (June 2015)

Figure 6.3: Applicant demographic identities

Figure 6.4: (Left) A DBA's view and (right) an applicant's view of demograph...

Figure 6.5: An expanded ERD representation

Figure 6.6: General syntax of the SELECT statement used in DML

Figure 6.7: SQL statement processing order

Chapter 7

Figure 7.1: AI, ML, and DL quick explanations

Figure 7.2: Descriptive statistics of ratings column from Kaggle

Figure 7.3: Linear regression of UserID and MovieID

Figure 7.4: Linear regression of Rating and MovieID

Figure 7.5: Linear regression of Timestamp and MovieID

Figure 7.6: Ordinary least squares outputs using Timestamp and MovieID

Figure 7.7: Translucent Brandeis and Visible Brandeis using Zoom Background ...

Figure 7.8: Examples of deepfake technology

Figure 7.9: Deepfake of Jordan Peele as former U.S. President Barack Obama...

Figure 7.10: Princess Leia remastered using deepfake technology in the film ...

Figure 7.11: Deep Nostalgia animates the faces in your family photos with am...

Chapter 8

Figure 8.1: Content consumes our lives.

Figure 8.2: Gaping Void's representation of data, information, knowledge, in...

Figure 8.3: My January 5, 2022 Twitter post

Figure 8.4: A choropleth map of the United States indicating the states with...

Chapter 9

Figure 9.1: State data privacy legislation status (as of March 2022)

Figure 9.2: Black population distribution according to U.S. Census 2020

Figure 9.3: Data, some of the industries it affects, and the AI-based tools ...

Chapter 10

Figure 10.1: False narration of the Black tech pipeline problem

Figure 10.2: How content moderation works

Figure 10.3: Original tweet post by CNN's Rick Santorum saying that “We birt...

Figure 10.4: Twitter response to alleged community code violation: “For thos...

Chapter 11

Figure 11.1: Data civics Stage 1: people's relationship to data

Figure 11.2: The data life cycle and surrounding data ecosystem

Figure 11.3: Profit Without Oppression's Guiding Principles, by Kim Crayton...

Figure 11.4: Adapted from “The Chronicle of the Problem Woman of Colour in a...

Appendix E

Figure E.1: The following data visualizations are displayed: error bars, flo...

Figure E.2: The following data visualizations are displayed: pie chart, poin...

Figure E.3: The following data visualizations are displayed: arc diagram, ar...

Figure E.4: The following data visualizations are displayed: timetable, tree...

Guide

Cover

Title Page

Copyright

Dedication

About the Author

About the Technical Editor

Acknowledgments

Foreword

Introduction

Table of Contents

Begin Reading

APPENDIX A: Code for app.py

APPENDIX B: Code for screen.py

APPENDIX C: Code for search.py

APPENDIX D: Pseudocode for faceit.py

APPENDIX E: The Data Visualisation Catalogue's Visualization Types

APPENDIX F: Glossary

Index

End User License Agreement


Data Conscience

Algorithmic Siege on Our Humanity

Brandeis Hill Marshall

Foreword

There is an assumption that we need to code switch, straighten our hair, get rid of that accent, get rid of those big earrings, not be too much of a Black woman, but instead assume an identity that is considered “the norm” in computer science spaces in the United States. This undertone exists in every single textbook containing code snippets that I have read. There is a pretense that these books were written from no one's point of view, exuding an air of neutrality that we know doesn't exist, what feminist scholars have dubbed “the view from nowhere.”

This tone is that of a cishet white man, seen as the norm while everyone else is considered a deviation. Dr. Marshall shatters that expectation. With DataedX, Black women in Data, the Rebel Tech newsletter, her many educational events, and this book, Data Conscience, Dr. Marshall brings her full self to work and encourages us to do the same. Reading this book feels like we're having an honest conversation about data and code over our favorite food or drink.

Exclusionary views of who is considered a data expert are one of the many ways that Black people are gatekept from a field that profits off of our datafication and criminalization. As Prof. Ruha Benjamin, sociologist and associate professor of African American Studies at Princeton University, notes, technology allows racism to enter through the backdoor: the false assumption of neutrality masks ingrained, discriminatory systems, while at the same time we are told, implicitly and explicitly, that we do not fit into the computer science archetype.

This book tells us that we do fit, and helps us develop toolkits for responsible computing. It goes through all aspects of the data lifecycle including the processes of gathering, labeling, cleaning, storing, and governing data, explaining how each of these steps affects us. Dr. Marshall moves us away from the notion that everyone should be datafied and every problem should be solved by technology. Instead, she explains the complex societal layers that we lose when we attempt to convert everything into discrete data points. She asks us to consider, and resist, the many ways in which we are being taught to act like machines, and gives us tangible tips for how to actively push back to retain our humanness and Black identities in this world of data.

For those of us looking to hone our data skills, this book helps us do that with examples of what could go wrong, and how to avoid those paths. Contrary to textbooks that abstract out one aspect of the data lifecycle without any societal context, Dr. Marshall reminds us that “coding requires a 360 degrees of panoramic view that requires more than coders in order to see, understand, capture and partially address the social, technical and ethical considerations.”

I hope you enjoy this book as much as I have. As Dr. Marshall says, every one of you is a data person, and you are enough. We need you to enter, and if you're already in, stay and thrive in this field so that we have a technological future that works for all of us and not only the select few.

—Dr. Timnit Gebru,

Founder and Executive Director of the Distributed Artificial Intelligence Research Institute (DAIR)

Introduction

Source: https://twitter.com/csdoctorsister/status/1536343596336418821

My journey to data conscientiousness started when I was a kid as I rolled coins with my mom and helped my dad organize his job's employee resource group's annual membership rosters. My mom would bring out the big Welch's jar, about half full of loose change. Sitting on the living room floor, she'd dump all the coins on the carpet and we'd start separating them by denominations. Mom would bring out all the coin wrapper rolls she'd gotten from the bank. We'd stack the pennies, nickels, dimes, and quarters—and talk about whatever moms and daughters talk about. She taught me how many of each denomination goes into each coin wrapper roll: 50 pennies gives us 50 cents, 40 nickels gives us $2, 40 quarters gives us $10, and 50 dimes gives us $5. When we'd filled as many of the rolls as we could, we'd count up our earnings. Sometimes it would be $30, and other times it would be closer to $100.
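
For the coder reading along, the roll arithmetic fits in a few lines of Python. This is only a playful sketch; the denomination counts and values restate the paragraph above, and none of it is the book's own code.

# Coins per roll and per-coin value in dollars, as described above
rolls = {
    "penny":   {"coins_per_roll": 50, "coin_value": 0.01},
    "nickel":  {"coins_per_roll": 40, "coin_value": 0.05},
    "dime":    {"coins_per_roll": 50, "coin_value": 0.10},
    "quarter": {"coins_per_roll": 40, "coin_value": 0.25},
}

for name, roll in rolls.items():
    # Value of a full roll: coins per roll times the value of one coin
    roll_value = roll["coins_per_roll"] * roll["coin_value"]
    print(f"A roll of {name}s holds {roll['coins_per_roll']} coins and is worth ${roll_value:.2f}")

# Prints $0.50, $2.00, $5.00, and $10.00, matching the counts above.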

At first, I simply liked the counting, the talking, and stuffing the coins in those small paper wrappings. As I grew up, I started to associate these coins as a resource to get what I wanted. That 50 cents could be put to excellent use to get some SweeTARTS. Two dollars in nickels would keep my candy stash stocked for a week. Five dollars would pay for my favorite order at Swenson's and I'd have change left over. In college, $10 in quarters was gold because no laundry would have been done otherwise. Looking back, I realize it was her way of having me practice my counting, learning the many ways to make a dollar using coins and the value of saving.

My dad, for more than a few years, had this annual huge task of verifying each chapter's membership rosters for his job's nationwide employee resource group. The first year or two or three, Mom and I watched him. Somewhere along the way, I started to help out when asked at first and even volunteered, maybe once. My recollection is fuzzy. What I remember vividly was that the amount of mail he received took down a few forests. There were endless printouts on standard perforated paper from those old dot-matrix printers. The basement became overrun with boxes of unopened envelopes of various sizes, from letter-sized to overstuffed legal-sized.

While my dad was figuring out which pile to tackle, opening each envelope started my supply chain of organization and sorting: record the chapter location and region, clip the membership roster to the envelope, highlight the number of chapter members, leave the chapter's membership dues checks in the envelope, and add this new envelope to the other envelopes in that region pile. And yes, each chapter printed and snail-mailed their membership rosters. The cross-checking of the mailed rosters year after year was dizzying. Some chapters used a great printer and had access to plenty of printer ink. Other chapters weren't so fortunate. My young eyes were called upon to read the smeared and faded letters.

Reconciling these membership rosters took weeks of shifting from one pile to the next. Some people changed chapters due to job relocations but didn't update their membership affiliation. Other people decided to not be part of the employee resource group anymore and didn't follow the member pause/deactivate process. The employee resource group finally created an online database—my dad had something to do with this, I'm sure.

Rolling coins introduced me to data as numbers, math, and financial literacy without being intimidating. Organizing chapter membership rosters introduced me to data as people, context, and guideposts to decisions. Armed with this understanding, I found that school was just one cool place where reading, math, and general exploration of data things happened.

But while roaming the computer science hallways at the University of Rochester, I came to recognize how data was viewed by the world. The Year 2000 problem, the Y2K bug, dominated the headlines my junior year. Everyone seemed so concerned about the computing infrastructure and whether systems would “hold up” after the clock struck 12 a.m. on January 1st, 2000. Businesses were desperately trying to back up their data on file servers, zip drives, and 3.5-inch disks. My classmates began signing big-money employment contracts with signing bonuses by that fall. They were focused on refining their computer systems and networking skills. That's what employers wanted. I predicted that this dot-com boom was about to bust, so I elected to pursue graduate studies.

I also saw businesses' deeper fear of losing their data. Everything I could think of had a critical connection to data, particularly why, how, and what we digitally house in systems. Yet society was singularly focused on the systems themselves. I believed then, as I do now, that data runs the world. I decided to go all in on data in graduate school and my career.

Data gets a bad reputation as a pseudo demon spirit creature because all the numbers and math are deemed complicated, confusing, and not relatable. Data is not a tangible concept to many people—those in computing, tech, and data spaces and those who are not in those spaces. We all, to some degree, are in digital spaces where data lives. Critiquing data uses involves all of us, but for those of us in the data trenches, there's a bigger pressure to suss out the issues and course-correct before the tech product goes public.

This book is for the rebel tech talent, those who acknowledge and are ready to address the limitations of software development. They recognize that tech's philosophy and practice of “move fast, break things” is inherently problematic, and needs to be changed, and they want to pinpoint the ways discrimination exists in this digital data space. The primary reader for this book, however, is the entry-level software developer or data analyst. But frankly, it should be considered a reference guide to making more responsible and equitable data connections.

Data Conscience translates theory to practice. The gaps in our current data infrastructure are spotlighted so that data practitioners know more precisely where issues exist. And I'm centering the most vulnerable, focusing on ethical issues and resolutions that address social, political, and economic implications and not just computational ones like optimization, load balancing, and latency.

What you will read in this book is a blend of social sciences, humanities, and data management with tangible, real-world examples. Consider it a modern antemortem describing specific instances of where ethical flags are raised and how data structures help or hinder ethics resolutions. I focus on being preemptive in handling data operation for inclusion rather than conducting conversational (generic) autopsies of case studies and algorithmic audits.

The book is divided into three parts. Part I, “Transparency” (Chapters 1–4), takes you on the rollercoaster of how outcomes and impacts of data, code, algorithms, and systems are revealed to all of us by companies, organizations, and groups. Part II, “Accountability” (Chapters 5–8), covers ways in which data and software teams can critique and explore interventions to make responsible data connections during the tech building phase. And lastly, Part III, “Governance” (Chapters 9–11), reviews the action steps taken thus far and ends as a public accountability manifesto on what all of us can do to humanize our relationship to data.

Here's a brief chapter-by-chapter overview:

Chapter 1 explores the role data has played in our society, particularly in the United States—how we've handled it and our relationship to handling it well. Oppression tactics, in the law and in the sciences, are mere social controls to enforce a hierarchy positioning that doesn't exist.

Chapter 2 describes for those of us on the “inside” of tech how we're torn by this realization that the code we write is likely contributing to a cycle of harm that we don't know how to curtail, stop, or dislodge ourselves from. Reconciling—and more to the point accepting—imperfection in data and tech needs a place in tech. The choice between error or no error doesn't exist anymore. There's a third choice: nontech-solvable.

Chapter 3 tackles the term “bias” and its multitude of interpretations head on. I describe how bias shows up and ways to shift our mindset on how we recognize and handle it, even before we write a single line of code. Getting overwhelmed and disengaging from efforts to combat bias is no longer an option.

Chapter 4 stretches our minds about what we've accepted as computational thinking and standard discussion points to fold in a more intentional socio-ethical tech understanding. Coding requires a 360-degree panoramic view, one that requires more than coders in order to see, understand, capture, and partially address the social, technical, and ethical considerations.

Chapter 5 focuses on asking the “why” questions, especially as part of data collection and reformat practices. Tech does a poor job of handling data collection and reformatting. Learning to ask questions early, often, and with real people in mind streamlines how we manage data operations as a data, computing, and larger tech community.

Chapter 6 focuses on asking the “what” questions, especially as part of data storage and management procedures. We must come to grips with the fact that the data storage landscape is a culmination of intentional, yet sometimes harmful, decision-making exercises with social, computational, and morality implications.

Chapter 7 focuses on asking the “how” questions, especially as part of data analytics. The algorithms, systems, and platforms are taken from the same playbook, by the same homogeneous people. There's much we can learn about data from our comrades in the humanities and social sciences.

Chapter 8 focuses on asking the “when” questions, especially as part of data visualization. The allure of data visualizations is enticing, so proceed with caution with every chart, graph, and dashboard you encounter.

Chapter 9 snaps us back to reality as tech moves fast while the law moves slow. Juggling the usefulness versus the persnickety hindrances of the law dominates this chapter. Focusing on the fundamental building blocks of tech, like data and the algorithms that use it, has more traction and momentum than constructing legislation that's fixated on applications of technologies.

Chapter 10 discusses tech's dominance in our society and, in particular, as a culture that excludes other solution options. The rockstar tech workers, or algorithmic influencers, guide every industry one software update, code version release, or code library at a time. But clearly, we've hit a ceiling in what tech should do.

Chapter 11 is my public accountability manifesto. We're each battling for our dignity in digital spaces, and we must do so with the same indignant veracity as we do in physical spaces. The tech industry's ability to operate, for the most part, with impunity puts more onus on us, as global digital citizens, to maintain intense pressure for transparency, accountability, and governance in all spaces where data resides. Algorithmic processes, systems, platforms, and institutions won't become responsible or equitable without us making it a requirement, rather than a choice.

Part I: Transparency

In This Part

Chapter 1: Oppression By…

Chapter 2: Morality

Chapter 3: Bias

Chapter 4: Computational Thinking in Practice

Welcome to Part I, “Transparency.” Our society today is obsessed with using technology. The speed and power technology brings to completing perceived mundane tasks is astonishing. The “hurry up and complete more tasks in a fixed time frame” mindset overlooks the cascading and compounding influences of those activities on the human condition. But this isn't the first time outcomes have overshadowed people. There's an uncomfortable U.S. history of sidelining the humanity of certain people so that others can thrive. The journey starts at this uncomfortable place, before technology was prevalent. How we as a society are living through this technology era isn't new; it is simply a reincarnation of how the world has operated in the past. Being transparent about the pre-tech era illuminates how tech systems came to be designed with so many flaws. The data, code, algorithms, systems, and platforms that make the tech industry thrive are a wicked web of assumptions, presumptions, and opaqueness.

Transparency (n) – Revealing your data and code. Bad for proprietary and sensitive information. Thus really hard; quite frankly, even impossible. Not to be confused with clear communication about how your system actually works.

—Karen Hao, in “Big Tech’s Guide to Talking About AI Ethics,”1 www.technologyreview.com/2021/04/13/1022568/big-tech-ai-ethics-guide

Transparency (n) – Revealing outcomes and impact of the data, code, algorithms, and systems by companies, organizations, and groups.

—Revised Definition by Brandeis Hill Marshall

Data is the fuel that runs our algorithms and systems. Chapters 1–4 have us get inside data as a construct, structure, and limiting factor when digitized. We struggle to wrap our minds around data within and outside of digital infrastructures. Ultimately, it's up to the collective we to be clear-eyed in our expectations of data and its uses.

Source: https://twitter.com/csdoctorsister/status/1315376187632476161

Note

1. Hao, Karen. “Big Tech’s Guide to Talking About AI Ethics.” MIT Technology Review, 2021. www.technologyreview.com/2021/04/13/1022568/big-tech-ai-ethics-guide.

CHAPTER 1: Oppression By…

Source: Twitter, Inc.

In tech, the approach is to create a solution that works well for 80 percent of users. The remaining 20 percent have to conform, find workarounds, or suffer in silence. Those on the fringe aren't considered part of the core demographic or essential members of the targeted population. So tech folks don't want to discuss or address racism, sexism, ableism, and otherism. Because each of these distinguishing characteristics is lifelong, minoritized people are not fully comfortable in society. In this book, we explore the role of data in digital spaces and our relationship to handling it well. The intended use of data, how it's represented in digital systems, and the resulting impact of outcomes play heavily in the moral fabric of data management. Leaning on this foundation, we'll work in ways to clarify assumptions, be more data-aware in our tech products, and flatten bad aftereffects. We'll share conditions to look out for to be more thoughtful and to carve out opportunities to make alternative design decisions.

We start head-on with oppression because it's not a new occurrence in our global society. Tech has simply reinvented it. To drive this point home, I'll describe how oppression manifests in the law and in the sciences. I share a brief snapshot of the history of oppression and how data is leveraged to solidify a social structure and order. I outline where society has been so we're less likely to repeat it. Calling out oppression by one's race, gender identity, and/or class frees us from doing “nice diversity.” I'm not just talking about things you know or have heard or have read. I'm digging into the impact of data use and misuse over decades that turned into generations. This impact is not a collective of fuzzy faces and lives; it's always personal. It's you and it's me. The law and the sciences try to manage the intentional, calculated, and coordinated instances of oppression in different ways. Their words are empty; as Carl W. Buehner said, “They may forget what you said—but they will never forget how you made them feel.” The outcomes are what matter.

The Law

Anti-Blackness coupled with white colonialism is a global phenomenon. Strict and full domination by a group imposing ethnicity or race barriers has set the model that lives in every system, to varying intensities, that all of us use to navigate our lives.1 The model reinforces itself and feeds on itself. It's a relentless pursuit to maintain that complete domination. Throughout history, policies and laws have removed, altered, or restricted the birthright humanity of Black people. Not that other ethnic populations have not experienced this treatment, but Black people, especially in the United States, have combatted and are still combatting the strain of slavery. In the physical offline world, Dr. Joy DeGruy's Post Traumatic Slave Syndrome: America's Legacy of Enduring Injury and Healing (Joy DeGruy Publications Inc., 2017) cements the compounding trauma of enslavement within the Black community. DeGruy describes how Blackness, Black culture, and Black people were associated with laziness, dirtiness, and uncivilized, animalistic preconditions. Humanity was severed from Black people way before the slave trade. This social construct continues to today as Africa must contend with algorithmic colonialism (https://repository.law.umich.edu/mjrl/vol24/iss2/6), an international digital war over extracting Black African people's data in the absence of laws protecting its citizens. The United States has been extracting labor, intelligence, and dignity from Black people for centuries.2

Slave Codes

White colonization is centuries old—let's drop into this history in the 1800s. In the United States, slavery ruled, though some abolitionists, both white and Black, raised concerns about human and civil rights violations. Slave codes were laws and practices that confined Black people to white rule, that forbade them from obtaining an education, independence, and ultimately dignity. The social controls of slave codes were broad and severe. For instance, a Black person accused of a crime could be met with a death sentence. Black people were not allowed to serve on juries, so if you were a Black person accused of a crime, you would be judged by an all-white jury. And let us not forget the first rule of slavery: the enslaved, Black people, were considered property, listed by first name only and age in a ledger like an asset. For instance, in 1787, soon after the American Revolutionary War, the North and South states “decided” an enslaved person was counted as 3/5 of a person as a “compromise” used to determine taxation and representation in the House of Representatives. Black people were exploited for white fiscal and political benefit. Recognize any parallels with the tech industry today? #askingforafriend.

By the 1850s, the murmurs became loud sirens. Higher education institutions, later coined in 1965 as historically Black colleges and universities (HBCUs), were founded and designed for Black students. The outcomes of the American Civil War propelled Black people's advancement from 1865 to 1890: many HBCUs were founded in 1867, a handful of Black men were elected to the U.S. House of Representatives, and the period ended with the U.S. federal government granting land to over 15 HBCUs. The post-slavery era came into full view, where the abolitionists, white feminists, and their respective movements rejoiced in perceived equality for Black people. They thought that their petitions had been effective in enacting long-lasting change and that the war to treat Black people as 5/5 of a person had been won. They were wrong—very wrong.

Black Codes

But slave codes didn't disappear with the end of the American Civil War. Each Southern state didn't like, agree with, or accept the 14th Amendment to the U.S. Constitution, ratified in 1868. Slave codes turned into Black codes. Black codes were local, regional, and state laws that continued to impose social controls on Black people. They sneakily started popping up all over the Southern states in response to the Emancipation Proclamation issued on January 1, 1863. Take note of the five-year gap between the proclamation and the 14th Amendment ratification. Chalk it up to lots of dragging feet and stall tactics by some Southerners, enabled by some Northerners, to curtail the 14th. Mississippi and South Carolina were the first U.S. states to pen Black code laws (www.blackpast.org/african-american-history/1866-mississippi-black-codes).

For example, here's what's written in “Civil Rights of Freedmen in Mississippi,” Section 6:

All contracts for labor made with freedmen, free negroes, and mulattoes for a longer period than one month shall be in writing, and in duplicate, attested and read to said freedman, free negro, or mulatto by a beat, city or county officer, or two disinterested white persons of the county in which the labor is to be performed, of which each party shall have one; and said contracts shall be taken and held as entire contracts, and if the laborer shall quit the service of the employer before the expiration of his term of service, without good cause, he shall forfeit his wages for that year up to the time of quitting.

Let's read this as a layperson, with no law degree. A few observations and questions spring to mind:

What do you mean, labor lasting less than a month doesn't need to be in writing? So if you say the work is short term, then you're at the mercy of the employer without a contract. You have no documentation or avenue to contest any disagreements you may encounter with that employer. And to avoid putting your labor agreement in writing, the employer could fire you on Day 29 and then rehire you a day or two later, for instance.

For more than one month of work, police or white persons are to read the contract to the Black person. Excuse me? White people have prevented reading and writing literacy for Black people as a social control structure so that Black people must depend on white people for their livelihood. Reading the Black person's labor contract means that the white person/law enforcement officer can invent the labor agreement conditions and set the compensation rate. All of these things can be changed at any moment since the Black person, once again, is not afforded any agency or negotiation over their labor agreement.

And if a Black man quits “without good cause,” he has to return his year-to-date pay. So only Black men are included in this backward law? What if the laborer was a Black woman? Did she have any protections under these laws? We already know that “good cause” is determined by white people—yeah, the same white people who consider a Black person as a property asset counting as 3/5 of a person. And then there's how white people/law enforcement determined or regulated if a Black person was a freedman, free Negro, or mulatto: having free status validation papers. Umpteen-zillion social controls, check.

The laws, however, were in direct conflict with the 14th Amendment of the U.S. Constitution.

The first paragraph of the 14th Amendment goes as follows:

All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the state wherein they reside. No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.

Again, let's read and interpret these words as a layperson. Here's one take:

All folks born or naturalized in the United States are U.S. citizens.

U.S. citizenship is to be respected by states, and states can't infringe on that citizenship.

Every citizen should have due process of law.

Every citizen can't be denied equal protection.

These seem pretty clear. Except writing it as an amendment didn't make it enforceable. Womp, womp. The Southern states agreed to writing it; they didn't agree to follow it. The Northern states by 1890 had become weary in their charge to hold the Southern states accountable. Black men were elected to the U.S. Congress up until 1898. George Henry White of North Carolina was the last Black man elected, with his term expiring in 1901. Just 37 years. Let's call it one generation. And Black people were enslaved in the United States from 1619 to 1863: a total of 244 years. The Southern states violated all four of the 14th Amendment provisions when it came to Black people by making it harder for Black people to vote and by opposing legislation crafted by Black Congressmen. And Northern states weren't progressive in supporting Black male leadership in government given that the vast majority of Black Congressmen came from Southern states. Northern states couldn't seem to stay interested in Black progression for 1/6th of the time that Black people were (Is it really past tense? We'll get to that.) assessed, treated, and documented as property. The 14th Amendment shared words for appeasement as the actions of the Southern states smothered Black government leadership while the Northern states lacked any action to sustain Black representation and effectiveness in government.

The Rise of Jim Crow Laws

The misalignment of the Black codes and the 14th Amendment did disrupt some Northern souls. Both statutes couldn't coexist as is. Enter Jim Crow laws in Southern states. The “separate but equal” treatment was the basis of the legal arguments in the late 1890s. The same services can be provided to white people and Black people separately while maintaining the spirit of the 14th Amendment's citizenship section. It's a blatant instance of cluster analysis. As defined in machine learning, cluster analysis is the act of grouping a set of items—in this case people—such that the items in the same group (a cluster) are more similar to each other than to those in other groups (clusters). The logic and normalized operating practice then becomes that all white people are the same, all Black people are the same, all Indigenous people are the same, all Latinx people are the same, all Asian people are the same, etc. The sameness brush remains an unremovable stain on U.S. society. And for the Jim Crow laws, as in cluster analysis, considerations of how the clusters are treated after being separated aren't included.
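
To make the analogy concrete, here is a minimal sketch of cluster analysis using scikit-learn's KMeans. The toy two-dimensional points and the choice of two clusters are invented purely for illustration; this is not code from the book.

import numpy as np
from sklearn.cluster import KMeans

# Toy 2-D points forming two loose groups, invented for illustration
points = np.array([
    [1.0, 1.2], [0.8, 1.0], [1.1, 0.9],   # group A
    [8.0, 8.1], [7.9, 8.3], [8.2, 7.8],   # group B
])

# k-means assigns each point to the cluster whose center it is nearest to,
# so items within a cluster are more similar to each other than to the rest
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(kmeans.labels_)           # e.g., [0 0 0 1 1 1]
print(kmeans.cluster_centers_)  # the "average" member of each cluster

Notice what the algorithm never computes: anything about how the resulting clusters are treated afterward, which is precisely the consideration the “separate but equal” doctrine left out.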

The U.S. Supreme Court decision in Plessy v. Ferguson (1896) codified the “separate but equal” doctrine. The case stems from Homer Adolph Plessy, a Black man in New Orleans, who attempted to sit in a whites-only railway car but was denied. Plessy sued, stating that he had a right to sit wherever he chose. Now remember, only white men could serve as judges at this time. It was a 7–1 split.

Following is the majority opinion, written by Justice Henry Billings Brown:

The object of the [14th] Amendment was undoubtedly to enforce the absolute equality of the two races before the law, but in the nature of things, it could not have been intended to abolish distinctions based upon color, or to enforce social, as distinguished from political equality, or a commingling of the two races upon terms unsatisfactory to either…We consider the underlying fallacy of the plaintiff's argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction on it. — Plessy, 163 U.S. at 543–44, 551

The single dissenting opinion was written by Justice John Marshall Harlan:

Every one knows that the statute in question had its origin in the purpose, not so much to exclude white people from railroad cars occupied by blacks, as to exclude colored people from coaches occupied by or assigned to white persons…. The thing to accomplish was, under the guise of giving equal accommodation for whites and blacks, to compel the latter to keep to themselves while traveling in railroad passenger coaches. No one would be so wanting in candor as to assert the contrary. — Plessy, 163 U.S. at 557

Let me break this down and summarize:

[Brown] Race labels are kosher (informal definition meaning genuine and legitimate).

[Brown] Race labels supersede citizenship status.

[Brown] There are differences among the races. At this time, white and Black were distinguished, but it was expanded to include all nonwhite races.

[Brown] And different races don't impose a racialized hierarchy, in which whites are positioning themselves as superior to Black people. Of course, Justice Brown said this in the reverse manner to further emphasize, using narrative, the inferiority supposition of Black people to the default white population.

[Harlan] Separation of Black people from white people is for white comfort, not equality. And that's wrong.

It doesn't take a bunch of words to say the truth, but it takes a lot of words to misdirect, misinform, and share false equivalencies. Jim Crow laws were federally mandated injustices and prejudices against Black people and advantages and profits for white people. The common focus is on the former while whispering the political, economic, and social impacts of the latter. As advantages and profits grew, the injustices and prejudices intensified. Jim Crow laws were intentionally harsh and demeaning—a reincarnation of the slave codes and Black codes of years to centuries past.

While the American Reconstruction era extended equitable rights to Black men, Black codes and Jim Crow laws enacted social controls and monitoring on Black men, Black women, and Black children. Economic empowerment, stability, and liberation particularly bothered white people subscribing to anti-Blackness and white supremacy. To partially answer the question about Black women's work-life regulations, take a look at the quoted tweet accompanying the original tweet in Figure 1.1.

Here's the text from the article:

City Ordinance Soon Be Passed Requiring Them to be Regularly Employed

MANY COMPLAINTS

Regardless of whether they want to or have to, able bodied negro women in Greenville who are not regularly employed are to be put to work, put in jail or fined heavily. At its special meeting yesterday afternoon City Council discussed the situation, with regard to this class of loafers at some length, and it seemed that all members of Council were agreed that steps should be taken to compel them to engage in some useful occupation. It was decided that an ordinance, similar to the one now in force requiring all able-bodied men to work at least five days per week, should be passed with regard to these women. Such an ordinance will be prepared and voted on at the next regular meeting of Council.

Figure 1.1: A quoted tweet referencing the ‘Negro Women To Be Put To Work’ article from Greenville, South Carolina in 1918

Source: Twitter, Inc. and https://twitter.com/BaelockHolmes/status/1077006002392834053

A number of complaints have come to members of Council of negro women who are not at work and who refuse employment when it is offered them, the result being that it is exceedingly difficult for families who need cooks and laundresses to get them. Wives of colored soldiers, getting a monthly allowance from the Government, have, a number of them, declined to work on the ground that they can get along … without working, according to reports. Others have flatly refused jobs without giving any reason whatever, while still others pretend that they are employed when, as a matter of fact, they derive a living from illegitimate means.

The proposed ordinance will require them all to carry a labor identification card showing that they are regularly and usefully employed, and the labor inspectors and police will be charged with the duty of rigidly enforcing the law.

(www.newspapers.com/clip/38314573/negro-women-to-be-put-to-work)

The first sentence of the article is striking: “Regardless of whether they want to, or have to, able bodied negro women in Greenville who are not regularly employed are to be put to work, put in jail or fined heavily.”

“Class of loafers.” Excuse me? Black women are described as loafers if they're not working regularly for a white employer.

“Compel them to engage in some useful occupation.” Ha!

A prior ordinance for Black men to work five days a week was now being extended to Black women because…white people wanted to control what Black women were doing with their time and money earned not by their hands.

These statements are now recognized as deplorable. But wait, there's more! This Greenville News article is dated October 2, 1918. That's during the height of Greenville's bout with the Spanish flu pandemic. Two days later, “In the city of Greenville there were so many sick and dying residents, at least 500, that on Oct. 4, the City Council unanimously passed a resolution calling on the city's Board of Health to declare a quarantine.”

Catch this fully—Black women are being mandated to work during a global pandemic while the City Council is pleading for a quarantine mandate (which they didn't get, by the way). What, so Black women's lives are sacrificial? Their ability to have life and liberty without white interference disrupted these white folks something terrible. Yet why didn't this same compulsion apply to white women loafing at home? Because it didn't support white supremacy and anti-Blackness. White women continuing to pressure white men in political power for the right to vote was not as egregious as Black women minding their own business. The white Women's Suffrage Amendment, originally introduced in 1878, was finally passed in 1919 and ratified as the 19th Amendment in 1920 (www.crusadeforthevote.org/woman-suffrage-timeline-18401920).

Breaking Open Jim Crow Laws

The Jim Crow laws kept official hold of the United States, particularly Black people in the Southern states, until 1954. The first crack in “separate but equal” came with Brown v. Board of Education of Topeka. U.S. Supreme Court Chief Justice Earl Warren wrote the single opinion of the 9–0 unanimous decision in favor of the Brown family:

To separate [Black children] from others of similar age and qualifications solely because of their race generates a feeling of inferiority as to their status in the community that may affect their hearts and minds in a way unlikely to ever be undone. We conclude that in the field of public education the doctrine of “separate but equal” has no place. Separate educational facilities are inherently unequal. Therefore, we hold that the plaintiffs and others similarly situated for whom the actions have been brought are, by reason of the segregation complained of, deprived of the equal protection of the laws guaranteed by the 14th Amendment. — Brown, 347 U.S. at 494–95

The legal arguments presented in the case compelled the Supreme Court to acknowledge the psychological impact of the “separate but equal” doctrine on Black children. The combination of data, social context, and patterns of education injustice resulted in an indisputable argument that poked a hefty hole in the equal protection of the laws guarantee promised under the 14th Amendment. This guarantee had excluded Black children. Black children didn't receive equal educational access and opportunities. Brown v. Board of Education of Topeka paved the way for the Civil Rights Act of 1964, which ended legal segregation and made it illegal to discriminate on the basis of someone's race or gender in employment, schools, public spaces, and voter registration requirements. The final vote was 290–130 in the U.S. House of Representatives and 73–27 in the U.S. Senate. So 69 percent of the House and 73 percent of the Senate voted yea. (That means 31 percent of the House and 27 percent of the Senate voted nay.) It's a definite success and a huge accomplishment. It's also a bit disappointing that there wasn't more overwhelming support for the Civil Rights Act. If these percentages were viewed as course assessment scores, they'd be a D and a C, respectively. Political power structures are alive and well, even to this day. Inclusion goes to the heart of breaking down established power dynamics. While the vote secured the act in becoming law, it didn't alter behavior. Whereas Brown v. Board of Education and the Civil Rights Act made strides to close the access, opportunity, and eventually wealth gap, another social system had already formed wedges that restricted telecommunications and other forms of communications based on social justice ideologies.

Overt Surveillance

Targeting certain people or groups who threaten the social order of white colonialism has a long-standing presence. Overt surveillance is an early rendition of the technology we use today. Monitoring and surveillance of people dates back to the 1860s, once the telegraph/telephone was invented. The Federal Communications Act of 1934 created the Federal Communications Commission (FCC) to help regulate communications via telegraph, telephone, and radio. It also leaned into wiretapping as an acceptable, pro-democracy action in the interests of national security. Of course, these surveillance tactics didn't have to be disclosed to the general public, the actual people these laws should have been intended to protect. Eavesdropping via wiretap, photo surveillance, or planting an infiltrator in an organization are all effective approaches to documenting, restricting, and predicting a person's or organization's movements.3 Malcolm X, Rev. Dr. Martin Luther King, Jr., Fred Hampton, and countless others were the focus of federal and state-sanctioned investigations. Surveillance4 moved full force ahead post–Civil Rights Act. Katz v. United States added a wrinkle to the budding surveillance society as a standard operating practice:

The petitioner [Katz] has strenuously argued that the booth was a “constitutionally protected area.” The Government has maintained with equal vigor that it was not. But this effort to decide whether or not a given “area,” viewed in the abstract, is “constitutionally protected” deflects attention from the problem presented by this case. For the Fourth Amendment protects people, not places. What a person knowingly exposes to the public, even in his own home or office, is not a subject of Fourth Amendment protection. But what he seeks to preserve as private, even in an area accessible to the public, may be constitutionally protected. — Katz, 389 U.S. at 352

Simply put, no one should be wiretapped without their knowledge unless there's a threat to national security. It seems harmless enough—to protect the democracy, the U.S. government doesn't need a warrant to wiretap. But the definition of what constitutes “a threat to national security” or “protecting democracy” remains elusive and inconsistent, with decisions made on a case-by-case basis. These loopholes allowed the acts, laws, and policies to be enacted, but they didn't change the government's (white people in power) practices or behaviors. This de facto set of practices, protected by acts and laws, screened and monitored Black people under the guise of them being a “threat to national security.” Democracy for all the people seemed to be continually sidestepped at every junction. Black leaders, and therefore Black people, were labeled as “threats.” Being a labeled threat triggers government-level resources to be allocated to eradicate said threat. People who are considered threats are irredeemable and criminals. Criminals are dangerous. The implicit conclusion: Black people are irredeemable, dangerous criminals.

Weaponizing Blackness, especially Black people who questioned the lawfulness of the government's acts and laws, is a regular record being played throughout the 1960s, 1970s, 1990s (Violent Crime Control and Law Enforcement Act of 1994, for example), 2000s, 2010s, and 2020s. Social restrictions and controls, like what we've seen in slave codes, Black codes, and Jim Crow laws, plague Black communities. Black people's existence is seemingly a threat—a threat worth dedicating government personnel and resources to surveilling.

Surveillance at Scale

As the overt surveillance tactics became more transparent to the public, calls for their use to be suspended grew. Surveillance countermeasures increased. The governmental response was to take surveillance underground and wide—it went digital. Overt surveillance is siloed and disjointed, with data stored in separate physical systems, paper files, photographs, and audio recordings. Temporary housing of raw data has the advantage of being decentralized, preventing all data from falling into the wrong hands. The downside is that the full breadth and influence of the surveillance goes unnoticed. To connect siloed data, computerized systems were used, and they identified intersections faster than people could. Fighting against “threats” meant staying at least one step ahead of a potential threat's thinking, knowing, and responding to their impending activities while calling it justifiable and necessary—not targeting or cheating, at least in the eyes of the laws.
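
As a hypothetical illustration of that speed, a few lines of modern Python can surface the overlap between two siloed record sets, work that once took analysts weeks of cross-checking paper files. Every dataset name and field below is invented for illustration.

import pandas as pd

# Two hypothetical silos: phone-record subjects and photo-surveillance subjects
wiretap_log = pd.DataFrame({"subject_id": [101, 102, 103],
                            "city": ["Chicago", "Memphis", "Oakland"]})
photo_log = pd.DataFrame({"subject_id": [103, 104],
                          "location": ["rally", "church"]})

# An inner join on the shared identifier surfaces, at once,
# every person who appears in both silos
overlap = wiretap_log.merge(photo_log, on="subject_id", how="inner")
print(overlap)  # subject 103 appears in both systems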

When did modern digital surveillance start? With a 1987 research paper by Sirovich and Kirby, “Low-dimensional procedure for the characterization of human faces,” which captures facial features using a mathematical formula (https://opg.optica.org/josaa/abstract.cfm?URI=josaa-4-3-519).5 The patterns of white male faces are generalized based on the “average” of 115 faces. Figure 1.2 shows an example of this. Short dark hair. Dark-colored eyes. No smile. No eyewear/eyeglasses. It probably reminds you of some of the U.S. government-issued pics you've had to take: U.S. driver's licenses and U.S. passport photos.

The face is a form of identification that will always be unique to you, except for those with identical DNA or facial features. Creating mathematically understandable methods that isolate faces and facial features makes algorithms easier to represent in a digital system. And these algorithms make finding a similar or the same face effortless in that digital system. Turk and Pentland (1991) extended Sirovich and Kirby's eigenvector approach using pictures of real faces in their experiments (https://doi.org/10.1162/jocn.1991.3.1.71).6 Look at the second image in Figure 1.2. Sixteen pictures. Short dark-colored hair. No smiles. Twelve with no eyewear and four with eyewear. Different complexions. The photo backgrounds are similar, with robots, posters, and/or pictures of another person. Improving the accuracy of facial recognition in images has been pursued ever since.

Figure 1.2: (a) Single image of face from Turk and Pentland's 1991 facial recognition published paper (b) Picture of 16 faces used in Turk and Pentland's 1991 facial recognition published paper
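
For the technically curious, here is a minimal sketch of the eigenvector (eigenface) idea behind Sirovich and Kirby's and Turk and Pentland's work, assuming a stack of already-aligned grayscale images. The random array stands in for real photographs, and none of this is the book's own code.

import numpy as np

# Stand-in for 16 aligned 64x64 grayscale face images, flattened to vectors
rng = np.random.default_rng(0)
faces = rng.random((16, 64 * 64))

# The "average" face, akin to Sirovich and Kirby's average of 115 faces
mean_face = faces.mean(axis=0)

# Principal components of the centered faces are the eigenfaces;
# each face can then be summarized by a handful of weights
centered = faces - mean_face
_, _, vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = vt[:8]                # keep the top 8 components
weights = centered @ eigenfaces.T  # each row: one face's 8 weights

# Recognition reduces to finding the stored face with the nearest weights
query = weights[0]
nearest = np.argmin(np.linalg.norm(weights - query, axis=1))
print(nearest)  # 0: the query matches itself

The key design point is compression: a face shrinks from thousands of pixels to a handful of weights, which is what makes matching a face against a large digital system so effortless.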

As of 2021, no accurate facial recognition analysis across demographic groups exists. The research matured within U.S. higher education institutions and government research laboratories. At the National Institute of Standards and Technology (NIST), face recognition technology (FERET) started in 1993, with the goal of “[developing] automatic face recognition capabilities that could be employed to assist security, intelligence, and law enforcement personnel in the performance of their duties.”7 It has evolved over the years and was recently housed under the Face Recognition Vendor Test (FRVT).8 In light of the COVID-19 pandemic, FRVT has a subproject, Face Mask Effects, that's quantifying face recognition accuracy for people wearing face masks. Simply stated, facial recognition algorithms make use of most facial features, but face masks obscure nose, cheek, mouth, and chin structures. The robustness of existing algorithms to identify you using your forehead, brow, and eyes makes identification very possible. For those who take pictures with face masks, facial recognition algorithms have reference data to make the correct identification with a higher probability. Spoiler alert: Facial recognition algorithms as they stand aren't designed for this level of identification. They'll need to throw out the forehead and brow features—too many similarities among people. They'll focus on your eyes, in particular your retinas. Your retinas are distinctive to you and make you more easily isolatable to an algorithm.

The technological advances have moved fast9 since the invention of the Apple iPhone, when an iPhone owner gained access to a camera. Data creation exploded with the pictures all of us have taken, the emails we've read (with and without attachments), the texts we've written, the videos we've generated, and the audio memos we've recorded. Our society has come to rely on these conveniences to move through our lives every day. We can't avoid our faces being uploaded to multiple digital systems—the Department of Motor Vehicles, U.S. Customs (if you have a U.S. passport), an Amazon Ring at or around your residence, traffic cameras on city streets, surveillance cameras in grocery and retail stores, just to name a few.

Public concern grew to coordinated action. U.S. cities and states have proposed, voted on, and instituted facial recognition laws:

Massachusetts, until December 2021, except for law enforcement (www.mass.gov/info-details/mass-general-laws-c6-ss-220)

California, until December 2023, prohibits California officers and deputies from using cameras for facial recognition surveillance (https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB1215)

Vermont prohibits Vermont officers and deputies from using cameras for facial recognition surveillance without future express consent of the legislature (https://legislature.vermont.gov/Documents/2020/Docs/BILLS/S-0124/S-0124%20As%20Passed%20by%20Both%20House%20and%20Senate%20Unofficial.pdf)

Portland, Oregon is the most restrictive, prohibiting the use of face recognition technologies by private entities in public spaces (www.portland.gov/smart-city-pdx/news/2020/9/9/city-council-approves-ordinances-banning-use-face-recognition)

See www.banfacialrecognition.com/map for a recent list.10

The U.S. Congress has proposed several bills, some of which are listed here, but none have made it to the U.S. Senate for a vote as of this writing:

H.R. 2231 – Algorithmic Accountability Act of 2019

H.R. 3230 – Defending Each and Every Person from False Appearances by Keeping Exploitation Subject (DEEPFAKES) to Accountability Act of 2019

S.3284 – Ethical Use of Facial Recognition Act of 2020

S.4084 – Facial Recognition and Biometric Technology Moratorium Act of 2020

The Science

Laws, policies, and their evolution over the past 200+ years have moved United States society forward…slowly. These laws, however, leave gaps for people to interpret the policy language in ways that uphold systemic inequities and prejudices. Uneven regulation and enforcement practices exacerbate discrimination, further deepening harms disproportionately borne by certain communities and sidestepping the necessary atonement for healing. Perhaps laws and policies are too nebulous to be helpful in mitigating discrimination.