Technology Ethics

Steven Umbrello

Description

Technologies cannot simply be understood as neutral tools or instruments; they embody the values of their creators and may unconsciously reinforce existing inequalities and biases.

Technology Ethics shows how responsible innovation can be achieved. Demonstrating how design and philosophy converge, the book delves into the intricate narratives that shape our understanding of technology – from instrumentalist views to social constructivism. Yet, at its core, it champions interactionalism as the most promising and responsible narrative. Through compelling examples and actionable tools, this book unravels the nuances of these philosophical positions, and is tailored to foster responsible innovation and thoughtful design. As our everyday lives further intertwine with technology, understanding and implementing these design principles becomes not just beneficial, but essential.

This concise and accessible introduction is essential reading for students and scholars of philosophy of technology, engineering ethics, science and technology studies, and human–machine communication, as well as policymakers.


Page count: 176

Year of publication: 2024




Table of Contents

Cover

Title Page

Copyright Page

Acknowledgments

1 Technology and Society

This Book

Notes

2 Instrumentalism

Technologies as Tools

Guns

Only Kinda Neutral

Down but Not Out

The Departure of Boromir

Notes

3 Technological Determinism

The Inevitable March of Progress

Flavors of Technological Determinism

From the Stirrup to Mass Media

Critiques of Technological Determinism

The Social Construction of Technology

Notes

4 Social Constructivism

Technologies are What We Make Them

The Principle of Symmetry

Not Just a Theory

Some Pitfalls

Thinking with Fiction

Notes

5 The Design Turn

Our Dwellings Shape Us

Who’s Responsible?

The Design Turn

Choices, Nudges, and Architectures

Interactionalism

Notes

6 Responsible Innovation

Embodied in Consumer Movements

Talking with Stakeholders

When to Responsibly Innovate

Values and Design

Moral Overload

Notes

7 Approaches to Ethics by Design

Designing with Ethics for Ethics

Universal Design

Participatory Design

Human-Centered Design

Value Sensitive Design

Values and Preferences

Notes

8 Ethics by Design in Action

The Ethical Engineer’s Toolbox

Who are the Stakeholders?

What are the Values?

How Do we Use Values?

Stakeholders, Time, Values, and the Pervasiveness of Technology

Always Coming Back

Notes

9 Our Common Future with Technology

Technological Ethics in Practice

Progress, not Perfection

Glossary

Artificial Intelligence (AI)

Direct Stakeholders

Ethics by Design

Framing Effects

Human-Centered Design

Indirect Stakeholders

Neutrality Thesis

Participatory Design

Universal Design

Value Sensitive Design

Further Reading

References

Index

End User License Agreement


Technology Ethics

Responsible Innovation and Design Strategies

Steven Umbrello

polity

Copyright Page

Copyright © Steven Umbrello 2024

The right of Steven Umbrello to be identified as Author of this Work has been asserted in accordance with the UK Copyright, Designs and Patents Act 1988.

First published in 2024 by Polity Press

Polity Press

65 Bridge Street

Cambridge CB2 1UR, UK

Polity Press

111 River Street

Hoboken, NJ 07030, USA

All rights reserved. Except for the quotation of short passages for the purpose of criticism and review, no part of this publication may be reproduced, stored in a retrieval system or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the publisher.

ISBN-13: 978-1-5095-6404-0 (hardback)

ISBN-13: 978-1-5095-6405-7 (paperback)

A catalogue record for this book is available from the British Library.

Library of Congress Control Number: 2023923622

Typeset by Fakenham Prepress Solutions, Fakenham, Norfolk NR21 8NL

The publisher has used its best endeavors to ensure that the URLs for external websites referred to in this book are correct and active at the time of going to press. However, the publisher has no responsibility for the websites and can make no guarantee that a site will remain live or that the content is or will remain appropriate.

Every effort has been made to trace all copyright holders, but if any have been overlooked the publisher will be pleased to include any necessary credits in any subsequent reprint or edition.

For further information on Polity, visit our website: politybooks.com

Acknowledgments

For almost a decade, I have been interested in and working on the philosophy and ethics of technology. This book not only draws on that experience but also aims to demonstrate how most of the problems that we tend to fixate on concerning our technologies can be reframed if we look at them from the perspective of design. I cannot do justice to all of the colleagues with whom I have had numerous conversations, which have culminated in the work you see here. In many ways, this work is as much theirs as it is mine. Still, it merits mentioning those who provided me with the initial inspiration to work on this topic. Firstly, this project simply could not have been possible without the generous financial support of the Institute for Ethics and Emerging Technologies, which, for nigh a decade, has been increasingly supportive of my work. Likewise, I want to extend my thanks to my dear colleagues at the University of Turin, in particular Graziano Lingua, Antonio Lucci, and Luca Lo Sapio, who have been sources not only of support but also of dear friendship. Mary Savigar and all the production staff at Polity have been invaluable in bringing this volume to fruition. Finally, I would also like sincerely to thank Nathan G. Wood for his impeccable editing skills, which have led to this polished volume.

1 Technology and Society

In 1980, Langdon Winner published what would become a foundational work in the burgeoning field of philosophy of technology. In his paper, “Do Artifacts Have Politics?”, Winner described how the overpasses spanning the parkways of Long Island, New York, were built intentionally low (Winner 1980). The reason for this was that Robert Moses, the American urban planner responsible for planning much of New York’s metropolitan area throughout the early and mid twentieth century, purposefully designed the overpasses low to ensure that poor and lower-middle-class families (mostly African Americans and other minority groups) could not access Jones Beach, one of his prized strands. Moses knew that these groups had limited access to cars and relied on public transit, and those low-hanging overpasses could not accommodate tall city buses. The parkways thus created an infrastructural barrier, limiting access to Long Island’s beaches to only those who could afford cars (Caro 1975). Moses’ racist values were thereby embodied in the technology of the parkways, low-tech as it may be, and this is, in fact, exactly what Winner showed: that technologies are not merely tools, but that they embody values.

Since Winner’s work, philosophy of technology has come a long way, and it is now standard to view technologies not as isolated artifacts, but as infrastructures, systems, or, more specifically, as sociotechnical systems. But what exactly does that mean? What does it mean to understand technology as somehow being “sociotechnical”? In both academic and everyday circles, people generally talk about technology in (at least) one of three ways. The first is to conceive of technology purely as a tool or instrument. Usually referred to as instrumentalism, such views are often pushed by those who wish to tout the benefits of a given technology while downplaying possible negatives. A notable exemplar is the oft-quoted motto of American gun rights activists: “guns don’t kill people; people kill people.” The second way to construe technology is as being purely deterministic. This position, known as technological determinism, holds that both human action and our social world are determined by technology, a view nicely illustrated in the popular cyberpunk video game Deus Ex: Mankind Divided, where the hashtag #CantKillProgress is repeatedly used to show there is no way to stop the inevitable march of technology and its societal consequences (Deus Ex 2011). The third way of looking at technology is to understand it as socially constructed. This position, known as social constructivism, sees technology as being nothing other than the product of human actions; humans, therefore, are completely responsible for what technologies are and what they do. Each of these narratives sees continual propagation in both popular culture and academia, but do they accurately capture what technologies really are?

Robert Moses’ bridges show that technologies can both instantiate values and be shaped by them. Moreover, technological limitations can impact how values are embodied in technologies and may alter the very values themselves; interaction effects may stack, interfere with one another, or shift the course of design. All in all, it seems plain that technology is not as simple as any of the single conceptions above would have us believe. Rather, sociotechnicity is a rich yet complex topic in constant development, referring to the dynamic interaction between technologies and people, which together form a complex infrastructure (Ruth and Goessling-Reisemann 2019). This means that technologies are not isolated objects. Instead, they are connected systems, part of a larger network of other technologies and people. This sociotechnical understanding of technology highlights a combination of instrumentalism and social constructivism, and represents what some scholars call interactionalism. Fundamental to interactionalism is the understanding that technologies are in constant and dynamic interaction with other technologies and people.

It may go without saying, but it is also worthwhile to make clear, that technologies provide us with a host of benefits, and we should not automatically assume that all technologies embody disvalues like Moses’ racism in his bridges. That example is used to demonstrate that technologies are characterized by the values that they embody and that those values have material impacts on the world and our future alongside them. However, as the world changes, those impacts may change as well; as cars became more affordable, the groups Moses hoped to keep out became more and more able to pass beneath his overpasses and access Long Island’s beaches. How a technology embodies a value, therefore, changes over time. This further illustrates how technologies are interactional,1 part of a larger environment of relationships with people and other technologies. Each technology is sure to be designed for an explicit purpose, but it will also interact with other technologies, forming a network of shifting relationships that we must fully understand if we are to ensure that we design our technologies for good.

Focusing on the values behind development can also be crucial for identifying when a design is failing to fully live up to those values. As an example, artificial intelligence (AI) technologies can illustrate in distressing clarity what can happen when core human values are not clearly and explicitly designed for (Coeckelbergh 2020). IBM, for instance, spent $62 million to develop its famed Watson AI system to help provide medical doctors with cancer treatment advice (Ross and Swetlitz 2018). However, when tested in real-world settings, the system often recommended “unsafe and incorrect” cancer treatments, such as medications that would aggravate, rather than help, patients with serious bleeding. Because the data used to train the system was mostly hypothetical, rather than real, the system made poor recommendations. Documents revealed that the decision to use hypothetical clinical scenarios rather than the statistical data of real patients and cases was a consequence of training the system according to the preferences of doctors rather than on the Big Data available in healthcare, presumably so that the designers could implement the system quickly. Accuracy and safety were obviously not the values explicitly designed for in this system, leading to potentially lethal consequences. There are, moreover, numerous examples where systems have, as a function of design, not only made errors but reinforced existing problems. This is what happens when technologies are not approached from an applied ethics perspective, when we do not look at them as interactional, paying heed to how their various facets impact on one another. Good intentions are not enough; good design is better.

This Book

Technologies, arguably, are an inextricable part of what characterizes human beings, and they are certainly here to stay. Likewise, we are currently experiencing an almost dizzying boom in information and communication technologies (ICTs) and artificial intelligence systems that are increasingly difficult to understand (Ihde and Malafouris 2018). If technologies embody the values of their creators, whether or not their creators intend them to, that means that we exert a degree of control over how those technologies impact on our world and the future. This is a hopeful prospect.

This book explores the nuances of how our different sociotechnical systems, systems we often overlook and take for granted, influence and are influenced by our actions. It aims to give the reader a clear overview of how technological design has been traditionally handled, how and why philosophy has become so important in design, as well as the various approaches for actually doing the dirty work now so that we don’t suffer the consequences later. More broadly, this book will introduce philosophical concepts and positions as they relate to how we understand technologies and our relationship with them, while also showing how important it is for engineering ethics that we have an accurate and holistic understanding of technology.

Towards this end, this book will explore some of the main historical and current views of technology, as well as connect philosophical concepts to practical applications. This will help guide readers in understanding the importance of engineering ethics, that is, understanding and promoting the ethical practice of engineers (Harris et al. 2013). Given the ubiquity of technologies in our hyperconnected world, and given the role that engineers play in the creation of those technologies, understanding and promoting engineering ethics is an important goal. Doing so requires people from various disciplines and fields like philosophy, public policy, and, of course, engineering, to come together. Huge investments at the regional level, like those of the European Union, moreover demonstrate the overall interest in promoting this practice.

Focusing on what technology is and what engineers can do to ensure that technologies are designed and developed ethically means that we can focus more on pressing real-world issues that come as part and parcel of technologies and less on the techno-utopian or techno-dystopian narratives that have been dominant in both public as well as academic spaces. Many scholars who have directed their energy toward engineering ethics have found that those latter hyperbolic debates often come at the opportunity cost of more proximal issues that contemporary technologies present and that need immediate attention, like unemployment as a consequence of automation or issues of data privacy. That, of course, does not mean that thinking about the long-term future should be totally sidelined, as doing so would risk missing the forest for the trees, thinking too narrowly when designing and introducing new technologies into the world. For this reason, in the last chapter, I discuss engineering ethics as a multi-generational project, thinking about design as something that necessarily transcends individual lifespans.

Likewise, it is valuable, philosophically speaking, to consider narratives that are beyond those of exclusively extant technologies. Science fiction and fantasy provide us with modalities of understanding the complexities of the world in novel and complex ways. In his book, New Romantic Cyborgs, Mark Coeckelbergh explores how science fiction narratives concerning robots impact and influence how we actually perceive such technologies (Coeckelbergh 2017). In this book, I regularly draw on fictional analogies and narratives, such as from the works of J.R.R. Tolkien, to help tease out what would otherwise be complex and nuanced philosophical positions as they relate to how we can properly understand what technologies are.

For example, the Ring of Power in Tolkien’s lore is a testament to the creator’s will, embodying the immense influence it holds over both its lesser counterparts and their bearers.

Now the Elves made many rings; but secretly Sauron made One Ring to rule all the others, and their power was bound up with it, to be subject wholly to it and to last only so long as it too should last. And much of the strength and will of Sauron passed into that One Ring; for the power of the Elven-rings was very great, and that which should govern them must be a thing of surpassing potency; and Sauron forged it in the Mountain of Fire in the Land of Shadow. And while he wore the One Ring he could perceive all the things that were done by means of the lesser rings, and he could see and govern the very thoughts of those that wore them. (Tolkien 2007, 265)

Such a narrative device reflects the potential of technology to not only command over other creations but to embed itself into the very fabric of societal functioning, influencing actions and decisions. For example, the 1999 film, The Matrix, can be a vehicle to help us think about how modern media technologies can determine and cause cultural change.

For the purposes of this book, these narratives can help us follow how different characters view technologies and how that impacts on their choices and actions. How do the different characters in Tolkien’s magnum opus, The Lord of the Rings, view the One Ring? Does it determine their behavior? Can it be controlled and used towards good ends? As modern scholars of technology have argued, the various ways of conceptualizing the same technology can lead to radically different outcomes. For engineers and designers who have yet to take applied ethics seriously in their day-to-day work, drawing on a narrative like that of The Lord of the Rings may provide a more nuanced point of departure for navigating away from simplistic and facile ways of understanding what technology is and towards a more complex and nuanced understanding that can address many of the difficult and seemingly intractable issues we face today.

What happens when we take a more nuanced understanding of technology that moves beyond the simplistic narratives of instrumentalism, determinism, and constructivism?2 Viewing technology as something that is inextricably co-constitutive of everyday human life, but one which can be guided and directed, means that many of the ethical issues resulting from technological development can (potentially) be addressed by conscious design. This means that not only can engineering ethics be informed by various more foundational ethical perspectives and concerns, but that engineering ethics can likewise inform ethics itself, with each providing impetus for development in the other. What we’ll see throughout this book is that each of the traditional ways of understanding technology misses something. Instrumentalism misses values; determinism misses humans; and constructivism misses how technologies impact on our lives. Each of the philosophical positions presented in this book has its merits, but also its shortfalls. One of the threads that weave this book together is that technology is best understood as embodying values; tools influence our behaviors, and their design is influenced by our actions. Taking this as a point of departure, engineers can better understand how their decisions impact on ethical issues and reflect on how to mitigate any unwanted consequences. Philosophers can look at how technologies bring to light novel and unforeseen ethical values. Together, they may ensure that emerging technologies bolster our most fundamental and sacred human values.