Algorithmic Intimacy

Anthony Elliott

Description

Artificial intelligence not only powers our cars, hospitals and courtrooms: predictive algorithms are becoming deeply lodged inside us too. Machine intelligence is learning our private preferences and discreetly shaping our personal behaviour, telling us how to live, who to befriend and who to date.

In Algorithmic Intimacy, Anthony Elliott examines the power of predictive algorithms in reshaping personal relationships today. From Facebook friends and therapy chatbots to dating apps and quantified sex lives, Elliott explores how machine intelligence is working within us, amplifying our desires and steering our personal preferences. He argues that intimate relationships today are threatened not by the digital revolution as such, but by the orientation of various life strategies unthinkingly aligned with automated machine intelligence. Our reliance on algorithmic recommendations, he suggests, reflects a growing emergency in personal agency and human bonds. We need alternatives, innovation and experimentation for the interpersonal, intimate effort of ongoing translation back and forth between the discourses of human and machine intelligence.

 

Accessible and compelling, this book sheds fresh light on the impact of artificial intelligence on the most intimate aspects of our lives. It will appeal to students in the social sciences and humanities and to a wide range of general readers.




Table of Contents

Cover

Dedication

Title Page

Copyright Page

Preface

1 What is Algorithmic Intimacy?

The Concept of Algorithmic Intimacy

Some Characteristics of Algorithmic Intimacy

The Argument of Algorithmic Intimacy

Notes

2 Togetherness Transformed

Simmel on Intimacy, Strangeness and Sociality

Brands of Togetherness

Intimacy: From Sociability to Sharing

Digital Technologies and Intimate Bonds

Togetherness and Automated Technologies

Notes

3 Relationship Tech

#Swipelife: A Catalogue of Quantified Sex and Algorithmic Dating Apps

Selfhood and the Reflexivity of Quantified Sex Life

Consuming Intimacy

Numbers Rule: Technologies in Search of Desires

Notes

4 Therapy Tech

Chatbot Therapy, Counselling Apps and Mental Health

Automated Predictive Systems and Culturally Cool Therapy

Computational Therapy: Lifestyle Change and Liquid Selves

Automating Therapeutics: A Reassessment of Privacy and Publicness

Notes

5 Friendship Tech

Key Dimensions of Automated Friendship

Automated Intimates and Parasocial Interaction: From Affinity to Addiction

Notes

6 Versions of Algorithmic Intimacy

Three Types of Algorithmic Intimacy

Conventional Algorithmic Intimacy

Cohesive Algorithmic Intimacy

Individualized Algorithmic Intimacy

Conclusion: Crossroads in the Automation Web

Notes

Index

End User License Agreement


Dedication

For Nicola

Algorithmic Intimacy

The Digital Revolution in Personal Relationships

Anthony Elliott

polity

Copyright Page

Copyright © Anthony Elliott 2023

The right of Anthony Elliott to be identified as Author of this Work has been asserted in accordance with the UK Copyright, Designs and Patents Act 1988.

First published in 2023 by Polity Press

Polity Press

65 Bridge Street

Cambridge CB2 1UR, UK

Polity Press

111 River Street

Hoboken, NJ 07030, USA

All rights reserved. Except for the quotation of short passages for the purpose of criticism and review, no part of this publication may be reproduced, stored in a retrieval system or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the publisher.

ISBN-13: 978-1-5095-4980-1

ISBN-13: 978-1-5095-4981-8 (pb)

A catalogue record for this book is available from the British Library.

Library of Congress Control Number: 2022935998

Typeset by Fakenham Prepress Solutions, Fakenham, Norfolk NR21 8NL

The publisher has used its best endeavours to ensure that the URLs for external websites referred to in this book are correct and active at the time of going to press. However, the publisher has no responsibility for the websites and can make no guarantee that a site will remain live or that the content is or will remain appropriate.

Every effort has been made to trace all copyright holders, but if any have been overlooked the publisher will be pleased to include any necessary credits in any subsequent reprint or edition.

For further information on Polity, visit our website: politybooks.com

Preface

This book is not intended as a contribution to the soaring collection of academic studies on the AI revolution. I have already written at some length – in The Culture of AI (2019) and Making Sense of AI (2021) – about the growing extensity, intensity, acceleration and impact of AI technologies throughout our globalized world. In those books, I criticized existing understandings of automated machine intelligence as one-sided due to the limitations of dualistic thinking, of techno-pessimists on the one side and techno-optimists on the other. My synthetic social theory of the digital revolution was developed against the backdrop of such limitations, seeking to provide an alternative approach.

Instead, this book asks: is the omnipresence of AI compatible with the flourishing of intimacy? Taking this question as my point of departure, I try to set the ideal and actuality of machine-learning predictive algorithms in relation to transformations of intimacy. I particularly concentrate upon changes occurring today in contemporary societies in the fields of what I call ‘relationship tech’, ‘friendship tech’ and novel forms of self-care in ‘therapy tech’. My aim is to widen the sociological picture of the digital revolution, by incorporating identity, sexuality, gender and intimate relationships as well as affect and the emotions. In exploring how predictive algorithms affect the complex ways in which intimacy is understood, experienced, regulated and transformed, I seek to position our excitement and anxiety about automated machine intelligence in the wider context of social theory. My general argument is that intimate relationships today are threatened not by the digital revolution as such but by the orientation of various life-strategies lived in accordance with automated machine intelligence. The social theory I trace for the digital revolution in personal relationships is one which stresses that we need alternatives, experimentation and innovation for the interpersonal, intimate effort of ongoing translation back-and-forth between the discourses of human and machine intelligence.

My special thanks go to Anthony Giddens, Helga Nowotny, Massimo Durante, Masataka Katagiri, Hideki Endo and Atsushi Sawai, who were all kind enough to provide continuous interest in my work. Ross Boyd, who worked as research associate on this project, was marvellously helpful; he located wonderfully diverse literatures and assisted with material that I was able to directly incorporate into my argument. I am also greatly indebted to the support and friendship of my editor John B. Thompson. Nicola Geraghty was inspiring and remarkably encouraging about this work, which among other things made the writing both possible and pleasurable.

Anthony Elliott

Adelaide, 2022

1 What is Algorithmic Intimacy?

In his novel Machines Like Me, Ian McEwan fashions a cautionary fable about artificial intelligence and the profound emotional intricacies of human–machine intimacy. At the centre of McEwan’s novel is one Charlie Friend, a former electronics whizz-kid who attended university to study an interdisciplinary mix of anthropology and physics, but got caught up in a series of unsuccessful get-rich-quick schemes. When the novel commences, Charlie is in his early thirties and living alone in a small flat in south London, where he dabbles in currency and stock markets from an old laptop in order to carve out a meagre living. Whilst only modestly successful at day trading, Charlie has recently parted with £80,000, spent lavishly on a new technological consumer device – an artificial human. The extravagant purchase was made affordable thanks to an inheritance from his late mother. The splurge on this ‘manufactured human’, Adam, was no mere flight of fancy. ‘Robots, androids, replicates were my passion’, Charlie tells the reader. Adam is one of only twenty-five state-of-the-art androids designed to serve as an ‘intellectual sparring partner, friend and factotum’. Truth be told, Charlie’s preference was for a female android (Eve), but these had sold out. So Adam is the second-best choice, and Charlie excitedly brings his new synthetic human home for unpacking and charging. As McEwan writes: ‘At last, with cardboard and polystyrene wrapping strewn around his ankles, he sat naked at my tiny dining table, eyes closed, a black power line trailing from the entry point in his umbilicus to a thirteen-amp socket in the wall.’

Alongside his excitement for Adam, Charlie has also embarked on a relationship with his upstairs neighbour, Miranda. She is a graduate student, some ten years younger than Charlie, undertaking a doctorate in social history. In an effort to enliven their emergent relationship, Charlie uses Adam to bring him emotionally closer to Miranda. He invites Miranda to join him in the task of programming the robot’s personality, which consequently renders Adam the couple’s ‘ultimate plaything’. The design of the android’s personality is Eros by way of Tech. As Charlie reflects on this ‘home-made genetic shuffling’:

Now I had a method and a partner, I relaxed into the process, which began to take on a vaguely erotic quality; we were making a child! Because Miranda was involved, I was protected from self-replication. The genetic metaphor was helpful. Scanning the lists of idiotic statements, I more or less chose approximations of myself. Whether Miranda did the same, or something different, we would end up with a third person, a new personality.1

McEwan conveys very well the emotional texture of erotic projection towards machines in the age of AI.

The novel unfolds through a raft of erotic tensions, domestic quarrels and sexual intensities between the central characters of man, woman and android. Following an intense argument during dinner one evening, Miranda dispatches Charlie to return to his apartment – but invites Adam to stay over with her and ‘recharge his batteries’. There follows an erotic encounter between Miranda and Adam, with Charlie eavesdropping on this ‘betrayal’ from his downstairs apartment. As Charlie reflects on Adam’s sexual encounter with Miranda: ‘my situation had a thrilling aspect, not only of subterfuge and discovery, but of originality, of modern precedence, of being the first to be cuckolded by an artefact’.

All of this takes place, curiously enough, in the UK during the early 1980s. In this ‘retrofuture’, McEwan rewrites history in dramatic and often startling ways. The Thatcher government has lost the Falkland Islands to Argentina, with 3,000 of its soldiers dead. Tony Benn challenges Thatcher for the job of prime minister. In the USA, former president Jimmy Carter secures a second term in office, instead of losing to Ronald Reagan. John Lennon isn’t assassinated, and The Beatles reform to release Love and Lemons, an only so-so offering. The most poignant rewriting McEwan gives to history, however, surrounds the life of Alan Turing, widely referred to as the ‘grandfather’ of AI. Rather than committing suicide in the 1950s, Turing appears throughout the pages of Machines Like Me alive and thriving, at the cutting edge of technological innovation. Supported in developing machine intelligence breakthroughs by his colleague Demis Hassabis (the co-founder of DeepMind, who materializes as an AI entrepreneur some decades early) and living with his partner, the theoretical physicist Tom Reah, Turing is effectively CEO of the Digital Age. Turing’s research, which he has made available through open source, has been deployed to design synthetic humans (such as Adam), and to provide endless technological innovations – from smartphones to self-driving cars to ‘speaking fridges with a sense of smell’.

The book involves various subplots along the path of McEwan’s subtle mapping of the troubling terrain of intimacy with lifelike artificial humans. One concerns Miranda’s lifelong desire for vengeance for a brutally wronged friend. Another concerns her passionate attachment to a distressed foster child. In all of this, Adam seeks to adjust his artificially engineered personality to fit with the moral dilemmas encountered routinely by the human heart. Throughout the book the reader is drawn further and further into this narrative complexity, and probably comes to feel somewhat cautious of Miranda, after Adam warns Charlie that she is a ‘systematic, malicious liar’. Still, both man and robot are drawn to the alluring Miranda, each professing his love for her. Against these various twists and turns, McEwan touches on many themes pertinent to the AI era, including the battle between human understanding and machine intelligence, the nature of consciousness, and the legendary unsolved ‘P versus NP’ problem of computer science. McEwan explores such themes, which have long preoccupied technologists, by focusing on areas where human and artificial worlds mesh. Adam’s emergent love of poetry is one such area. In response to Adam having written over 2,000 haikus, Charlie reflects: ‘Two thousand! The figure made my point – an algorithm was turning them out!’ But cleverly, in response, Adam queries whether it isn’t people who, in fact, lack emotional understanding. ‘Nearly everything I’ve read in the world’s literature describes varieties of human failure’, Adam tells Charlie, ‘above all, profound misunderstanding of others’.

Machines Like Me is a novel about the textures of artificial intimacy. In a tale of a very contemporary ménage à trois, McEwan traces the erotic intensities of machine-learning algorithms made pseudo-flesh in synthetic humans, all in a social world undergoing profound digital transformation. It is in talking about psychological projection that McEwan talks about our emotional connections with digitalization. As Charlie says about his sexual jealousy and rage towards Adam: ‘I needed to convince myself that he had agency, motivation, subjective feelings, self-awareness – the entire package, including treachery, betrayal, deviousness.’2 This projection that digitalization encodes, and which arguably lies at the core of human–machine interfaces, is essential to McEwan’s picture of the way in which sexuality works in the era of AI. It is as if individuals have to deceive themselves, to project pleasure outwards towards an inhuman other, in order to enjoy pleasure – in all of its various erotic forms. As Charlie sums up one erotic encounter with Miranda marked by thwarted desire: ‘Our lovemaking was constrained. I was distracted by the thought of Adam’s presence and even imagined I detected the scent of warm electronics on her sheets.’3 What if the fear of automated intelligent machines masks a deeper anxiety, the anxiety of machine agency that disdains love and yet exceeds human capabilities? Projection, in the discourse of psychology, is generally conceptualized as a bridge leading to the safe haven of ‘emotional relations’ with others. Living alongside machine-learning algorithms may offer neither such a bridge nor (even with the support of ever-evolving neural networks of extraordinary complexity) the emotional connection of bridge building.

McEwan explores with great subtlety the anxiety of living with the open question of whether machine intelligence can understand human emotions, or whether it is people who misunderstand whatever bonds they forge with intelligent machines. There’s a sense with McEwan in which the advent of advanced machine intelligence renders both human bonds and human–machine interfaces simultaneously more complex and more disconcerting, more intense and more eccentric. Today, in a globalized world of artificial intelligence, these algorithmic complexities increasingly impact intimacy, love, sexuality and eroticism and have emerged as a terrain of experimental life, creating new opportunities and new burdens. What does this digital sea-change mean for everyday life, as well as for wider social relations more or less caught up in the logics of predictive algorithms? This book aims to investigate these questions. It is about women and men living with artificial bonds, and particularly how these bonds are experienced and negotiated through co-active human–machine interfaces. This artificial field of intimate bonds is what I call algorithmic intimacy.

The Concept of Algorithmic Intimacy

Today, algorithms are increasingly all-enveloping. Predictive algorithms significantly influence the ways we live and work. From automating stock market trading to recommending consumer purchases to website users, algorithms rule. It is perhaps not surprising therefore that ‘algorithm’ has become a keyword of contemporary cultural life, appearing frequently in the media, social commentary and the broader public sphere. Even so, the concept of ‘algorithm’ remains mysterious, encoding as it does some enticing or alluring connotations. It is a concept that, as we shall see, conveys much more than that designated in the mathematical sciences or computer engineering. The word ‘algorithm’ and its cognates have a remarkably long history and derive from a ninth-century Persian mathematician, Muḥammad ibn Mūsā al-Khwārizmī.4 His latinized name, Algoritmi, denoted the ‘decimal number system’.5 Other academic studies trace the concept back to the seventeenth-century notebooks of the German polymath Gottfried Leibniz. Still other academic studies detect early uses of the term in twentieth-century mathematical controversies over undecidable propositions.

While the word continues to be used as a specialized mathematical term, especially as regards the programming of computational code and in the field of information science, ‘algorithm’ is used today in a much broader sense too. Algorithms in the contemporary sense also denote codes of socio-technical action associated with cultural life and linked specifically to the personal domain and private life. ‘Algorithmic retailing’, ‘algorithmic bias’, ‘the Twitter algorithm’, even ‘algorithmic warfare’ – all these uses of the term reflect a broader form of economic, social, cultural and political enmeshment in artificial intelligence, one which is not linked only to advanced computational technology. But still, we might ask, what is it that people are talking about when they use the term ‘algorithm’? As a first approximation, we could say that ‘algorithm’ refers to a set of guidelines that provide step-by-step actions that need to be performed to achieve a particular outcome. On this definition, something as elementary as a recipe for baking a cake could be classified as an algorithm. While technically accurate, such an account arguably fails to capture the kinds of values and norms which are relevant to algorithms in the modern sense of the term. A central characteristic of algorithms has to do with the rise of computers in society at large. Consequently, algorithms have evolved to be computational, calculative and rich in complexity. ‘An algorithm’, writes the celebrated computer scientist Pedro Domingos, ‘is a sequence of instructions telling a computer what to do’.6
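
By way of illustration only, a minimal Python sketch of this elementary sense of the term, in the spirit of the recipe analogy above, might look as follows: a short, fixed sequence of instructions that a computer could follow to pick out the largest number in a list.

    # An algorithm in the elementary sense described above: a fixed, step-by-step
    # sequence of instructions that reliably achieves a particular outcome.
    def largest_number(numbers):
        largest = numbers[0]           # Step 1: begin with the first item
        for n in numbers[1:]:          # Step 2: examine each remaining item in turn
            if n > largest:            # Step 3: keep whichever value is larger
                largest = n
        return largest                 # Step 4: report the result

    print(largest_number([3, 17, 9]))  # prints 17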

Algorithms not only presuppose some degree of programmed instructions; they also involve, often and increasingly, the use of a technology commonly referred to as machine learning. Rather than repeatedly processing a set of computational instructions, algorithmic systems based on machine learning reprogram themselves as they process and respond to data. It is this dimension of computational self-learning from data that enables us to see that algorithms involve much more than the actions and unconscious biases of computer programmers or data engineers alone. Algorithms thus involve (1) a degree of unpredictability, or even uncertainty; and (2) operations which are often obscure and sometimes invisible even to those programmers or computer engineers who originally created the initial set of computational instructions. From this angle, one reason why advanced artificial intelligence poses serious challenges in the social and political domain is because of the opacity of algorithms. Computer programmers create computational code and such instructions are rendered as algorithms, but in turn such algorithms create new algorithms. Any large, complex decision-making software system can be understood as generative of algorithms in a digital universe which itself remains invisible to participants and which no one – not even technical experts – can fully understand. In the face of technophiles who assert the primacy of algorithmic objectivity along with the idea that machine learning can weigh a set of calculations with mathematical detachment, the counterargument worthy of consideration is that the opacity of machine learning is one powerful reason why our new algorithmic world appears as downright dangerous. Part of the cultural fear here is that algorithms take us farther and farther away from human agency, with predictive algorithms determining who should receive government benefits, sorting the rescheduling of airline flights, managing the granting of financial loans, and much more.
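
Purely as an illustration (no particular system is being described here), the toy sketch below contrasts the two modes just distinguished: the first function follows instructions written explicitly by a programmer and never changes, while the second induces its ranking from recorded user behaviour, so its effective ‘rule’ is nowhere written down in advance and shifts whenever the data does.

    # A hand-coded rule: the behaviour is written explicitly by a programmer.
    def fixed_rule_recommend(items):
        return sorted(items)  # always recommend items in alphabetical order

    # A data-driven rule: the ranking is induced from past behaviour rather than
    # written out in advance, and it changes as new interactions are recorded.
    def learned_recommend(items, click_history):
        counts = {item: 0 for item in items}
        for clicked in click_history:
            if clicked in counts:
                counts[clicked] += 1
        return sorted(items, key=lambda item: counts[item], reverse=True)

    items = ["poetry", "podcasts", "chess"]
    print(fixed_rule_recommend(items))                             # same output every time
    print(learned_recommend(items, ["chess", "chess", "poetry"]))  # output depends on the data

Real machine-learning systems estimate statistical models vastly more elaborate than a click count, which is precisely why their internal weightings can be opaque even to their creators, but the basic reversal is the same: behaviour follows from data rather than from explicit instructions.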

Undoubtedly there are many occasions when individuals not only are thankful to receive the support of automated machine intelligence but also substantially benefit from it. Algorithmic operations involving the automating of commercial activities such as payroll management are of the kind which have generally delivered improved efficiency and have increasingly become trusted features of social life. Likewise, web mapping apps such as Google Maps are to a large extent ‘open’ in predicting traffic; these apps are available for people to find the best route from one location to another. Of course, many algorithms involving the automation of social tasks remain fairly basic at a technological level. Predictive algorithms, however, are also used for much more sophisticated automated services and activities. What we are witnessing today is the emergence of new formations in human–machine interaction, involving the saturation of both public and private life with the automated enumerating and modelling of human behaviour. Today, predictive algorithms are increasingly associated with intimate connections and networked relationships, so much so that the concepts of human and machine intelligence may now seem somewhat blurred. So, in a world where AI increasingly shapes and reshapes our understanding of ourselves and our societies, how do the more private dimensions of our lives become liable to algorithmic calculations? What conditions must be met before our emotions and intimate life become data which can be computationally analysed and rearranged in a complex interplay of people and machines? Does something distinctive at the level of personal life and intimate relations emerge out of these predictive algorithms, something which perhaps had not been foreseen in the computer programming?

It has become commonplace to say that people today, especially in informational networked societies, are increasingly, but perhaps unknowingly, reliant on machine-learning algorithms. Algorithms increasingly impact not only the economy but entertainment, not only institutional life but identity transformations, not only information but intimacy. ‘Algorithmic intimacy’, as I define it, has to do with advanced computational processes known as machine intelligence which produce new ways of ordering personal behaviour and modelling intimate relationships.7 Since predictive algorithms impinge on the capabilities of individuals to think, decide and act, these processes may (and often do) seriously impact the complex relations between people, things, information and ideas. What I shall call ‘algorithmic intimacy’ is redefining the very contours of what intimate relations, love, eroticism and sexuality actually mean in contemporary society. Algorithmic intimacy operates through unprecedented amounts of big data and advanced computational analytics, which in turn makes new kinds of calculation, new forms of prediction, and new types of human and machine interaction possible.8 Algorithmic intimacy is about machine intelligence working within us, amplifying our erotic impulses, steering our personal preferences, solving our intimate dilemmas and creating new dilemmas along the way.9 This computational logic seeks to turn intimate life – from unanticipated sparks of desire to our tending of loving commitments – into machine-readable data, restructuring identity, intimacy and personal life more generally as an outcome of data-mining operations.

To explore this matter further we will need to investigate how a nascent culture of automated machine learning is transforming our ideas about intimacy, sexuality, eroticism and the self. We will need to probe the complex, contradictory ways in which people generate experiences, relationships, identities, intimacies and lifestyle changes that arise through interaction with both semi-automated and automated intelligent machines. We will need to draw from some of the most outstanding recent contributions in social theory in general, and science and technology studies in particular, to understand better how the world of automated artificial intelligence enables us to contemplate forms of intimacy that exist apart from bodies, that transcend conventional social norms, and that are far less subject to traditional constraints of space and time. We will also need to consider some new concepts which I introduce throughout the book to help us make sense of the changing impact of predictive algorithms on our intimate lives. This book asserts that changes in our experiences and understandings of intimacy are at once affected by and reflect broader changes in automated machine intelligence. My aim in Algorithmic Intimacy is to connect two of the major preoccupations of current social science and public debate – namely automated predictive algorithms and the popular engrossment with intimate relationships.

Despite the profusion of AI and machine learning in the texture of our intimate worlds and private lives, there has been relatively little scholarly research on the topic. There is a large literature on the institutional dimensions and socio-economic consequences of AI, from studies of robots taking people’s jobs to research on the business benefits of AI to doomsday scenarios of killer robots and existential threats, and there are numerous studies which recount the impacts of the digital revolution on transformed conditions of identity construction and interpersonal relationships. But the broader impacts taking place in our private and personal lives (especially as regards intimacy, sexuality and love) as a result of the algorithmic phase of the modern era have not been the subject of systematic scholarly study. There have been numerous media articles and popular books, written both by journalists and by tech entrepreneurs, programmers or other ‘insiders’, that provide broad overviews about the evolution of intimacy, relationships and sex in the aftermath of new digital technologies.10 There has been some scholarly attention paid to the potential applications of AI as developed by sex therapists and in couples counselling, though the large bulk of books and articles in this area have mostly concentrated on the advent of sex robots.11 More relevant are the various anthologies which offer informative but rather technical overviews of how the new level of automation afforded by machine-learning algorithms heightens the fragility of social networks and human bonds. Other critics have recognized that the study of predictive algorithms requires more than attention to automated technologies or machine intelligence, and have spotlighted new kinds of emotional experience and intimate bonds in the age of AI. However, there have been very few studies which seek to explore, in a more systematic and sociological fashion, the algorithmic transformation of intimacy and the social conditions which frame its development, deepening and consequences.

The world in which we live today is transformed by a range of algorithmic calculative devices, logics and techniques which impact identities and intimacies as much as institutions and organizational life. We come to see ourselves differently as we apprehend awareness of both our private and public lives in the invisible pieces of algorithmic code that form the complex digital systems of the machine-learning age. In this book I shall try to show that predictive algorithms are changing the way we experience intimacy, the dynamics of our sexual lives, the cultural forms of loving relationships, our very identities. Predictive algorithms do not only power our cars, our hospitals and courtrooms. They’ve come to be deeply lodged inside us. They’ve learned our private preferences; they tell us how to live, who to befriend and who to date. From this standpoint, one can more than readily understand why some authors have sought to characterize how we live today in terms of ‘algorithmic life’ or ‘algorithmic worlds’.12 But this book is not intended as an addition to the mounting pile of science and technology studies of algorithms. Instead, it tries to investigate the power of algorithms in what I hope is a more original context, one focused on transformations of intimate life and loosely informed by the parameters of social theory. While there has been some academic and public interest in looking at how artificial intelligence reshapes our cultural lives, there has been a marked lack of attention to how machine-learning algorithms remake the intimate relations that define privacy, intimacy and our personal lives more generally. In this book I shall try to show that, if we want to understand the power of predictive algorithms, then we must understand how automated intelligent machines are redefining the threshold of what acceptable, desirable, positive, negative or risky intimate relations mean for both self and society. Throughout the book I shall develop an approach to the rise of predictive algorithms which is primarily cultural, by which I mean an approach which is concerned simultaneously with the intimate or emotional implications of algorithmic processes and their social impacts.

Some Characteristics of Algorithmic Intimacy

How exactly does algorithmic intimacy differ from conventional intimate relationships? What is the complex relation between the rise of algorithmic intimacy and changes in our public and private lives today? In its orthodox rendition, intimacy demanded time, commitment and care for the making of deep emotional connections and the fostering of very significant personal feelings. Intimacy could be by turns exciting, perplexing, compelling and irresistible, but the intricate interplay of care, concern and commitment took centre stage. In its digitalized, algorithmic and above all more automated reincarnation as computer code, intimacy is less about renewed dynamism between people and more about the elimination of unpredictability, uncertainty and ambivalence. In talking about the algorithmic stage of automated intimacy, we are talking about a form of calculated certitude called predictive analytics. Perhaps to say ‘certitude’ is to say too much. But in the algorithmic era, human agency and intimate relationships appear at cross-purposes. Algorithmic prediction, I shall argue at length, can operate as an automation that renders people seemingly mechanical. To appreciate the full dimensions of this algorithmic transformation of intimacy, we need to consider some of the major characteristics of new forms of digital intimacy which are spawned by the machine-learning age. Let us now briefly consider some of these defining characteristics of algorithmic intimacy. These characteristics will be subsequently analysed in more detail throughout the book.

Perhaps one of the most obvious aspects of algorithmic intimacy is that it appears counter-intuitive to established conceptions of human togetherness. It would seem altogether at cross-purposes to traditional cultural understandings and social norms pertaining to the realm of intimate relationships. Broadly speaking, intimacy has a quality of enchantment which largely marks it off from the humdrum routines of ordinary social engagement. Everything in the world can suddenly seem dramatic and exciting when human bonds become strongly shaped by intimacy. Likewise, on the level of personal relations, intense intimacy can be disruptive of everyday life; it connects people to very significant and sometimes very disturbing intensities of emotion. Many traditional social norms and cultural codes governing the conduct of intimate relationships (from friendship to sexual relations) would appear blunted or diminished, however, in the face of machine-learning predictive algorithms and the emergent variety of intimate connections modelled in the image of computational code. An indication of this can be seen in the struggle many critics have had in describing exactly what is happening when people connect, communicate and forge links with each other via automated intelligent machines. ‘Facebook friends’, for example, has been called today’s shortcut for ‘how to be liked by everyone online’, a strange inversion where the word ‘friend’ designates people who might otherwise be complete strangers. Intimate relationships have more or less been characterized by passionate attachment, erotic desire, the particular, the singular, defined by specific subjectivities and captured by concrete cultural codes. Yet machine-learning algorithms would appear the opposite of all this: they redefine intimacy as relationship engineering. The objectives all seem to be in the direction of prediction, copying, re-enactment and repetition. As a popular joke captures this: a machine-learning algorithm arrives at a bar and the bartender asks, ‘What would you like to drink?’ The algorithm replies, ‘What’s everyone else having?’

When we consider the ever-increasing extension of AI into contemporary sexual relationships and erotic cultural forms, the difficulty of arriving at a clear characterization of the digitalization of intimacy becomes even more pronounced. Some commentators have likened predictive algorithms in the sphere of commercial sexual relationships, and especially dating websites and hook-up apps, to an amalgam of glossy adult entertainment, pornography and reality TV. The prescriptive weight of machine-learning algorithms – taken on board daily by women and men through the ‘recommendations’ of Netflix, Facebook, Uber and other data-economy platforms – has been widely viewed in terms of ‘technological solutionism’. Bernard Stiegler speaks, for instance, of the ‘tyranny of digital lifestyles’.13 The lure of ‘predictive algorithmic relations’ appears to be automated to the measure of distancing people from the unpredictability of intimacy itself.

A second characteristic of algorithmic intimacy is that the automated actions, codes and programs associated with machine intelligence typically involve a high degree of obscurity or obliqueness, such that the technical operations and predictive weightings informing algorithmic recommendations rarely become known to participants, consumers or citizens. Another way of putting this point is to say that the social power of algorithms is largely hidden or invisible. Whilst the ways in which people experience and discern algorithms as part of the fabric of everyday life are many and varied, the force field of predictive algorithms operates largely ‘behind the scenes’. Taina Bucher perceptively notes that our ‘different ways of thinking about what algorithms are and do may affect how these systems are used’,14 but the central point to keep in mind is that the social impacts of algorithms are only discerned retrospectively. The social power of algorithms becomes evident, as it were, only after the event of automated prediction. The rise of algorithmic intimacy has been very largely shaped by these ‘automated fields of encounter’ between people and machines in the social, cultural and political spheres, which specifically influence how the digitalization of intimacy has developed and deepened in the early twenty-first century. Some of this is brought out very well in Machines Like Me, partly because McEwan found an effective literary means to dramatize the impacts of digital intimacy. Whilst people don’t ever directly clap eyes on algorithms in daily life, the image of synthetic humans is, in fact, of another order – largely thanks to the influence of Hollywood science-fiction movies. People, it would seem, have less difficulty imagining and visualizing robots than algorithms. This ease of cultural awareness in imagining robots is arguably McEwan’s substitute for our contemporary encounter with algorithms. Julian Lucas, writing in The New Yorker, asks of McEwan: ‘Why write a novel, in 2019, about a humanoid robot? Like the flying car, it’s a long-anticipated idea that, although not quite obsolete, has begun to feel curiously outdated.’15 Lucas concludes that ‘McEwan is aware of this belatedness’ and perhaps created Adam as a ‘throwback’ in order to underscore that ‘bodies are déclassé in the era of cloud computing’. From this angle, McEwan’s Machines Like Me allows us to see that AI has in our own time not only become less corporeal, but also become increasingly oblique, invisible and ubiquitous. This is a theme that I shall seek to develop throughout this book. One reason why machine-learning algorithms play a crucial role in the transformation of intimacy is that they impact upon people’s lives through any number of technological devices – from smartphones to tablets to laptops. Siri, Alexa and an endlessly proliferating number of chatbots speak to us imparting ‘worlds of information’; the ‘godlike omnipresence’ of these virtual voices is, writes Lucas, ‘softened by a tone of relentless compliance’.

A third characteristic of algorithmic intimacy has to do with the kinds of social engagement, and oftentimes the forms of talk, carried out with the assistance of machine intelligence. Automated algorithmic communication is not like speech, even in those digital formats or on those software platforms which might appear most speech-like, such as chatbots and virtual personal assistants (VPAs). There are several key differences between chatbot talk and face-to-face conversation. For one thing, when people communicate face to face there is generally a routine compliance with the social norm to politely focus on what others are saying; people, for the most part, ‘give off the impression’ that they are paying attention to the other person. The American sociologist Erving Goffman labelled this the expectation of ‘mutual attentiveness’, arguing it was one of the ‘core norms’ holding modern societies together. Today, by contrast, as the process of machine learning enters a more advanced phase, pre-existing communication norms are increasingly undermined. Unlike face-to-face conversation, chatbots and VPAs do not demand conversational politeness. When talking to automated intelligent machines, people do not have to be considerate or amusing or even display close attention. Traditional core norms of mutual attentiveness remain, of course, vital to the arts of human conversation, for the most part. But these norms are less secure than they once were, and I have elsewhere posed the question of whether automated intelligent machines are re-engineering the manner in which people conduct face-to-face conversations.16 Might, for example, chatbot talk ‘bleed’ into everyday conversations? Might lukewarm emotion, or decreased levels of affect, displayed in chatbot talk spill over into face-to-face conversations? Might the manner in which people talk to chatbots and VPAs – ‘do this’, ‘order me that’ – negatively impact upon how they talk with intimate partners in everyday life?

Many critics fear that this is already the case with relationship tech, where whatever remains of ‘interaction’ is reduced to pre-scripted replies and predictive text technology. In this account, communication mediated by machine intelligence appears as one-dimensional, arriving on the recipient’s device as a relatively ‘closed message’ which rarely invites any deeper engagement. In the same manner that SMS messaging brought about widespread transformations in everyday language through texting, it has been argued that relationship tech might produce emotional forms of interaction that are the equivalent of emojis. Other critics warn that we are already witnessing the ‘algorithmic disinhibition effect’, where a lack of restraint escalates in virtual environments because people are missing empathy cues that typically occur in face-to-face interactions. As a consequence, various relationship and friendship apps have introduced AI-powered prompts asking users to reconsider sending potentially offensive or inappropriate communications. Tinder, for example, introduced its ‘Are You Sure?’ feature in 2021, in which AI software scans messages sent by users to detect possibly harmful language.

A final characteristic of algorithmic intimacy which should be noted in this opening chapter has to do with the multiplication of forms of emotional connection, affective association and relational affiliation. Whatever else the intimacy of yesteryear might have entailed, it was above all both selective and sequential. Traditional forms of intimacy were largely of the ‘you-are-the-one’ model of singular uniqueness. Such step-by-step, one-at-a-time standards governing intimacy were partly the result of the limitation of human powers in the context of face-to-face social encounters. The message of traditional cultures of intimacy was one resolutely focused on practices of selection and exclusion because individuals were unable to pay attention to everyone at once; it was neither feasible nor even conceivable to display the necessary kind of attentiveness to all the possible interlocutors who might have been vying for attention at the same time. The digital lifestyles of people today searching for experiences of intimacy are radically different. Today it is possible, thanks to automated intelligent machines, to join simultaneously in multiple conversations and multitudinous intimacies. Multiplication may indeed be the most conspicuous feature of algorithmic intimacy, whereby women and men seek to coordinate relationship possibilities without becoming integrated or overcommitted. Illusion or not, automated machine intelligence promises people as many kinds of intimacies as their emotional capacities can tolerate and their digital skills permit.

This multiplication of intimate relations is heightened by the fact that, as more and more connections (economic, social, cultural, personal) become automated, people need to continually download, update and integrate data into their lives and lifestyle activities. Intimate relations may be increasingly instantaneous, simultaneous and automated – that is, it may take less and less time to generate ‘connections with others’ – but, paradoxically, more time needs to be allocated to blend, fuse and amalgamate the data generated by these multiple smart machines. Automated intelligent machines involve ongoing reconfigurations of ‘personalized networking’ – as one moment people engage the capacity for the ‘semi-autonomous’ processing of information, communication and services, and the next moment engage in the ‘self-retrieval’ of data and personal information. This paradox has been outlined by Helga Nowotny in a dystopian vision where people become the ‘time slaves’ of automated global complexity. ‘We have to allocate time’, writes Nowotny, ‘to use what we are being offered – be it transport, communication, entertainment, or access to information, which needs to be downloaded and integrated.’ This much-sought-after integration is, however, largely illusory. Such is the complexity of automated feedback mechanisms, coupled with ever-new technological breakthroughs, that people discover – only slowly, in fits and starts, and sometimes painfully – that there is never enough time in the day to recalibrate or retrofit all the data accumulated in such automated systems. ‘The speed with which data is processed and recombined’, cautions Nowotny, ‘far exceeds the capacity of human perception. All we can do is register the effect.’ The great crisis of data fatigue is a key component of algorithmic intimacy and has the potential to become immensely damaging to personal life and social bonds, as well as the wider fabric of society.

The Argument of Algorithmic Intimacy

This book establishes the current scale, intensity, acceleration and impact of digitalization processes upon intimate relationships and especially how machine-learning predictive algorithms are remodelling human behaviour and transforming social practices. In this book, I will examine a vast array of predictive algorithms which are changing the boundaries of public and personal life today. Whilst automated intelligent machines impact many fields of activity in contemporary social life, the realm of intimacy is of special significance in some major respects. The promise of intimacy has come to mean the promise of deeper and more satisfying relationships at both the personal and cultural levels. But rather than the digital revolution fostering free and open communication in the interpersonal sphere, many people today worry that intimate relationships are rendered increasingly fractured, fragmented or empty as a result. My starting point is that AI-powered automated intelligent machines are intricately interwoven with how intimacy is being rewritten today, as well as with changes impacting personal life on a more general plane. Focusing on changes in intimate relationships and personal life is helpful for understanding how the world of predictive algorithms ‘weighs in’ on our expressions of emotion and our practices of care and commitment to ourselves, other people and the wider world.

The fast-changing contours of intimacy, eroticism, sexuality and love are enormously varied and hugely complex, and it is not feasible in a single volume such as this to attempt to deal with the many different aspects and intricate threads of change in intimacy arising in relation to digital technologies. In order to keep the book manageable, I address three basic types of intimacy impacted by machine intelligence. My focus here is on (1) relationship tech, (2) therapy tech and (3) friendship tech.

Relationship Tech, concerned with the ways algorithmic conceptions of intimacy are influencing sexual relations, dating, marriage, family, eroticism and love.

Therapy Tech, concerned with the ways algorithmic conceptions of personhood are influencing therapeutic mental health, specifically well-being, welfare and autonomy.

Friendship Tech, concerned with the ways algorithmic conceptions of companionship are influencing mutuality, interpersonal bonds, communication and sociability.

I won’t be looking at other fields of technological innovation impacting intimacy – at the rise of sex robots, for example. The socio-technical fields of relationship, therapy and friendship tech are impacting personal and cultural life in quite different ways and with significantly different consequences. My focus is not robotics, but advances in artificial intelligence more generally. The book examines how a burgeoning culture of AI is changing our ideas about intimacy, relationships, identities and machines. We shall encounter quantified sex and algorithmic dating apps, TikTok friends, chatbot therapists and automated intimacy profiling. Virtual personal assistants such as Siri and Alexa, too, form part of this story – as their ‘services’ help automate dealing with the stresses and strains of intimate relationships. We’ll encounter algorithms used by women and men to help decide with whom they wish to have sexual relations – as one-night stands or longer-term romances. We’ll find algorithms used by people to design a new sense of identity and ‘reinvigorate’ existing intimate relationships. We’ll pay close attention to the claims and counterclaims of tech entrepreneurs, software engineers and computer programmers; examine the responses of users of automated predictive algorithms; and confront the changes brought about by machine intelligence – including the unanswered questions they raise for intimate and personal life.