Deepfakes - Graham Meikle - E-Book


Graham Meikle

Description

What happens when we can no longer believe what we see? Show the AI technologies that create deepfakes enough images of a celebrity or a politician and they will generate a convincing video in which that person appears to say and do things they have never actually said or done. The result is a media environment in which anyone's face and image can be remixed and manipulated. Graham Meikle explains how deepfakes (synthetic media) are made and used. From celebrity porn and political satire to movie mash-ups and disinformation campaigns, this book explores themes of trust and consent as face-swapping software becomes more common. Meikle argues that deepfake videos allow for a new perspective on the taken-for-granted nature of contemporary media, in which our capacity to remix and share content increasingly conflicts with our capacity to trust. The book analyses how such videos deepen the social media environment in which the public and the personal converge, and in which all human experience becomes data to be shared. Timely, clear, and accessibly written, this is an essential text for students and scholars of media, communication, cultural studies, and sociology as well as general readers.


Page count: 258

Year of publication: 2022




Table of Contents

Cover

Title Page

Copyright Page

Acknowledgements

Introduction

Approaches

Overview

1 What Are Deepfakes?

‘Fuck the algorithm’

Nancy Pelosi’s drinking problem

Summary

2 Synthetic Porn

Ethics and visibility

A brief prehistory of deepfake porn

MrDeepFakes

Celebrity skin

Hard truths

Domesticating deepfake abuse

Summary

3 Remix Aesthetics and Synthetic Media

Unlimited Dada

What you are about to see is not real

Bringing back the dead

Brilliant disguise

Trumping the president

Life hacking

Summary

4 Manipulating Trust

Re-recording history

Reckoning with trust

Manipulating machineries of trust

Photoshop disasters

Manipulating texts, manipulating contexts

Summary

Conclusion

Legal mechanisms

Market mechanisms

Deepfake detection systems

Synthetic media literacy

Last thoughts

References

Index

End User License Agreement

List of Tables

Chapter 2

Table 1. Celebrities who have most often been the subject in MrDeepFakes videos as of Feb...

Table 2. The twenty all-time most-viewed videos on MrDeepFakes as of February 2022


DEEPFAKES

Graham Meikle

polity

Copyright Page

Copyright © Graham Meikle 2023

The right of Graham Meikle to be identified as Author of this Work has been asserted in accordance with the UK Copyright, Designs and Patents Act 1988.

First published in 2023 by Polity Press

Polity Press

65 Bridge Street

Cambridge CB2 1UR, UK

Polity Press

111 River Street

Hoboken, NJ 07030, USA

All rights reserved. Except for the quotation of short passages for the purpose of criticism and review, no part of this publication may be reproduced, stored in a retrieval system or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the publisher.

ISBN-13: 978-1-5095-4820-0

ISBN-13: 978-1-5095-4821-7 (pb)

A catalogue record for this book is available from the British Library.

Library of Congress Control Number: 2022934669

Typeset by Fakenham Prepress Solutions, Fakenham, Norfolk NR21 8NL

The publisher has used its best endeavours to ensure that the URLs for external websites referred to in this book are correct and active at the time of going to press. However, the publisher has no responsibility for the websites and can make no guarantee that a site will remain live or that the content is or will remain appropriate.

Every effort has been made to trace all copyright holders, but if any have been overlooked the publisher will be pleased to include any necessary credits in any subsequent reprint or edition.

For further information on Polity, visit our website: politybooks.com

Acknowledgements

Thanks for all kinds of things to Chris Atton, Eddy Borges-Rey, Mercedes Bunz, Steve Collins, Mary Kay Culpepper & Cullen Clark, Jennifer Fraser, Athina Karatzogianni, Tama Leaver, Louise Murray, Eduardo Navas, Michaela O’Brien, Didem Özkul, Michael Scott & Jorge Chamizo, Phaedra Shanbaum, Pieter Verdegem, Sherman Young, and the lockdown Fridays groupchat crew.

Thanks to Mary Savigar, Stephanie Homer, Justin Dyer, and everyone at Polity for their enthusiasm, patience, and professionalism.

Special thanks to Fin, Rosie, and Lola for being real.

Introduction

Did you see that video of Kim Kardashian? The one where she says, ‘I genuinely love the process of manipulating people online for money’? How about the clips of Facebook’s Mark Zuckerberg where he says that he has ‘total control of billions of people’s stolen data’, and that ‘the more you express yourself, the more we own you’? Or did you see the video of Donald Trump reading kids a Christmas story about a heroic reindeer that gets cheated out of winning an election? Maybe you saw the one of Vladimir Putin dismissing claims he was interfering in US democracy by saying, ‘I don’t have to – you are doing it to yourselves.’ Then again, he was also in that video with all the world leaders singing John Lennon’s ‘Imagine’ together. That one was nice. Perhaps you saw that clip of artist Marcel Duchamp already talking about Big Data way back in the 1950s. That was quite a find. Or that video of David Beckham appealing for support in nine different languages in the campaign to eradicate malaria? If you watch his lips, it looks like he can really speak them all. Or maybe you’ve heard whispers that there’s a website with hundreds of porn videos starring all the actors from the Marvel films. Those can’t be real. Can they?

Those are all examples of deepfakes: videos created or manipulated using artificial intelligence (AI) techniques. Show the software enough photos of Scarlett Johansson, Donald Trump, or yourself and it can generate entirely new video images of those people in simulated situations. Any individual can be shown saying things they have never said and doing things they have never done. Deepfakes are an unprecedented convergence of AI, social media, political communication, pornography, media manipulation, and remix aesthetics. The uses of deepfakes offer enormous creative potential but also threaten many different kinds of harm. Deepfakes can circulate through social media, so we may never be certain of a clip’s original source or of why it is in our feed. They can be harmful to all of us as individual viewers or audiences when they are created to be manipulative and deceptive. They may seek to confuse, exploit, and distort our attitudes and perceptions. Deepfakes are also often harmful to their targets: those whose faces are manipulated into new contexts, particularly pornographic ones, can experience irreparable damage to their reputations and personal lives. Deepfakes, moreover, may be harmful to society as a whole, further weakening trust in public communication and institutions, and enabling new levels of cynicism and mistrust. What happens when we can no longer trust that what we see is real?
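Many face-swapping tools work on a shared-encoder idea: one encoder learns features common to both faces, while a separate decoder per person learns to reconstruct that person, so a swap is "encode face A, decode with B's decoder". The following is a hypothetical toy sketch of that idea only; real systems use deep convolutional networks trained on thousands of photos, whereas here the "images" are small vectors and the weights are random stand-ins for trained ones.

```python
import numpy as np

# Toy sketch of the shared-encoder / per-person-decoder face-swap idea.
# Everything below is illustrative: linear maps with random weights stand in
# for trained neural networks, and 64-d vectors stand in for face images.

rng = np.random.default_rng(0)
dim, latent = 64, 8  # tiny "images" as 64-d vectors, an 8-d latent code

# Stand-ins for learned weights (in practice these come from training):
encoder = rng.normal(size=(latent, dim)) / np.sqrt(dim)  # shared by both faces
decoder_a = rng.normal(size=(dim, latent))               # reconstructs person A
decoder_b = rng.normal(size=(dim, latent))               # reconstructs person B

def swap(face, decoder):
    """Encode a face into the shared latent space, then decode it as the other person."""
    return decoder @ (encoder @ face)

face_a = rng.normal(size=dim)     # an "image" of person A
fake = swap(face_a, decoder_b)    # A's expression rendered with B's appearance
```

The point of the shared encoder is that the latent code captures pose and expression common to both training sets, so decoding with the other person's decoder keeps the performance while replacing the face.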

This book explains what deepfakes are, and explores the main ways they have been used so far. Deepfakes simulate people's faces and voices in believable ways. They are examples of the larger field of synthetic media: media created or significantly altered by AI techniques. Synthetic media is an expanding category:

There are many kinds of synthetic media, and the field is constantly expanding to include image and text generation, music composition, and voice production. Just as with legacy technologies, the use or application of synthetic media can be for the civic good or societal harm. There is increasing interest in using synthetic media for art, urban planning, environmental science, and health care. (Ajder & Glick 2021: 9)

So deepfakes should be understood first as part of this emerging field of synthetic media. They can also be understood as part of a broad spectrum of the production and manipulation of images, audio, and video (Paris & Donovan 2019). This can include sophisticated uses of AI machine-learning tools to generate or edit videos, images, or audio. But this spectrum can also include technically simple processes such as slowing down a video to make a speaker appear drunk or confused. For this reason, some commentators contrast the term deepfakes with other terms such as cheap fakes (Paris & Donovan 2019) or shallowfakes (Ajder et al. 2019) to emphasize the AI dimension to synthetic media. But it’s more useful to line these things up together on a spectrum than it is to draw distinctions based on technical production methods. The importance of seeing deepfakes as part of a much wider continuum of synthetic, manipulated, or remixed media is that this allows for both historical perspective and cultural context. There has been image manipulation for as long as there have been images, and we can take lessons from this to help us understand contemporary deepfakes.
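The "cheap fakes" end of that spectrum needs no AI at all. The notorious slowed-down video of a speaker is just frame resampling, which the following hypothetical sketch illustrates with integers standing in for video frames:

```python
# A "cheap fake" by frame resampling: playing a clip at 75% speed simply
# stretches the frame sequence. Integers stand in for frames here.

def slow_down(frames, factor):
    """Return the frame sequence stretched by `factor` (factor > 1 slows playback)."""
    out = []
    n = round(len(frames) * factor)
    for i in range(n):
        # nearest-neighbour resampling: pick the source frame for output position i
        src = min(int(i / factor), len(frames) - 1)
        out.append(frames[src])
    return out

clip = list(range(12))          # 12 original frames
slowed = slow_down(clip, 4/3)   # 75% speed: 16 output frames, some duplicated
```

No neural network is involved, yet the perceptual effect on a viewer can be as manipulative as a sophisticated synthetic video, which is why the spectrum framing matters.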

This is one of the first books to study synthetic media from humanities and social sciences perspectives, and synthetic media are going to be with us for a long time. Synthetic media are the first manifestations of using AI to create media content. Deepfakes are the first uses of synthetic media to attract attention. This is a fast-moving area, so some of the examples explored in this book may date quickly, while others are likely to persist as landmarks in synthetic media development for a long time to come. I hope the reader can look beyond the examples and find ideas in this book that they will be able to apply to new examples that emerge as deepfakes continue to develop. Mark Weiser once wrote: ‘The most profound technologies are those that disappear’ (1991: 94). By disappear, he meant they become taken for granted: electricity, for example, which we only notice if the power goes off. Weiser observed that such technologies ‘weave themselves into the fabric of everyday life until they are indistinguishable from it’ (1991: 94). AI and synthetic media are not yet at this point. So this is an important moment to catch deepfakes and examine them before they begin to seem normal, and before synthetic media become taken for granted.

Synthetic media are so far mostly used to create non-consensual porn, and mostly imagined to be used in politics and news. The very word deepfake works well to capture these aspects of the pornographic and the propagandistic. But synthetic media are not in themselves exclusively technologies of disinformation or sexual abuse. Synthetic media are not yet bound to a particular media form. They are not yet understood as fully a part of cinema, or of political communication, or of contemporary art, or of advertising. Porn is so far the dominant media form for deepfake applications, but this is contingent: there is no reason to think it will always be that way. In this, synthetic media should be seen in a longer-term perspective of digital media. Computer-generated imagery has never completely belonged to a particular media form, reaching various kinds of peak over several decades in music video, in Hollywood spectacle, and in videogames. In a similar way, the uses of synthetic media are still emerging and still up for grabs.

So some deepfakes act as showreels for their developers, others as warnings of the risks to democracy posed by synthetic videos of leaders with deceptive content. Some deepfakes are promotional content, whether for multilingual corporate presentations delivered by a single speaker, or for music videos for acts including Charli XCX, Paul McCartney, and Kendrick Lamar, or for celebrities such as Bruce Willis licensing advertisers to use his face without him having to show up to be filmed. Some are made as what-if entertainment, recasting Tom Cruise in American Psycho, Brad Pitt as Luke Skywalker, or swapping Heath Ledger’s Joker with Joaquin Phoenix’s. Some deepfakes are made as contemporary art, sponsored and exhibited by museums and festivals. Such processes are rapidly being domesticated and are by no means confined to high-end art museums. Powerful open-source software such as DeepFaceLab is freely available through the developers’ portal GitHub, although this requires some skill to use in a convincing way. And at the cruder end of the spectrum are smartphone face-swap apps, such as FaceApp or Zao, that allow users to remix faces or edit themselves into movie scenes, or novelty filters on everyday apps such as Snapchat.

In this book, I argue that deepfake videos are not just significant in their own right, they also offer important insights into the wider digital media environment of the 2020s. Deepfakes did not just happen to emerge in the time of social media, but are a product of those media. The limitless datasets of images, video, text, and audio that we have created through two decades of sharing on social media platforms have become raw material that enable machine-learning researchers to train AI systems to recognize, classify, and recreate images. With enough training, such systems can generate entirely new images: copies that have no original. Deepfakes expand the social media environment in which the public and the personal converge. They are a logical extension of those social media business models in which all human experience becomes content to be shared, data to be exploited. Thinking about deepfakes allows for a new perspective on the taken-for-granted nature of contemporary digital media in which our capacity to create and share increasingly conflicts with our capacity to trust.

Trust is central to ideas of communication and community. Trust is fundamental to developing and maintaining a sense of community through time and across space. Trust connects with ideas of truth, belief, faith. Both news and political communication are built through mechanisms for the manufacture and maintenance of trust. Meanings are not just sent, they are created together with others. Trust is a central element in this creation of meaning: belief in the reliability of a message; confidence in its truth; recourse perhaps to faith. Without trust, communication breaks down. In the networked digital environment of the 2020s, our ability to trust is confronted by the near-ubiquitous capacity to remix and share media material. Deepfakes are a significant development in the wider erosion of trust which is affecting experiences of political communication, news, and social media. A key problem for trust in the contemporary media environment is the ways in which consent is being withdrawn or becoming meaningless. Everyone’s face, everyone’s image, indeed all human experience, is now reusable as media content. This is an extension and expansion of the social media business model that was established in the first two decades of the twenty-first century. As it has developed, ever more aspects of private daily life have been appropriated as public data.

In an earlier book (Meikle 2016), I described social media as the sharing industry, noting how the continual emphasis on pushing users to share more photos, more friendships, more opinions, more emotions, and on further pushing users to recirculate those things as they were posted by their friends, had become a central business model of networked digital media. The sharing industry is typified by Meta/Facebook and Alphabet/Google, and by the subsidiary elements of their digital empires, Instagram and YouTube. The same remorseless logics of data creation, capture, and circulation are also engines of the other digital behemoths – Apple and Amazon and Microsoft, Tencent and Alibaba and Baidu – and of other leading players from Twitter to Spotify, Netflix to TikTok.

In a subsequent book about the internet of things, co-written with digital theorist Mercedes Bunz (Bunz & Meikle 2018), we traced how this business model was being expanded into ever more intimate parts of everyday experience: how we were no longer pushed to share just our photos and address books, but also our shopping lists, our daily step-counts, our sleeping patterns, our calorie and alcohol intakes, our hormone cycles and heart-rates. The point was that none of this had ever been mediated before, and that this was exactly why it was of interest and value to the sharing industry.

As this invasive practice of everyday life has become more familiar and more taken for granted, so the cultural lines for its acceptance have shifted. It has become normal, if not yet natural, to yield up ever more access to ever more aspects of our lives. The kind of consent that we give in clicking through screens is not meaningful, but once given it means that we did not say no. Surveillance is no longer just the domain of the state, but instead we are developing what David Lyon (2018) describes as a culture of surveillance. This share-and-share-alike environment of behavioural profiling and commercial targeting creates the conditions for deepfakes: anyone’s face is now just more zeros and ones; any new context in which that face can be put is just more content; anything at all is just there to be taken, to be used.

So deepfakes reveal something much more general about our contemporary digital condition. It is not just superstar actors who are being objectified and manipulated, but all of us. It is not just that we are reduced to our data, but worse than that: we are reduced to other people’s data. Deepfakes show us the contours of the environment in which we all now live. It is an environment in which resistance and consent to digital exploitation are both being made meaningless. An environment in which all human experience is just content and data to be manipulated and remixed.

Approaches

Deepfakes are about creating something new from existing material: about editing, manipulating, juxtaposing, connecting, counterpointing, or subverting images, audio and video. So when I began researching deepfakes, I first thought of this book as one about remix. I’ve always been fascinated by the kinds of creativity that are made possible by putting together things that already exist, particularly when this is done for satirical or subversive reasons (Meikle 2002, 2007), and I first wrote about remix in 2008 (Meikle 2008). One way of approaching deepfakes is to connect them with wider currents of remix creativity, and how this is now a central part of everyday digital life. Think of the ways that daily use of social media involves making new meanings by reworking found material, whether running pictures through filters or setting links in new contexts. Thinking of deepfakes as remixes connects them with prehistories of pranks and parodies, of hoaxes and satires, that go back decades. It also opens up darker prehistories of image manipulation, propaganda, and disinformation.

In this book, I use the terms remix and manipulate to describe the same processes of deepfake media. Both words describe ways of creating with found material, but I distinguish between them to discuss different topics in the following chapters. I write about remix when the focus is on art, creativity, education, satire, or entertainment. I write about manipulation when the focus is on disinformation or non-consensual porn. This distinction between remixed and manipulated media is to avoid ambiguity in the discussions of topics that might seem to have creative potential (remix) or potential for harm (manipulation).

Questions of remix became central to digital media and web cultures early in the twenty-first century. Ideas about remix cultures drew upon collage aesthetics and theories from throughout the arts and literature of the twentieth century: concepts of appropriation, of subversive juxtaposition, and of creative combination, from cinema editing, from visual arts, from modernist literature, and from improvisational music from jazz to DJ cultures. All of these cultural currents converged with digital technologies as the millennium turned, and the consolidation of the web and other digital media found expression in DIY cultures, tactical media, culture jamming, Web 2.0, and user-generated content, along with attempts to imagine new approaches to intellectual property that could accommodate these, such as Creative Commons (Navas, Gallagher & burrough 2015a: 1).

Theorists such as Lev Manovich (2001, 2006, 2007), Paul D. Miller (DJ Spooky) (2004), and Lawrence Lessig (2004, 2008) started to develop new approaches to cultural production, distribution, and reception that addressed questions of remix. Lessig built a series of influential arguments around grassroots creativity, digital technologies, and intellectual property. Riffing on the computing term ‘read-only’, he argued that the mass media environment of the twentieth century had been a read-only culture in which most media involved a small number of people talking to much larger audiences, and that the new century could instead be ‘both read and write’ (2004: 37). Like Lessig, Miller approached remix as creativity. Both considered the ethical dimensions of remix, sampling, or appropriation. In Lessig’s analysis, this was about intellectual property and the need to reform stifling copyright regimes in order to foster grassroots creativity (2008). In Miller’s, it was about cultural recognition: he called sampling ‘ancestor worship’ (2004: 65) and asked, ‘Who speaks through you?’ (2004: 37).

As early as 2006, Manovich could already observe that: ‘It has become a cliché to announce that “we live in remix culture”’ (2006: 209). The way beyond that cliché was to theorize remix more precisely, and a new area of remix studies began to coalesce around this project as a loose academic discourse orbiting concepts of creativity, cultural (re)production, ethics, activism, and copyright (Sinnreich 2010; Navas 2012; Ferguson 2015; Gunkel 2016). Much of the literature on remix is normative and celebratory, often crossing into advocacy (see, for example, many of the essays collected in Navas, Gallagher & burrough 2015 and 2018b). Lots of the remix studies literature is rooted in fan cultures; in defending practices of pastiche and détournement as both artistically and ethically valid; in arguing for the rights of the fan or grassroots creator over the corporation; in defending remixes against intellectual property laws or other conceptions of authorial creativity. ‘Appropriation is activism’, as one remix scholar puts it (Russell 2015: 217). But deepfakes push the limits of this discourse. A non-consensual deepfake porn video of an actor from a Marvel movie is recognizably within the video remix currents of the last twenty years, but can’t be defended with the same arguments about creativity, ethics, or intellectual property. So how can ideas of remix help us understand deepfakes? Think of it this way: to remix is to create with found material, and with deepfakes the found material is us.

To explain how deepfakes can be understood as remixes, and why this matters, I want to compare two important remix art projects, one from the emergent phase of remix cultures at the start of the twenty-first century – Rebirth of a Nation by Paul D. Miller – the other from the emergent phase of deepfakes at the start of the 2020s – Warriors by artist James Coupe. Rebirth of a Nation is a multimedia remix first performed in 2004 (http://djspooky.com/rebirth-of-a-nation). It’s a transformative edited version of D.W. Griffith’s 1915 feature film The Birth of a Nation, which was itself celebrated for its influential editing, its use of parallel narratives, and its array of novel cinematic transitions and shot techniques. Griffith’s multiple storylines trace the introduction of slavery to America, the US Civil War, the assassination of President Lincoln, and the rise of the Ku Klux Klan in the post-war Reconstruction period. A huge success in its day, Griffith’s film is largely unwatchable for many twenty-first-century viewers for its overt racism, its casting of white actors in blackface, and its depiction of the Ku Klux Klan as heroic. These have been controversial throughout the film’s history: in 1956, Situationists Guy Debord and Gil Wolman used The Birth of a Nation as an example to introduce their concept of détournement. They proposed that Griffith’s film should be subverted by ‘adding a soundtrack that made a powerful denunciation of the horrors of imperialist war and of the activities of the Ku Klux Klan, which are continuing in the United States even now’ (Debord & Wolman 2009 [1956]: 37). DJ Spooky’s remix project does exactly that.

Rebirth of a Nation presents a highly abridged version of Griffith’s three-hour film, edited to emphasize its most racist elements, bringing into focus the blackface performances and the depiction of the Klan. Animated overlays and on-screen diagrams highlight specific characters or impose new perspectives on scenes. The many intertitles through which Griffith delivers much narrative exposition are remixed to credit the film to Paul D. Miller with a PDM logo. There is a film version available as a DVD, but each live performance is also a remix of this version itself, and of its accompanying score, composed by DJ Spooky and recorded by the Kronos Quartet, which he remixes in real time onstage to draw on the moment and the space of the performance. Miller described his project to one film scholar as a ‘digital exorcism’ intended to drain Griffith’s images of their power by making them ‘absurd’ (McEwan 2015: 89). In its complexity and ambition, Rebirth is a powerful example of twenty-first-century remix cultures, bringing digital techniques and sensibilities to bear on a landmark text of early cinema. It is creativity that very explicitly works with found material: Griffith’s film. To see how deepfakes can also be understood as remix texts, but with a crucial twist, let’s compare this example with a more recent one.

Warriors is a deepfake art installation by James Coupe, exhibited at New York’s International Center of Photography in 2020 (http://jamescoupe.com/?p=2658). The project draws upon Walter Hill’s 1979 feature film The Warriors by remixing the faces of visitors to the exhibit into scenes from Hill’s movie. The original film tells the story of the eponymous street gang, who have to fight their way from one end of New York to the other after being falsely accused of murdering another gang leader, Cyrus. Incensed by Cyrus’s death, every street gang in the city is out for revenge on the Warriors, and much of the film is set-piece encounters with other gangs, many based around particular demographic characteristics, such as the all-Black Riffs or the all-female Lizzies. Visitors to Coupe’s installation are invited to use one of a set of iPads around the room to take photos of their own face, which are analysed and mapped by deep learning software. The visitors’ faces are then assigned to individual characters from different gangs in Hill’s movie, and appear remixed into scenes from the film on screens throughout the exhibition space.

Coupe’s project uses deepfake technology to explore questions of identity, visibility, profiling, and algorithmic bias. The artwork profiles each individual participant and reduces them to certain demographic characteristics. But the visitor has no way of knowing what criteria the system uses in assigning them to a particular group of characters from the movie. In a filmed interview, Coupe explains how his system uses the ImageNet dataset to categorize faces:

ImageNet is one of the most prominent image classifiers used by AI systems. It’s a dataset of over 14 million images that have been manually annotated by crowd workers. And when an AI model gets trained on ImageNet, it can look at a photo of a person and decide how to label them. It might see them as a cheerleader or a protester or a senator, based upon the photo’s similarity to the images in the categories defined by ImageNet. So, in other words, these systems reproduce historical bias at a mass scale. (WITNESS 2020a)
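The labelling step Coupe describes can be sketched as nearest-prototype classification: a face embedding is compared against prototype embeddings for each category, and the most similar prototype becomes the label. The categories and vectors below are invented for illustration; a real system would use embeddings from a network trained on ImageNet.

```python
import numpy as np

# Hypothetical sketch of labelling by similarity: the face embedding gets
# whichever category prototype it is closest to, reproducing whatever biases
# the prototypes encode. Vectors and categories here are made up.

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

prototypes = {
    "cheerleader": np.array([1.0, 0.1, 0.0]),
    "protester":   np.array([0.0, 1.0, 0.2]),
    "senator":     np.array([0.1, 0.0, 1.0]),
}

def label(face_embedding):
    """Assign the category whose prototype is most similar to the face."""
    return max(prototypes, key=lambda k: cosine(prototypes[k], face_embedding))

print(label(np.array([0.9, 0.2, 0.1])))  # nearest to "cheerleader"
```

The sketch makes the bias mechanism concrete: whatever images defined the prototypes in training decide how a new face is categorized, which is exactly the "historical bias at a mass scale" the quotation describes.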

I contrast Warriors and Rebirth of a Nation as representing two distinct historical moments of remix. The distance between these moments is fundamental to deepfakes. DJ Spooky’s artwork remixes existing media content with his own musical counterpoint, and recuts Griffith’s film to find and suggest new meanings. It’s an example of remix as creating-with-found-material, in which that found material is other media texts. James Coupe’s Warriors project also does this, remixing Walter Hill’s 1979 movie. But Warriors has a very important difference that is central to the digital media environment of the 2020s. Coupe is not just remixing a text, but is also remixing its audience. The viewer’s own face becomes the remixed text. The found material that is remixed and reused here is us.

Deepfake videos reveal how today’s networked digital media systems take each of us as individuals and process us into data. We are analysed, classified, and profiled with every daily interaction, and the systems that do this run on algorithmic processes that are not open or transparent. Such processes can reinforce and amplify existing social biases and inequalities (Noble 2018; Crawford 2021). What happens to the gallery visitor’s face in Coupe’s Warriors parallels the broader social uses of datasets of images in facial recognition systems. These systems encode existing social inequalities of gender, ethnicity, or class into technologies of visibility. In doing this, these processes remix us: they take our lives and identities as raw material and create us anew as profiled data subjects. What gets remixed now is not just old movies, but you and me.

Synthetic media involve sophisticated uses of AI and machine-learning technologies to create entirely new material or to rework existing material in ways that are not possible otherwise. Manipulated media are texts or images that have been edited, remixed, recontextualized. This may be for commercial reasons or for political ones, or both. There are long prehistories here, of propaganda and subversion, of collage and cut-and-paste cultures. Manipulated media need not involve sophisticated machine-learning systems or neural networks – a pair of scissors will do. I approach both synthetic and manipulated media in this book as practices of remix, of creating with found material.

In one of the first important research reports on deepfakes, Britt Paris and Joan Donovan (2019) map out a spectrum of technical sophistication that they describe as ‘The Deepfakes/Cheap Fakes Spectrum’. The more technical expertise and computing resources are required, the further an audio-visual example moves towards the deepfake end of the spectrum; the more technically simple and accessible, the more it moves towards the cheap fakes end. In this analysis, the deepfakes end of the spectrum includes:

virtual performances, such as those used in major Hollywood productions to reanimate dead actors;

face-swapping, such as the celebrity pornography that first produced the word deepfake;

voice synthesis, such as that used to resurrect US President Richard Nixon to deliver a speech about the Apollo 11 space mission that was never recorded in his lifetime; or