How To Be a Geek

Matthew Fuller

Description

Computer software and its structures, devices and processes are woven into our everyday life. Their significance is not just technical: the algorithms, programming languages, abstractions and metadata that millions of people rely on every day have far-reaching implications for the way we understand the underlying dynamics of contemporary societies. In this innovative new book, software studies theorist Matthew Fuller examines how the introduction and expansion of computational systems into areas ranging from urban planning and state surveillance to games and voting systems are transforming our understanding of politics, culture and aesthetics in the twenty-first century. Combining historical insight and a deep understanding of the technology powering modern software systems with a powerful critical perspective, this book opens up new ways of understanding the fundamental infrastructures of contemporary life, economies, entertainment and warfare. In so doing Fuller shows that everyone must learn 'how to be a geek', as the seemingly opaque processes and structures of modern computer and software technology have a significance that no-one can afford to ignore. This powerful and engaging book will be of interest to everyone interested in a critical understanding of the political and cultural ramifications of digital media and computing in the modern world.




Table of Contents

Title page

Copyright page

Acknowledgements

Biographies of Co-Authors

Introduction

Geek

Collaborators

Overview

1: The Obscure Objects of Object Orientation

Languages of Objects and Events

Abstraction, Errors and the Capture of Agency

Stabilizing the Environment

Encapsulation, Exceptions and Unknowability

Conclusion: Ontological Modelling and the Matter of the Unknown

References

Notes

2: Abstract Urbanism

Development of Simulation as a Scientific Practice

Abstraction as Urbanism

Diagram City

Logics

Logic Gates

References

Notes

3: Software Studies Methods

References

Notes

4: Big Diff, Granularity, Incoherence, and Production in the Github Software Repository

Coding Processes and Architectures

Anatomies of Forks

Generations of Versions

Events in the API

‘Post-FLOSS’ Archiving and the Archive as Engine

Diff as Infrastructure

Organizing Incoherence

References

Notes

5: The Author Field

Naming Names

A Genealogy of the Author Field in Word Processor File Formats

Into Identity

The Author Field and Doubt in Criminal Forensics

User ID

Case 1: Author

Case 2: JPEG Metadata

Metadetail

Lingering Certainty

References

Notes

6: Always One Bit More: Computing and the Experience of Ambiguity

The Experience of Number

MMORPGs and the Demand for Experience

Machinic and Distributed Experience of Computing

Ambiguity and Paradox

Fun Becomes Systemic

Machinic Funs

Minecraft ALU and CPU by theinternetftw

Megalithic Chuckles

References

Notes

7: Computational Aesthetics

Medium Specificity

Computational Construction

Ten Aspects of Computational Aesthetics

If – Then

References

Notes

8: Phrase

1

2

3

4

5

References

Notes

9: Feral Computing: From Ubiquitous Calculation to Wild Interactions

Interaction: The Extension of Computation beyond Calculation

The Interactive Paradigm: A Cybernetic Revival

From Cybernetics to Interactive Designs

Feral Computation and Design

Interaction, Society and New Distributed Systems

Some Closing Remarks

References

Notes

10: Just Fun Enough To Go Completely Mad About: On Games, Procedures and Amusement

Let's Play

Game as Block

Procedures

Agar.io

Twitch Plays Pokemon

Ambience and Dosage

References

Notes

11: Black Sites and Transparency Layers

Metaphor as Transparency Layer

Transparency Layers

Flat Designs

API

Black Sites

Total Dashboard

Architectures

References

Notes

End User License Agreement


Copyright page

This collection © Polity Press 2017

Introduction & Chapters 3, 6, 10, 11 & 12 © Matthew Fuller

Chapter 1 © Matthew Fuller & Andrew Goffey

Chapter 2 © Matthew Fuller & Graham Harwood

Chapter 4 © Matthew Fuller, Andrew Goffey, Adrian Mackenzie, Richard Mills & Stuart Sharples

Chapter 5 © Matthew Fuller, Nikita Mazurov & Dan McQuillan

Chapter 7 © Matthew Fuller & M. Beatrice Fazi

Chapter 8 © Matthew Fuller & Olga Goriunova

Chapter 9 © Matthew Fuller & Sónia Matos

The right of Matthew Fuller to be identified as Author of this Work has been asserted in accordance with the UK Copyright, Designs and Patents Act 1988.

First published in 2017 by Polity Press

Polity Press

65 Bridge Street

Cambridge CB2 1UR, UK

Polity Press

350 Main Street

Malden, MA 02148, USA

All rights reserved. Except for the quotation of short passages for the purpose of criticism and review, no part of this publication may be reproduced, stored in a retrieval system or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the publisher.

ISBN-13: 978-1-5095-1715-2

ISBN-13: 978-1-5095-1716-9 (pb)

A catalogue record for this book is available from the British Library.

Library of Congress Cataloging-in-Publication Data

Names: Fuller, Matthew, editor.

Title: How to be a geek : essays on the culture of software / [compiled by] Matthew Fuller.

Description: Cambridge, UK ; Malden, MA, USA : Polity, 2017. | Includes bibliographical references and index.

Identifiers: LCCN 2016042587 (print) | LCCN 2016058649 (ebook) | ISBN 9781509517152 (hardback) | ISBN 9781509517169 (pbk.) | ISBN 9781509517183 (Mobi) | ISBN 9781509517190 (Epub)

Subjects: LCSH: Software engineering–Psychological aspects. | Software engineering–Social aspects.

Classification: LCC QA76.76.H85 H69 2017 (print) | LCC QA76.76.H85 (ebook) | DDC 005.1–dc23

LC record available at https://lccn.loc.gov/2016042587

Typeset in 10.5 on 12 pt Sabon by Toppan Best-set Premedia Limited

Printed and bound in Great Britain by CPI Group (UK) Ltd, Croydon

The publisher has used its best endeavours to ensure that the URLs for external websites referred to in this book are correct and active at the time of going to press. However, the publisher has no responsibility for the websites and can make no guarantee that a site will remain live or that the content is or will remain appropriate.

Every effort has been made to trace all copyright holders, but if any have been inadvertently overlooked the publisher will be pleased to include any necessary credits in any subsequent reprint or edition.

For further information on Polity, visit our website: politybooks.com

Acknowledgements

‘The Obscure Objects of Object Orientation’ (with Andrew Goffey) was previously published in Penny Harvey, Eleanor Conlin Casella, Gillian Evans, Hannah Knox, Christine McLean, Elizabeth B. Silva, Nicholas Thoburn and Kath Woodward, eds., The Routledge Companion to Objects and Materials, Routledge, London, 2013, and in a variant form in Zeitschrift für Medienwissenschaft. With thanks to Claus Pias, Elizabeth B. Silva and Nicholas Thoburn.

‘Abstract Urbanism’ (with Graham Harwood) was previously published in Rob Kitchin and Sung Yueh Perng, eds., Code and the City, Routledge, London, 2016, and presented at the Programmable City workshop at Nirsa in Maynooth the previous year. Thanks to all at the Programmable City project.

‘Software Studies Methods’ was previously published in Jentery Sayers, ed., The Routledge Companion to Media Studies and Digital Humanities, Routledge, New York, 2016. With thanks to Wendy Hui Kyong Chun and Jentery Sayers.

‘Big Diff, Granularity, Incoherence and Production in the Github Software Repository’ (with Andrew Goffey, Adrian Mackenzie, Richard Mills and Stuart Sharples) was previously published in Ina Blom, Trond Lundemo and Eivind Røssaak, eds., Memory in Motion: Archives, Technology and the Social, Amsterdam University Press, Amsterdam, 2016. It follows research with the co-authors in the ESRC-funded Metacommunities of Code project, led by Adrian Mackenzie.

‘The Author Field’ (with Nikita Mazurov and Dan McQuillan) follows from work on the London Cryptofestival in November 2013. With thanks to Yuk Hui.

‘Always One Bit More: Computing and the Experience of Ambiguity’ was previously published in Olga Goriunova, ed., Fun and Software: Pleasure, Pain and Paradox in Computing, Bloomsbury, New York, 2015. With thanks to Olga Goriunova, Annet Dekker and Anette Weinberger.

‘Computational Aesthetics’ (with M. Beatrice Fazi) was previously published in Christiane Paul, ed., A Companion to Digital Art, Wiley-Blackwell, Oxford, 2016. With thanks to Christiane Paul and Beatrice Fazi.

‘Phrase’ (with Olga Goriunova), was previously published in Celia Lury and Nina Wakeford, eds., Inventive Methods, Routledge, London, 2013. With thanks to Olga Goriunova, Celia Lury and Nina Wakeford.

‘Feral Computing: From Ubiquitous Calculation to Wild Interactions’ (with Sónia Matos) was previously published in Fibreculture Journal, Sydney, 2011. With thanks to Sónia Matos and Andrew Murphie.

‘Just Fun Enough To Go Completely Mad About: On Games, Procedures and Amusement’ was previously presented as a paper at the St Petersburg Centre for Media Philosophy; a workshop organized by the ARITHMUS research project at Goldsmiths; and C-Dare at Coventry University. Thanks to Alina Latypova, Evelyn Ruppert, Baki Cakici, Hetty Blades and Scott de la Hunta.

An early version of ‘Black Sites and Transparency Layers’ was given as a talk organized by Robin Mackay of Urbanomic at Thomas Dane Gallery, London, March 2015. With thanks to John Gerrard. A later version was presented at the Academy of Fine Arts Nuremberg, with thanks to Kerstin Stakemeier; at the Interface Politics conference, Barcelona, April 2016, with thanks to Jaron Rowan and Pau Alsina; and as an inaugural lecture at Goldsmiths, with thanks to everyone there.

Thanks to Femke Snelting and Sarah Magnan of Open Source Publishing for the cover design, and to John Thompson and George Owers at Polity for an easy and affable collaboration. Many thanks to Fiona Sewell for brilliant diagnoses and surgery of the text.

Biographies of Co-Authors

M. Beatrice Fazi is a Research Fellow at the Humanities Lab, Sussex University, where she works on the cultural theory and philosophy of computing.

Andrew Goffey is co-author, with Matthew Fuller, of Evil Media and translator of books by Félix Guattari and Isabelle Stengers amongst others. He is Associate Professor in Critical Theory and Cultural Studies at the University of Nottingham.

Olga Goriunova is Senior Lecturer in the Department of Media Arts, Royal Holloway, University of London, and is editor and curator of many projects, and author of Art Platforms and Cultural Production on the Internet.

Graham Harwood is a member of the artist group YoHa and Senior Lecturer at Goldsmiths, University of London.

Adrian Mackenzie is Professor of Sociology at Lancaster University and author of numerous books including Wirelessness: Radical Empiricism in Network Cultures.

Sónia Matos is a designer, Lecturer in the Department of Design, University of Edinburgh, and a Research Fellow at the Madeira Interactive Technologies Institute.

Nikita Mazurov is a security and forensics researcher and Postdoctoral Researcher at the Living Archives Project, University of Malmö.

Dan McQuillan is Lecturer in Creative and Social Computing at Goldsmiths, University of London, and science and technical lead for Science for Change Kosovo.

Richard Mills is a Research Associate at the Psychometrics Centre, University of Cambridge, and a researcher in the statistical analysis of online data.

Stuart Sharples is a Senior Research Associate in the Department of Sociology, Lancaster University.

Introduction

The mode of knowing software is not yet established. We are still at a point where a critical language to understand the wider domain of computational culture is only beginning to ferment. There is a lot of writing about software, a lot of writing in and as software, and a lot of conjuration of alphanumeric strings that is done by and as software. Writing on software which works through a critical or speculative mode is not especially different from these things; it inhabits some of the same modes, idioms and intellectual habitats. It works, at one scale or another, with the same logical aggregates. Writing on software is thus at least partially inside software, even when presented as a paper book. The point in this collection of texts is to work into a few places within that condition.

Due to the complexity and variety of entities and processes that produce computational cultures, and the multiplicity of ways in which they can be understood, experienced, put into play, there is a movement between modes of writing here. Some of the texts are musings, almost dazed, gazing into the screen, across a slither of cable, or into the genealogies of logics in an attempt to discern a murmuring between registers. Others are more programmatic essays that attempt to find the intersections between certain combinations of parabola of enquiry and to draw some locations, speeds and movements out from them. Concatenations of numbers drawn from systems are arrayed and compared, alongside ideas about what numbers are.

To write about software cultures, then, is not to attempt to stamp an order on them, but to draw out some conditions that are particular to the way in which computational systems, and the different scales of their articulation, from minor aspects of interface to globally oriented mechanisms, can be brought into resonant communication with ideas and questions that extend them beyond the private conversations of technical experts into wider conjugations of ways of asking questions and making problems. Equally, much of the work here operates with the concepts of computer science as fundamentally cultural and political, as something core to contemporary forms of life and thus as open to theoretical cultural exploration as much as architecture, sexuality or economics might be. Indeed, a proposition of this book is that computing is aligned with all three of these, and other factors, in numerous ways and that tracing these conjugations can give us some information about contemporary life that would be unavailable otherwise. This is to say, too, that whilst the work to address software as culture is currently taking many forms, to reduce it solely to the stable sets of categories and objects known to existing disciplines is to miss the historical moment.

Geek

To be a geek is, in one way or another, to be over-enthused, over-informed, over-excited, over-detailed. There is an awkwardness born out of a superfluity of an extraneous kind of desire that becomes a febrile quiver in the face of an interesting problem. To be geeky is to have too much interest in something to the detriment of comportment, code spilling over into a gabble, a liveliness found in something that a more reserved protocol would keep under wraps or avoid. To be a geek is to be a bit too public with your enthusiasms, to be slightly unaware in turn that these thrills may, to others, rightly be dull as dust dehydrated with a special process of your own invention. Its mixture of juiciness and dryness, being able to get juiced on dryness, perhaps gets to the core of the problem. Frankly, it's a ludicrous position to be in – it is after all a bit bewildering to find this stuff so fascinating – but it is one whose perversity puts it in a strange relation of proximity to fundamental dynamics in contemporary life.

Such a condition leads many geeks into precariously powerful positions. Companies founded and staffed by geeks rule the world in many ways. They fill institutions and create commercial entities. One can be both ludicrous and lucrative, a maniac for certain details that remake the cosmos by their syntax, or that found a new grammar of relation between things. Geeks created the internet and fight over its meaning. They govern and subvert governance, or keep it ticking over with regular incremental upgrades. Geeks produce extravagant contraptions that cement their positions in the most comedically venal ways, but they also make machines with panache that auto-destruct in deserts and car parks and servers; sometimes intentionally. They make games that provide the grounds of individuation for millions, and then find the wealth it occasionally brings depressingly pointless but irresistible.

Given this, geeks may often mute themselves, try and pass as underwhelming. This is probably an adequate survival strategy in many circumstances. An alternative to it is a form of intellectual cosplay: learn to grow and stretch out your membranes to generate new kinds of sensitivity to the present. In the growth of such tender surfaces and depths, become slightly perturbing (as in the Dutch word gek, which shares a common etymological root with geek, meaning mad, or crazy), develop the capacity to feel the interaction of electromagnetic waves with your anorak of ideas. Work their ordering via logic into concatenations of circuits that then propitiate further ordering. Sense on into their interaction with other forces arrayed as fields and intensities at the scales of the social, the aesthetic, the ideational or economic. Learn to trace, cajole and heighten the fatal and illuminating crossings of wires and desires.

Such an unfolding and an invention of sensitivities means also some reflection on the figure of the geek, as something both powerful and flawed. The geek tragedy improves on the traditional mixture of these two qualities by adding the factor of technology. This variant of the tragedy has been there since that of Icarus, who flew too close to the sun and melted his wings. If the traditional reading of Icarus is a warning about the hubris of knowledge in relation to nature and the inviolable space of the gods, technology folds in both violation and knowledge as constitutional factors. Any attempt to ‘White Box’ technology as a simple Good that needs no examination misses this fundamental transformative characteristic. But contemporary technology is not simply an extension of a man – like wings of wood, wax and feather – which would be a purely mechanical effect. (At the very least, the physics of the nineteenth century, which was radically changed by the fields and waves of electromagnetism, should have curtailed that conjecture.) It is one whose characteristics are also profoundly mathematical and logical, bringing the force of abstractions into combination with materialities as diverse as labour, the primitive accumulation economies of extraction (for making metal-hungry devices), and into the complexities of contemporary cultural geopolitics that move from the interpersonal to the international with unaccustomed rapidity. The powers of electronics, and the ability to couple and intensify those with logical descriptions that create processes that traverse the social, economic and cultural, intensify in turn the powers of fascination and the ability to invent that are characteristic of geeks. We have yet to find ways to think through the full combination of these forces and scales of articulation. As luck, which begins every tragedy, would have it, however, we are in the midst of all this and have at least the capacity to geek out on the problematic. The texts collected here are different kinds of attempts to do just this.

Collaborators

But talking of geeks: crucial to many of the texts brought together in this book is that they are collaboratively written. My companions in their writing are a designer, programmers, computer scientists, security researchers, statisticians, social scientists, philosophers, cultural theorists, a reluctant artist, and combinations of these. Such collaborations bring and induce capacities, ideas, combinations and indeed data that would otherwise not come into communication with each other. Key to software studies approaches is that of engaging with texts, ideas, objects and systems from computer science and digital technologies in dialogue with those of cultural theory, digital media cultures, art, hacking and elsewhere. People who embody and enliven this dialogue in different ways are fundamental to this book and provide the corrugation necessary to make these papers stand up.

As a field, software studies has so far successfully remained minor in an era of the consolidation of disciplines – which survive, and strengthen, primarily as buttresses against institutional attrition. This ‘success’ of course is presided over by the irony that, since it emerges to address some of the key features of the last decades, these features are sometimes more or less attended to by other approaches, more disciplinary in nature, that then stand in for a more fundamentally inter- or anti-disciplinary form of knowledge. Here, elements of computational cultures operate as metonyms for what then is magically imaginable as a resolvable whole; or are addressed solely in terms of the categories, tools and formats of such disciplines without taking the risk of their transformation. It is the tendency towards the wholehearted embrace of the contact zone that characterizes software studies.

This is not a fully accomplished, uncomplicated or universalizable proposition, however. Each of the collaborations here has its own texture rather than a generalized or un-preworked wide-openness. The patterns that each essay brings together create resonances across distributions of knowledge and skill that produce different tensions and confluences. The allocation of terminology or inference, and the discussions and iterative decisions as to what constitutes a useful fulcrum for levering up and analysing the entity or processes under discussion, provide forces around which the text aggregates. Indeed, in some cases here, the operation of programs and scripts, their relations to systems and data, or the use of forensic techniques, provide another set of collaborators that in turn generate the texts here.

Overview

Collectively, this book presents an assembly of explorations of some of the fundamental infrastructures of contemporary life: computational structures, entities and processes that undergird, found and articulate economies, entertainment and warfare, to name simply their interactions with the ascendant holy trinity. This is to say that software is considered to be fundamentally cultural. Working with the objects and ideas of computer science as part of culture is to say that they take part fully in numerous ways of life; that they are inscribed by ideas and produce feelings and relations through and incidentally to these; and, as a scientific and technically generative set of fields, that it has numerous consistencies and textures within it that are both ‘internally’ significant within those fields, with their own codes, powers and sources of fascination, and, in these more or less abstract or autonomous conditions, able to create alliances with and patterns of excitement, disequilibria and amplification for other aspects of culture, asymmetrically ranging in their iterative acuity in relation to things as disparate as alphabets and institutions, as caresses and contracts. It is the interplay and development of the internal formulations that in many senses excites geeks. But it is their interleaving with these other aspects of software as culture that also changes the position of geeks today. Geeks are not only a ‘people to come’ – inhabiting a world ahead of its realization in ideas or in artefacts, or in selves, by means of what might retrospectively become their precursors – but are also brought into being as entities that inherently entail a kind of spasm of parody, of silliness. This, alongside the way in which the logical foundations of computing render it unstable at different scales, and are therefore often scaffolded, corralled, against this logical unfinishedness at others, is part of what initiates the figure of the geek, and the processes of computing open to their outsides at different scales.

But these logical foundations are also what allows for a remarkable continuity of form – through the generality of the abstract machine, a metamachine that computing puts into place – that sets up much of the tension and generativity in computing. Between abstraction and control, between replicability and translation, between generality and precision, computing forges and sometimes enforces new conjunctures, tensions and orderings. That is why software is both experiential and political, both iterated through the human yet markedly posthuman, as well as being multiply tied into the political mechanisms that seem only able to offer themselves in terms of a charade of humanism (as its mirror, as its tools, as its better self, as its means of knowledge and communication, as something that will iteratively deduce its secrets and its ailments, and so on). At the same time these political mechanisms curtail the humanism they claim to enhance, enforce seriality, tie it ever more closely into transactional mechanisms that translate to capital as a twinned and increasingly informational abstract machine where the individual, whom it ostensibly lauds, is merely figured as a nexus of contracts and flows of credit units and data. The essays collected in this book are aimed in different ways at prising open this condition.

They are published in thematic clusters: Histories, Entities, Aesthetics, Powers. The Histories section develops accounts of two key objects in computing history, object orientation and agent-based modelling. The chapter entitled ‘The Obscure Objects of Object Orientation’ charts a cultural and critical history of this fundamental form of programming, from the development of object-oriented programming in educational psychology and social democratic forms of workplace governance, to its present dominance in both the conceptual modelling and the development of software, and to its position in the global software authoring economy.

‘Abstract Urbanism’ provides a genealogy of key projects and thinking in the domain of urban modelling from the 1960s onwards, developing an account of the way in which it embodied critical and speculative approaches to mathematics, logic and city and social processes, with sometimes dubious and often creative significance. Modelling as a form of thought, as a means of experimentation with parameters and actions, has become one of the most subtle and advantageous computational operations. Its actual and ostensive power authorizes and encourages decisions. Understanding the computational part and the evidential grounding of such decisions, as well as the way they are socially located and conceptually and technically delimited, is crucial in developing the capacity of modelling for thinking through cities as condensers of social and technical forces.

The Entities section presents ways of addressing analyses of software cultures at different scales and via the different kinds of objects or entities that they entail. It starts with a chapter outlining a brief survey of ‘Software Studies Methods’. Since software is so heterogeneous and so multi-scalar, this chapter is necessarily incomplete as a survey, but it points towards a range of different ways of engagement, analysis and reflection that are being developed. Around these are numerous clusters of researchers, more or less in communication about what comes into being as mutual or distinct modes and points of enquiry. Shared amongst them is the sense that, whilst software is not absolutely distinct from other factors, such as, most obviously, hardware, it has sufficient specificity and significance in its own terms to demand enquiry and speculative development. Whilst terms such as algorithm, code, infrastructure and others may move in and out of the focus of different kinds of collective attention over time, numerous kinds of continuity are also developed by the field.

The section then goes on to two chapters that describe and analyse two exemplary kinds of object in contemporary software culture. ‘Big Diff’ takes as its object a large-scale distributed software repository, Github. Merging data science techniques with a theoretically grounded analysis, this essay provides both a reading of this major part of contemporary software development infrastructure and, it is hoped, a methodologically interesting piece of work in between the two approaches. Github, which is built on the version-control system Git, developed by Linus Torvalds, is particularly interesting as a site of production because it is symptomatic of the way in which some of the conditions of software culture, derived both from computational forms and from cultural tendencies that are manifest as conventions realized in software, drive forms of work, communication and value-creation in ways that are articulated through software's work on itself. The way in which Git and Github re-order the question of the nature of the archive is particularly symptomatic here. Like much on the internet today, Github is an architecture that centralizes the distributed structure of Git – a file structure that dramatically inverted a social taboo (that against forking, duplicating and varying code) as a means of establishing a collective resource. In turn, the site also provides the grounds for what this article suggests is a Post-FLOSS (free, libre and open source software) form of code. But software's work on itself is something that also figures as a subtext of the book as a whole, indicating both the developing complexity and maturity of aspects of software cultures and the degree to which it also becomes something reflexive.

The following chapter, ‘The Author Field’, takes a more microscopic object as its approach, developing a genealogy of a specific piece of metadata, as named in the title. Forensic and counter-forensic literature and techniques are used to examine the modulation between apotheosis and disappearance this term is subjected to by the technical operations of mundane technologies such as word processors. What are the ways in which forms of writing are produced, stabilized, repeated, transducted in such processing – and how does this articulate the figure of the user, author or other such category? In this sense, like some of the other texts gathered here, this chapter develops a line of enquiry also present in a previous book of mine which shares this one's subtitle, Behind the Blip: Essays on the Culture of Software, with which there is a strong sense not only of continuity of topic, but also of the widely expanding nature of the grounds of enquiry.

Shared also with that book is a set of attempts to address the aesthetic dimension of contemporary software cultures. Aesthetics is meant in the broad sense here, of addressing the means by which sensation and perception are involved, at deep levels, with the technical. As a field, software studies draws heavily from art, including the early software art movement, for some of its sensibility. Both are heavily involved in working through the different scales and modes of circulation of software as sites saturated in machinings of various kinds: interpretation which conjoins knowing and transforming, perceiving and doing; interaction with its fusing of experience and technique; hacking with its bringing together of cultural invention and mischief as a mode of subjectivation. Here, art is founded on the capacity of invention that relies on multiplicities of meaning and the absence of any final vindicating authority as the grounds for creation. Reflection on technological experience via technological means as they weave into and interfere with those of other scales is a means of taking them beyond the question of their intents or their purposes, to explore them as conditions of possibility.

The Aesthetics section therefore develops accounts of computational aesthetics, understood in the broad sense of involving both perception and composition. It proposes software aesthetics as something with specific idiomatic qualities and as something that mixes with and recomposes other cultural forms, and that in turn may condense through the powers of art at a scale that disrupts other forces of aggregation and conformity. But this is a difficult question, one that has to be taken as it comes; and with a dose of precision, whilst understanding that the modality of that precision itself interacts with what it works in the midst of.

The first chapter in this section, ‘Always One Bit More’, is a reading of theories of the experience of mathematical calculation, particularly found in the Intuitionist movement, an influence on the work of Alan Turing, as a means of reading computational artefacts and games such as Minecraft. Brouwer, a key theorist of Intuitionism, and, in certain texts, an advocate of mathematics as a mode of cosmic understanding more broadly, argues for calculation as an experiential form of life. This chapter argues that some of the ways in which computer games have been taken up and reworked by players echo this thick involvement in numerical transitions. Like other texts in this book, one of the underlying arguments is that technical, and mathematical, documents, ideas and movements have notions of culture baked into them. Part of the work of culture is to draw these out and to see how they exceed themselves.

Building on such work, the chapter entitled ‘Computational Aesthetics’ is a somewhat programmatic statement of the impact and significance of the qualities of computation as a set of aesthetic principles. Whilst ineluctably unfinished as a statement of such a vast set of things, the mode of analysis offered in this chapter proposes grounds for critical work with computing in a way that may encourage taking its specificity into consideration. The chapter proposes a subset of characteristic attributes of computing that have particular significance in establishing its aesthetic valences. Setting out the propositions for such a set of traits implies neither the re-description of the contents page of a basic manual of computer science in more lofty terms, nor simply the correlating of existing and established entities in aesthetic theory with those that establish the circuits of computing, but instead seeing what might pass through and change such a set of terms from both computing and aesthetics and in doing so establish their translation and reconstitution.

This specificity also of course involves complications of scales of experience and analysis, and their conjunction with a massive array of movements and dynamics moving across and as them. The following chapter, ‘Phrase’, develops such concerns in relation to the way in which computational forms interweave with other aspects of material life, ranging from dance to voting systems. How, in turn, might we think through these conjunctions as temporary or enduring aggregates that form themselves as, in a term taken from dance, phrases of movement that cross between things as movements of movement? Such transitions that aggregate then dissipate as a phrasal entity are particularly articulated in technological forms that thrive on conjunction.

Following such an approach, ‘Feral Computing’ elaborates an argument drawing on the question of ubiquitous and distributed computing in relation to the implications of recent developments in cognitive science, specifically those of the enactive, embodied, ecological and extended mind. Aesthetics is founded exactly where one is at this moment, but it also starts in the transitions across the gap between such a position and the movements and reflections that it takes to realize this state as a condition, and that therefore ungrounds itself. As computing moves into distributed forms that draw in spatial characteristics and conditions, and also starts to integrate with things that are more characteristically described in relation to thinking processes, it becomes wilder, begins to leak out of its boxes in ways that are similarly ungrounded. But calculation, experience and thought begin to have spatial characteristics. It is in their particular texture as such that they gain the capacity to move from a model of computing as merely a closed mechanism of calculation to one that grounds itself in interaction. It is interactivity as an undergirding drive in the history of computing that, in this text, provides the means for computing to go beyond itself and into a further stage.

The condition of interaction has numerous modes. It also involves some that are more lackadaisical, networks of dawdling, skiving, goofing off, mass enactments of parodies of productivity staged via purposefully pointless protocols. This condition is explored in ‘Just Fun Enough To Go Completely Mad About’, which develops an account of recent debates in software studies by a reflection on the role of algorithms and processes in games. By contrast to the common argument for theorization of games by militantly enthusiastic gamers (whom I admire but can't keep up with), it adopts a more spaced-out, daydreamy approach, one that takes pleasure in phenomena where logic, procedure and paradox are enacted by myriads of users. The two games discussed in this chapter are Agar.io and Twitch Plays Pokemon. Both embody, in different ways, an approach to the intercalation of rules, spaces, movement and interaction, amongst other things, that takes them at a tangent. The latter game, or more properly, an event within the conjunction of two gaming systems, was remarkable in playing out some of the more recondite pleasures of play. The embrace of the ludicrous, indeed a founding of logical relations under the fullness of its moon, is what defines and drives such games.

The pair of texts under the title Powers addresses the way in which computational forms establish new grammars and forces of power, but also work to revise and reconstitute prior imaginaries of virtue. Such virtues may be found in conditions not only of communication, of direct manipulation of and access to data, of clarity of design and structure, but also of the reinvention of a broader sense of relation between the ideal user, or subject, and the world at large.

‘Black Sites and Transparency Layers’ analyses the formations of transparency, openness and their manifestation in recent ‘flat design’ interfaces and the changed human–computer interaction (HCI) of the smartphone and tablet, the new architecture of Silicon Valley company HQs, and the relation of these forms to the black sites of surveillance centres, server farms and related places. The chapter proposes that each grammar of transparency set in place by such systems and imaginaries also installs certain conditions of opacity that, whilst not inevitable, are unreflected upon or disavowed. No system of transparency-formation is identical to any other, each bringing its own intentional and incidental plays of shadow and light that are also not immediately parseable into a calculus of black and white. The essay proposes some diagnostic means to find the ways in which contemporary design, at the level of interface, architecture and process, articulates what becomes transparent, to what or to whom and how, and in what ways the pleasures and reasons of transparency are enforced or avowed. At present, Silicon Valley is in a moment of its pomp. The ways, after the revelations of Edward Snowden (that follow on those of the preceding years of the Five Eyes and Echelon systems), in which it inscribes transparency as an official good, in devices ranging from its operating systems to its palaces, are therefore instructive.

Following this, ‘Algorithmic Tumult and the Brilliance of Chelsea Manning’ attempts to map the political-computational condition of the present in relation to the themes of posthumanism, related to the feminist work of Rosi Braidotti, and in particular to the kinds of agency, understanding and skill deployed and embodied by Chelsea Manning. The article proposes that there is a particular conjuncture between two universal systems of equivalence: the Turing Machine – which integrates all symbol-based processes and media, whilst re-composing them in ways that are peculiar to networked and computational digital media – and capital, which providentially allows for the imaginary transduction of all entities, states and processes into numerical and tradeable equivalents. The particular conditions of this conjuncture are primary drivers of contemporary life in which new means of action and subjectivation are both possible and urgently needed.

Part of the aim of this book then is to show that if computing is going to saturate everything, which seems tendentially, if by no means absolutely, to be one traceable trajectory for its unfolding, then as a field, as a science and as a manifold set of practices of numerous kinds, it can either be thought to reduce everything to its own terms and conditions, or generate a sufficiently complex relation to what it meshes with to recognize that it is, in turn, going to be mutated. One of the fields of such mutations is culture, something not reducible to the mere rubric of ‘digitization’. As a young science with, so far, only a few basic tenets, but with an enormous complexity of things arising from them, computing has much more to go through. This book attempts to map some of the ways this conjugation of culture and computing is underway.

1: The Obscure Objects of Object Orientation

Matthew Fuller and Andrew Goffey

Object orientation names an approach to computing that views programs in terms of the interactions between programmatically defined objects – computational objects – rather than as an organized sequence of tasks embodied in a strictly defined ordering of routines and sub-routines. Objects, in object orientation, are groupings of data and the methods that can be executed on that data, or stateful abstractions. In the calculus of object-oriented programming, anything can be a computational object, and anything to be computed must so be, or must be a property of such. Object-oriented programming is typically distinguished from earlier procedural and functional programming (embodied in languages such as C and Lisp respectively), declarative programming (Prolog) and, more recently, component-based programming. Some of today's most widely used programming languages – Java, Ruby, C# – have a decidedly object-oriented flavour or are explicitly designed as such, and whilst as a practice of programming it has some detractors, it is deeply sedimented in both the thinking of many computer scientists and software engineers and in the multiple, digital-material strata of contemporary social relations.
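By way of a purely illustrative aside (a minimal sketch in Python, a language not discussed in this chapter but one with a comparably object-oriented flavour; the counter is invented for the purpose), an object in this sense is a stateful abstraction: a grouping of data with the methods that can be executed on that data, the state persisting between calls.

    # A minimal stateful abstraction: data plus the methods that act on it.
    class Counter:
        def __init__(self, start=0):
            self._value = start        # data held as the object's state

        def increment(self, step=1):
            self._value += step        # a method that modifies that state

        def value(self):
            return self._value         # controlled access to that state

    c = Counter()          # an object: an instance of the class Counter
    c.increment()
    c.increment(2)
    print(c.value())       # prints 3; the state persists between method calls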

This essay explores some aspects of the turn towards objects in the world of computer programming (a generic term that incorporates elements of both computer science and software engineering). It asks what powers computational objects have, what effects they produce and, more importantly perhaps, how they produce them. Seeking to situate the technical world of computer programming in the broader context of the changing composition of power within contemporary societies, it suggests a view of programming as a recursive figuring out of and with digital materials, compressing and abstracting relations operative at different scales of reality, composing new forms of agency and the scales on which they operate and create through the formalism of algorithmic operations. This essay thus seeks to make perceptible what might be called the territorializing powers of computational objects, a set of powers that work to model and re-model relations, processes and practices in a paradoxically abstract material space.1

Computation has seen broad and varied service as a metaphor for cognitive processes (witness the ongoing search for artificial intelligence), on the one hand, and as a synecdoche of a mechanized, dehumanized and alienated industrial society on the other – flip sides of the same epistemic coin. As such it might appear somewhat divorced from the rich material textures of culture and a concern with the ontological dimension of ‘things’. Indeed, with its conceptual background in the formalist revolution in mathematics initiated by David Hilbert, computing may not seem destined to tell us a great deal about the nature of things or objects at all. In the precise manner of its general pretension (insofar as one can talk about formalism ‘in general’) to universality, to be valid for all objects (for any entity whatever, in actual fact), formalism is by definition without any real object, offering instead a symbolic anticipation of that which, in order to be, must be of the order of a variable. Objects and things, the variety of their material textures, their facticity, tend to transmogrify here into the formal calculus of signs.

Little consideration has been given to the details of the transformative operations that computer programming – always something a bit more sophisticated than simple ‘symbol manipulation’ – is supposed to accomplish, or to the agency of computational objects themselves in these transformations. In this article, we take up this question through a brief consideration of object-oriented programming and its transformative effects. We read computational formalism through the techniques and technologies of computing science and software engineering, to address object orientation as a sociotechnical practice. As such a practice, one that bears a more than passing resemblance to the kinds of means of disciplining experimental objects and processes that are described by Andrew Pickering, the effective resolution of a problem of computation is a matter of the successful creation, through programming, of a more or less stable set of material processes – within, but also without, the skin of the machine.2

Languages of Objects and Events

To understand the transformative capacities of computational objects entails, in the first instance, a consideration of the development of programming languages, for it is with the invention of programming languages that the broad parameters of how a machine can talk to the outside world – to itself, to other machines, to humans, to the environment and so on – are initially established. Languages for programming computers, intermediating grammars for writing sets of statements (the algorithms and data structures) that get translated or compiled into machine-coded instructions that can then be executed on a computer, are unlike what are by contrast designated as natural languages. They are different not only in the sense that they have different grammars, but also in being designed, in a specific context, as a focused part of a particular set of sociotechnical arrangements, a constellation of forces – machines, techniques, institutional and economic arrangements and so on. A programming language is a carefully and precisely constructed set of protocols established in view of historically, technically, organizationally etc. specific problems. Usually designed with a variety of explicit considerations – mostly technical but sometimes aesthetic – in mind, programming languages themselves nevertheless register the specific configuration of assemblages out of which they emerge and the claims and pressures that these generate, even as they make possible the creation of new assemblages themselves. The computer scientist, it might be said, ‘invents assemblages starting from assemblages which have invented him [sic] in turn’.3

The project of object orientation in programming first arises with the development of the SIMULA language. SIMULA was developed by Kristen Nygaard and Ole-Johan Dahl at the Norwegian Computing Centre in the early 1960s.4 As its name suggests, it aimed at providing a means both to describe – that is, program – a flow of work, and to simulate it. The aim of such simulation was to bring the capacity to design work systems – despite their relative technical complexity – into the purview of those who made up a workplace. As such, the project had much in common with other contemporaneous developments in higher-level computing languages and database management systems, which aimed to bring technical processes closer to non-specialist understanding.5 Equally, they were a way of bringing formal description of the world out from under the skin of the computer. SIMULA aimed to bring the expert knowledge of the programmer into alliance with the decision-making systems of the workplace in which a social democratic version of workers' councils6 guided the construction of the staging of work. This tendency arose from what would later become known as participatory design, but it also had its roots in a version of operational research,7 in which the analysis of work was carried out in order to reduce stressful labour.8

The first version of SIMULA, SIMULA 1, was developed not with a view to establishing object orientation as a new format for programming languages per se but rather as an efficacious way of modelling the operation of complex systems. Simulation of work processes becomes desirable as systems develop to a level of complexity that makes understanding them a non-trivial task, and in SIMULA, it was a task that was initially understood as a series of ‘discrete event networks’9 in which inventory, queuing, work and materials processes could be modelled, and in which there would be a clear correlation between the features of the work process and the way in which it was modelled as an ensemble of computational objects. To produce such epistemically adequate models of complex processes, the tacit ontology of discrete event networks utilized by SIMULA (the ‘network’ was eventually dropped) was one in which real-world processes were understood in terms of events – or actions – taking place between entities, rather than as permanent sets of relationships between them. The language was supposed itself to force the researcher using it to pay attention to all aspects of the processes being considered, directing such attention in appropriate ways.
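What modelling a process as a network of discrete events involves can be indicated schematically (the sketch below is in Python and is in no sense SIMULA code; the workstation and jobs are invented for the purpose): the simulation jumps from event to event rather than advancing in fixed steps, and each event between entities may schedule further events.

    import heapq

    # A hypothetical single workstation serving queued jobs: an arrival event
    # schedules the corresponding finish event, and the clock jumps between events.
    SERVICE_TIME = 2.0
    events = [(0.0, "arrive", "job-1"), (1.0, "arrive", "job-2")]
    heapq.heapify(events)
    busy_until = 0.0

    while events:
        time, kind, job = heapq.heappop(events)       # next event in time order
        if kind == "arrive":
            start = max(time, busy_until)             # queue if the station is busy
            busy_until = start + SERVICE_TIME
            heapq.heappush(events, (busy_until, "finish", job))
            print(f"t={time:.1f}  {job} arrives; service starts at t={start:.1f}")
        else:
            print(f"t={time:.1f}  {job} finishes")

Even at this small scale the correspondence described above is visible: the queue, the workstation and the jobs of the modelled work process appear directly as the entities and events of the program.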

Initially the language had little impact for general programming; indeed, it wasn't until SIMULA67 was developed that the technical feature that has since been so important to object-oriented programming – the capacity to write programs in which entities combine data and procedures and retain ‘state’10 – was first sketched out, in terms of the possibility of specifying ‘classes’ and ‘subclasses’, each of which would be able to initiate or carry out particular kinds of actions. SIMULA itself did not really see full service as a programming language, because the computing resources required to enable it to do the work required of it meant that it was not especially effective. At such an early stage in the history of programming and programming languages, the possibility of having a language that would be materially adapted to describing real-world processes inevitably rubbed up against the practical obstacle of writing efficient programs.
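The feature in question can be glossed with a small example (given here in Python rather than SIMULA 67 syntax, and purely as an illustration under that assumption): a class combines data and procedures and retains state between actions, and a subclass specifies a more particular kind of entity with its own behaviour.

    class Station:
        def __init__(self, name):
            self.name = name
            self.handled = 0                          # state retained across actions

        def handle(self, item):
            self.handled += 1
            print(f"{self.name}: handled {item} (total {self.handled})")

    class InspectionStation(Station):                 # a subclass of Station
        def handle(self, item):
            super().handle(item)                      # reuse the parent class's action
            print(f"{self.name}: inspected {item}")

    line = [Station("assembly"), InspectionStation("quality")]
    for station in line:
        station.handle("part-7")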

However, the innovations of SIMULA in the matter of providing for structured blocks of code – called classes – that would eventually be instantiated as objects were taken up a decade and a half later in the development of C++, a language developed in part to deal with running UNIX-based computational processes across networks,11 something which later also became a driver in the development of a further object-oriented programming language, Java. Its tacit ontology of the world as sequences of events (or actions), the design principles built into it, and its links with a different imaginary and enaction of the organization of labour are indicative of a possibility of a programming that is yet to come.

A variant history of object orientation can be told through the development of the Smalltalk language at Xerox PARC, Palo Alto, in the 1970s and 1980s, under the leadership of Alan Kay.12 Smalltalk frames the nature of objects through the primary question of messaging between the entities or objects in a system. In fact, it is the relations between things that Kay ultimately sees as being fundamental to the ontology of Smalltalk.13 Objects are generated as instances of ideal types or classes, but their actual behaviour is something that arises from the messages passed from other objects, and it is the messages (or events; the distinction is relatively unimportant from the computational point of view) that are actually of most importance. Here, although the effective ordering in which computational events are executed is essentially linear, such an ordering is not rigidly prescribed and there is an overall sense of a polyphony of events and entities in dynamic relation. Kay adduces a rationale for the dynamic relations that Smalltalk sought to construct between computational objects by referring to the extreme rigidity and inflexibility of the user interfaces that were available on the mainframe computers of the time: an approach organized around computational objects allows for a more flexible relationship between the user and the machine, a relationship that would later be glossed by one of Kay's colleagues as allowing for the creative spirit of the individual to be tapped into.14
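The emphasis on messaging can be caricatured in a few lines (a hedged sketch in Python, whose syntax and semantics differ considerably from Smalltalk's; the account object is invented for the purpose): an object is generated as an instance of a class, but what it does is a response to the messages other objects send it, and the sender needs to know nothing beyond the name of the message.

    class Account:
        def __init__(self):
            self.balance = 0                   # state private to the object

        def deposit(self, amount):
            self.balance += amount
            return self.balance

    def send(receiver, message, *args):
        # Deliver a named message; how to respond is left to the receiver.
        return getattr(receiver, message)(*args)

    account = Account()
    print(send(account, "deposit", 10))        # prints 10
    print(send(account, "deposit", 5))         # prints 15

In Smalltalk itself this would be written as a message-send such as account deposit: 10, with the class of the receiver determining which method runs.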