Devil in the Stack

Andrew Smith

Description

Throughout history, technological revolutions have been driven by the invention of machines. But today, the power of the technology transforming our world lies in an intangible and impenetrable cosmos of software: algorithmic code. In a world increasingly governed by technologies that so few can comprehend, who, or what, controls the future? Devil in the Stack follows Andrew Smith on his immersive trip into the world of coding, passing through the stories of logic, machine learning and early computing, from Ada Lovelace to Alan Turing, and up to the present moment, behind the scenes into the lives, and minds, of the pioneers of the 21st century: those who write code. Smith embarks on a quest to understand this sect in what he believes to be the only way possible: by learning to code himself. Expansive and effervescent, Devil in the Stack delivers a portrait of code as both a vivid culture and an impending threat. By turns revelatory, unsettling and joyously funny, this is an essential book for our times, of vital interest to anyone hoping to participate in the future-defining technological debates to come.




 

Also by Andrew Smith

Moondust

Totally Wired

Andrew Smith has worked as a critic and feature writer for the Sunday Times, the Guardian, the Observer and The Face, and has penned documentaries for the BBC. He is the author of the internationally bestselling book Moondust, about the nine remaining men who walked on the moon between 1969 and 1972, and Totally Wired. He was raised in the UK and currently lives in Brooklyn.

First published in the United Kingdom in 2024 by Grove Press UK, an imprint of Grove Atlantic

First published in the United States of America in 2024 by Atlantic Monthly Press, an imprint of Grove Atlantic

Copyright © Andrew Smith, 2024

The moral right of Andrew Smith to be identified as the author of this work has been asserted by him in accordance with the Copyright, Designs and Patents Act of 1988.

“Aubade,” copyright 2014 by Philip Larkin; from The Complete Poems by Philip Larkin. Used by permission of Faber and Faber Ltd. All rights reserved.

Drawings throughout by Ron Jones.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of both the copyright owner and the above publisher of the book.

No part of this book may be used in any manner in the learning, training or development of generative artificial intelligence technologies (including but not limited to machine learning models and large language models (LLMs)), whether by data scraping, data mining or use in any way to create or form a part of data sets or in any other way.

1 3 5 7 9 8 6 4 2

A CIP record for this book is available from the British Library.

Trade paperback ISBN 978 1 80546 300 9

E-book ISBN 978 1 80471 081 4

Printed in Great Britain

Grove Press UK

Ormond House

26–27 Boswell Street

London

WC1N 3JZ

www.groveatlantic.com

 

 

Earlier physicists are said to have found suddenly that they had too little mathematical understanding to cope with physics; and in almost the same way young people today can be said to be in a situation where ordinary common sense no longer suffices to meet the strange demands life makes. Everything has become so intricate that mastering it would require an exceptional intellect. Because skill at playing the game is no longer enough; the question that keeps coming up is: can this game be played at all now and what would be the right game to play?

—Ludwig Wittgenstein, 1937

I wish to God these calculations had been executed by steam.

—Charles Babbage

Contents

Prologue 0: If

Prologue 1: Then

CHAPTER 1

Revenge of the SpaghettiOs

CHAPTER 2

Holy Grail

CHAPTER 3

PyLadies and Code Freaks

CHAPTER 4

Minutely Organized Particulars

CHAPTER 5

The Real Moriarty

CHAPTER 6

The New Mind Readers

CHAPTER 7

Theories of Memory

CHAPTER 8

Hilarity Ensues

CHAPTER 9

Catch 32

CHAPTER 10

A Kind of Gentleness

CHAPTER 11

The Gun on the Mantelpiece

CHAPTER 12

Code Rush

CHAPTER 13

Enter the Frankenalgorithm

CHAPTER 14

Algorave?

CHAPTER 15

A Codemy of Errors

CHAPTER 16

Do Algos Dream of Numeric Sheep?: An AI Suite

CHAPTER 17

Apologies to Richard Feynman

CHAPTER 18

A Cloud Lifts

CHAPTER 19

Strange Loops and Abstractions: The Devil in the Stack

Acknowledgments

Select Bibliography

Index

Notes & Sources available at andrewsmithauthor.com/devil/notes

Prologue 0: If

I remember the moment code began to seem interesting to me. It was the tail end of 2013, and in the excitable tech quarters of New York, London and San Francisco, a cult was forming around an obscure “cryptocurrency” called Bitcoin. We know the story well by now. The system’s pseudonymous creator, Satoshi Nakamoto, had appeared out of nowhere, dropped his ingenious plan for near-untraceable, decentralized money into the web, and then vanished, leaving only a handful of writings and 100,000 lines of computer code behind. Who would do such a thing? And why? Like a lot of mesmerized onlookers, I decided to investigate.

There didn’t seem much to go on, until a chance encounter in a coffee line at a Bitcoin meetup in the East End of London opened me to something new. The man I met was a Finnish programmer. He told me that while Satoshi had taken pains to cover his tracks, there were clues in his code if you knew how to see them. There were also antecedents: the Bitcoin mechanism was a work of genius, but its creator built on the groundwork of others, some of whom he had contacted during development. My adviser pointed me to an Englishman named Adam Back, one of a loose group of cryptographer hacktivists who came to prominence in the 1990s as self-styled “cypherpunks.” I set off on the cryptographer’s trail.

The cypherpunk agenda, when it appeared toward the end of the eighties, was at once simple and complex. Humanity’s impending lurch online would be an epochal gift to anyone with a political or economic incentive to surveil, the rebel cryptographers warned. We stood at a fork in the technological road, with the broadest path pointing to an Orwellian future of industrial scale intrusion and forfeiture of privacy, in which no facet of our lives was too intimate to be colonized by anyone with the right programming skills. To repel the bad actors massing to swarm cyberspace, citizens would need tools in the form of cryptographic software. Cypherpunks aimed to supply these tools.

Essential to online privacy would be a payment system that mimicked the anonymity of cash by making transactions hard to trace. Feverish effort went into designing such a system, but the task was daunting. Code derives its power from being digital, at root numeric and therefore exactly and infinitely reproducible. How would you make digital money that could be transferred at will but not copied; whose electronic movements were registered on a ledger but without recourse to a corruptible central authority? By the end of the 1990s most cypherpunks had abandoned the quest as Sisyphean, even if some bright ideas were floated along the way. And the travails were not wasted. Eleven years after Adam Back described a cryptographic spam-filtering algorithm called “hashcash” in 1997, it would star in Satoshi’s dazzling system. Now I learned that Back had been contacted by the Bitcoin inventor—anonymously, he said—to arrange attribution for the prior work.

Back was living in Malta, so we spoke on the phone. He was friendly and, on the surface at least, open, claiming to be as puzzled as me by the mystery of Bitcoin’s founder. Yet the longer we spent sifting the evidence, the more surprised I was to feel my interest pivot from Bitcoin’s wraith-like founder to the cosmos of code “he” inhabited. Up to that moment I knew nothing about computer code save that it consisted of light-speed streams of binary numbers a microprocessor could interpret as instructions. How a human engaged this datastream was obscure to me; how a deluge of numbers became action in physical space was beyond my imagining. Yet here were details being proffered. And they were dumbfounding.

I heard that coders used a range of “languages” to communicate with the machine, and that there were thousands of these human-computer creoles, including a few dozen major ones, each with its own culture, aesthetic and passionate claque of followers. By this account programming languages were not only communication tools, but also windows into the world, ways of seeing and being with definable and sometimes conflicting epistemological underpinnings. When a programmer aligned with a language, all this baggage came with it. And when they sent a program into the world, the baggage went there too. For these reasons there could be rivalry bordering on animus between communities—a tension coders half-jokingly referred to as “religious wars” on the grounds that no one involved was ever going to change their mind or attachment. If the names of these languages tended to suggest either roses or unconscionably strong cleaning products (Perl, Ruby, COBOL, PHP, Go, Fortran…), their whimsy emerged from an electrifying “open-source” creative model through which coders provided their skills to the community, usually unsung and uncredited and with all results shared, free and owned by all in a way one prominent business executive decried as “communism.”

Satoshi chose a language called C++ for the writing of Bitcoin. This was because the “C” family of languages offered little by way of shortcuts or safeguards for the naive or unwary. A no-frills approach made C++ harder to use than most alternatives but consequently faster and more efficient—important in a system its creator hoped would become ubiquitous. One programmer likened C to a shotgun, powerful if it didn’t blow your foot off, while Back and others discussed the Bitcoin code like learned exegetes, citing evidence that C++ was not Satoshi’s “native” language, or that he had learned to code in the 1980s, just as one might with a literary text. Programmers sprinkle comments throughout their code for the edification or amusement of peers, and Satoshi had been found to wander between US and British spellings, suggesting that “he” could be “they.” By the end of my inquiry I tingled with questions about the coder’s singular art.

Neither I nor anyone else figured out who “Satoshi Nakamoto” was at that time—at least not in public. But for me the story didn’t end there. I’d written about Bitcoin and blockchain and imagined my brush with code done until, a couple of months later, someone reached out on Twitter. They knew who “Satoshi Nakamoto” was, they told me, and none of the individuals discussed so far fit the bill. On the contrary, “he” was a trio consisting of two Russian coders led by an Irish mastermind who’d studied computing at a Siberian technological university and now worked for Russian state media. My informant purported to be a Brazilian male model whose previous girlfriend dated one of the Russians. He didn’t know much, he said, but could provide a first name for one Russian and full ID for the Irishman.

The latter was real. And active on social media. Wary but intrigued, I settled in to watch for clues. My Twitter source also existed offline, but when I asked to speak or meet, he stalled before following Satoshi back into the microcosmic ether. Aspects of his story made sense. One of Bitcoin’s key promises to supporters—of the political left and right—was to upend the global financial order, an upheaval likely to serve Russia well. For multiple reasons Bitcoin was easier to credit to a team than an individual. And yet crucial details of what I heard were impossible to affirm or rebut. Needing perspective, I called an ex-colleague who now edited BBC TV’s flagship current affairs program. Like me he was cautious, suspicion complicated by inability to see a motive for such an elaborate hoax, if that’s what this was. We arranged for me to meet with Thomas Rid, then professor of Security Studies at King’s College London, whose research and writings had established him as an influential thinker on cybersecurity. Rid pledged to bring a Russia expert associated with the British intelligence agency, GCHQ.

We met in the windswept grounds of Somerset House on the north bank of the Thames, the kind of assignation point favored in the novels of John le Carré and Graham Greene, Cold War thrillers I’d devoured as a child and was disconcerted to find returning to currency. The experts arrived and I described what I was seeing. They listened, probed, tried to find precedents for this situation. None fit cleanly. Rid’s companion saw Russian markers in the choice of a “Brazilian model” as conduit: someone not so exotic as to be absurd but distant enough to resist validation. This meeting took place early in 2014, when little was yet understood about Vladislav Surkov, the Americanophile former theater student and advisor to Russian president Vladimir Putin, a fan of Black Sabbath and Tupac Shakur who could quote Ginsberg from memory while treating efforts to demolish American culture as an intellectual parlor game. To this end Surkov had developed an infowar technique called reflexive control, by which one entered the mind of a foe completely enough to feel their anxieties, fears, longings and delusions, learning to tweak these vulnerabilities until the target was ready to turn on itself. In this scheme, anything that undermined trust in institutions like the media was worth pursuing. Planting stories to be debunked was an anchor of Surkov’s strategy.

But the big picture wasn’t focused yet. We decided the Russian secret service was directly or indirectly behind the Twitter approach, aiming to hone the mystique of a leader who had made his country a poster child for kleptocratic dysfunction. So I reported back to the BBC and we let the matter drop. Only later did I see how my focus on Bitcoin blinded us to the story staring us in the face and primed to define the next decade. Just as the cypherpunks had predicted, computer code was seeping unchallenged and at an accelerating rate into every area of our existence. Within a few short years almost nothing any of us did would happen without it. And the world didn’t seem to be getting better.

Prologue 1: Then

Over the next few years my attention was repeatedly drawn back to the demimonde I’d glimpsed through Bitcoin. The exoticism of code culture and its haunting alien logic remained on my mind, but more compelling were suggestions of a causal link between it and a fast-deranging human environment. The flood of code had intensified, and while most of the software embodying this code did things we liked, a growing proportion allowed online terrorists to spread viruses; businesses to cheat regulation; criminals to coin new forms of theft; data-hoarding digital monopolies to grow like mold and replace settled industries with precarious, loss-making ones, leaving civil society febrile and unnerved. The electoral chaos of 2016 showed how little we understood the quasi-occult technical powers now vested in a few hands—and how the work of these unseen hands could combine erratically at scale. By 2018 and a near perpetual slew of scandals, it was clear society had a problem with the software being written to remake it. From certain angles, life could appear to be getting worse in eerie proportion to the amount of code streaming into it.

What was the problem, though? One possibility was also the most obvious: it was the cohort writing the code, overwhelmingly white and Asian men, happier in the company of machines than fellow humans, if you believed popular culture and TV shows like Silicon Valley, Mr. Robot and Russian Doll. The main characters in these fictions mostly conformed to a category of code savant called the “10Xer,” meaning someone with ten times the average productivity, who could make a computer do extraordinary things while struggling to connect with people—the implication being that these two traits were linked. Was the misfit nerd stereotype accurate? And how did it relate to the fact that while most professions had moved toward openness and diversity since the 1980s, code had charged determinedly in the opposite direction, reaching a nadir in 2015 when only 5 percent of programmers identified as women and fewer than 3 percent were Black . . . a low from which it has barely shifted? Coders were rebuilding our world for us without having posted a spec sheet: Could they be consciously or unconsciously recasting it in their own image, privileging their needs and assumptions above others? Would it be more surprising if they were, or weren’t? And did any of this connect to a Silicon Valley ethos springloaded to recklessness? It began to seem important to ask who these destroyers and rebuilders of worlds were, what they thought they were building and why.

Or were coders incidental to our code problem, just functionaries following orders within a debauched business milieu? It was hard not to notice that when UN investigators charged Facebook with abetting genocide in Myanmar in 2018—in addition to stoking fatal violence in India and Sri Lanka—the company’s share price remained buoyant. Division transpired to be remunerative. So, were killer code and its more subtly destructive variants predictable products of an irrational system? Would fixing the system do the same for its code? In this thought lay hope.

But between coders and the industry they served lay another possibility. I was beginning to hear murmurs that tech beasts like Facebook and Google no longer knew exactly how their algorithms worked or would play with others in the wild. More unnerving still, one of the clearest drifts in automating societies was toward polarity, antithesis as base state, in a way that invited comparison to the binary yes-no, true-false, zero-or-one code underneath. “Programmers like clearly defined boundaries between things,” one practitioner explained to me, and while I didn’t yet understand why this was, I was not alone in seeing a symmetric shift to rigidly defined boundaries between people and things in the social sphere—nor in considering it mystifying and destructive. The proliferation of software and bifurcation of society could be coincidence, I thought. But what if there was some hidden glitch in the way we compute that would inevitably discompose us? Which led to a pair of even more awkward questions. One, how would we know? And two, what would we, could we, do about it?

The software being written by a remote community of coders was reshaping society more dramatically than any technology since the steam engine. I was curious about how binary digits meshed with the world; how numbers became actions; whether this was the only way to compute. More urgent, however, was concern that ignorance of the digital domain’s workings compromised my—and others’—ability to examine its expanding role in our lives as the lines between technology, politics, bureaucracy and personal space pixelated away to nothing. Over time I came to believe that the only way to communicate on equal terms with the mavens encoding the parameters of my life was to follow them Pied Piper-like into the microcosmos they were creating. And the only way I could see to follow them was by learning to do what they did. Or at least trying. Was a 10X-shaped “coding mind,” conferred at birth, required to make sense of the microcosmos? I guessed I was about to find out.

In retrospect I should have had more qualms about entering the domain of code. But the qualm I did entertain was big. Programming was notorious as a difficult endeavor that most novices walked away from. Assuming I could learn (an assumption I was in no way entitled to make), would computing’s binary logic force me to become more binary? This was not a trivial concern. My life and work had been built on shades of gray and lateral motion of thought; on challenging “clearly defined boundaries” where I saw their semblance. Would immersion in code reprogram my cerebral operating system, narrowing its scope? In hopes of understanding any changes, I contacted a German team researching how the brain treats this weirdest new input, offering myself up to their study. To my delight the offer was accepted. If I was going to be digitized, I could at least do it consciously and in a way that shed light on the process.

Revenge of the SpaghettiOs

I had nothing to offer anybody except my own confusion.

—Jack Kerouac, On the Road

The first six months I spend around code are the most disorienting of my life. If you want to communicate with computers, you must learn one of the programming languages used to translate between them and us. A daunting challenge, you think. Until you encounter the hair-tearing torment of choosing a starter language in the first place, a task that by some trademark coder alchemy turns out to have no obviously right answer and yet hundreds of wrong ones. Credible guesses at the number of these human-machine argots, most written from some bewildering mix of altruism, curiosity and mischief, run from 1,700 to 9,000, with a precise figure no more calculable than the number of bacterial species in Earth’s soil.

I spend weeks trawling the web for guidance on where to start; consult every programmer I know or have connection to—find all either hedging or contradicting what the last source said. Online threads seem to spin away forever without resolution, often exploding into rancor for reasons I can’t discern, until I feel like a child watching my parents argue, wondering if what I’m hearing makes sense even to them. At first I hear just a litany of names linked to reputed attributes, demerits and specializations, none of which I am in a position to assess. With nothing to hold on to, I enroll in online bootcamps and courses more or less at random, hoping something will stick. But nothing does.

In this context, freeCodeCamp appears as if borne on a raft of light. Founded in 2014 by an idealistic schoolteacher-turned-programmer named Quincy Larson, fCC began with the explicit goal of broadening a shockingly shallow coder gene pool, then hitching some of the newly diversified cohort to nonprofit work. To this end he established a free, user-friendly, step-by-step online course, complete with a vast and growing international network of self-organizing local groups and meetups. Larson’s theory was that by rationalizing the learning process and establishing an entry-level “canon” of languages and software development tools to learn, guesswork could be reduced. My sense of gratitude would be hard to overstate.

The fCC curriculum will burgeon in coming years but for now revolves around a trio of languages at the heart of the web—HTML, CSS and JavaScript. Each member of this trinity works inside web browsers to render pages for human consumption. HTML (HyperText Markup Language) delivers content and structure, while CSS (Cascading Style Sheets) describes how HTML content will look. The optional JavaScript is used to manage dynamics, or how any active features might behave.

HTML turns out to be a thrill. Written at the dawn of the web by Sir Tim Berners-Lee in 1990, its syntax is the digital dancing uncle at the wedding—awkward, but seldom scary. Its building blocks, like headers, subheads, paragraphs and images, are called elements and demarcated by opening and closing tags composed of angle brackets and forward slashes. For example, a header is enclosed in the tags <h> for opening and </h> for closing. Header tags are also given a number from 1–6 to indicate their size, and anything else appearing inside the opening tag will be CSS code governing stylistic details of that element. Take a statement of the following kind:
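A representative version, assuming the stylistic details are supplied as an inline style attribute (the exact markup here is an illustration rather than a canonical form), might read:

<h1 style="color: yellow; font-family: Helvetica; font-size: 60px;">Do coders dream of numeric sheep?</h1>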

To a code-phobic eye this looks crazed, almost psychotic. But the words and symbols translate as “the element Header 1 (h1) will be rendered in yellow Helvetica, 60 pixels high and read Do coders dream of numeric sheep?” Frightening only if you happen to be afflicted with aesthetic sense.

It becomes apparent that the collegial open-source development paradigm—through which anyone can contribute ideas and code to a project—has a consequence: the languages I am learning betray an almost comic absence of consistency or syntactic grace and a baffling panoply of ways to do things. I also get my first sense of how literal and pedantic computers are at the programming level, with a stray comma being enough to crash a system. One veteran programmer tells me she spent six months hunting a bug in a big program, knowing others had tried and failed, eventually tracing the problem to a dash that should have been an underscore. A shift-devil, she called it. This is why coders crave clearly defined boundaries and there is no such thing as “coding outside the box,” she adds, because whatever occurs on the blind side of intention is a bug. To a coder, safety means straight lines and silos. And the exasperations don’t end there. Sometimes languages become extinct and yet remain in the bowels of large programs as “braindead” or “zombie” code. Modern programmers rappel down, see the old code is there and still doing work even if they don’t know how or why, then creep back to the surface before anything can break. No wonder coders can appear crotchety to civilians. My fast but hacky typing already looks like a vector for future pain.

But there is joy aplenty, too. Being able to manipulate a basic simulated web page in fCC’s code editor is exciting. I soon learn that by simply right-clicking any web page I can inspect and even change its underlying HTML and CSS to alter the appearance of the page. There’s naughty fun to be had going to Google’s homepage and changing “Google Search” to “Google Schmearch,” coloring it purple or rewriting the page in Cockney rhyming slang or jive. Most programmers describe a visceral rush the first time they were able to make a computer do their bidding: to my surprise I feel this too.1

Ability to change the appearance of a web page illustrates a deeper truth. It’s easy for anyone raised on TV to assume all screen images are transmitted in the way of that medium. We now see that this is not how a web page works. When a user visits a site, their browser reaches out to the site’s server and fetches nothing but raw code: there is no page as such. The browser then renders something that makes sense to a human, organizing all the page elements in real time according to instructions contained in the code, fetching specified fonts, images, video and data from wherever they might be on the web. These elements are held in place only for as long as they are being looked at, after which the “page” disintegrates. The real-world equivalent would be if every house on your street concealed a team of builders who raised it when you looked and took it down as you turned away. The question “What is a web page?” sits easily with “What is the sound of a tree falling in the forest if no one is there to hear it?” For the first time I understand why pages could take an hour to draw in the early years of the web. The instantaneity we expect now is what suddenly seems surreal.
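The point can be made concrete with a few lines of Python, which fetch a page much as a browser begins to and print the opening stretch of the raw code that comes back (the address is an arbitrary placeholder):

import urllib.request

# Ask the server for the page and receive nothing but raw markup, exactly as a browser would
with urllib.request.urlopen("https://example.com") as response:
    raw_code = response.read().decode("utf-8")

print(raw_code[:300])  # the first few hundred characters: bare tags, not a rendered page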

Another revelation is that all those dot-coms whose names consist of two or more capitalized words run together, like DoubleClick and AirBnB, are not just trying to be cute: this style is called Pascal case and is commonly used in programming environments, where spaces (“whitespace” to a coder) are not always neutral. Make the first word lowercase (à la freeCodeCamp) and you have camel case. These styles can also be referred to as UpperCamelCase and lowerCamelCase. An identifier consisting of two or more words separated by an underscore (like_this) can be referred to as snake case, while substituting underscores with dashes renders kebab case. If a web developer doesn’t yet know what written content will be used on part of a page, they use placeholder “lorem ipsum” text, drawn from a randomized portion of Cicero’s first-century BC philosophical treatise De finibus bonorum et malorum (“On the Ends of Good and Evil”)—which has been used by typesetters for this purpose since the sixteenth century. You quickly realize you’ve seen this “lorem ipsum” text before; it appears on a web page when its coder forgets to replace placeholder text with finished material.
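Those naming styles are easy to tell apart once laid out as Python assignments (the identifiers below are hypothetical, chosen only to show the patterns):

FreeCodeCamp = "Pascal case, also called UpperCamelCase"
freeCodeCamp = "camel case, also called lowerCamelCase"
free_code_camp = "snake case, the idiomatic style for Python variables and functions"
# free-code-camp would be kebab case: not a legal Python name, but common in CSS classes and URLs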

To begin with, my learning goes well, with evening freeCodeCamp sessions forming a routine I look forward to as I start to have—whisper it—fun. The degree to which code colonizes my mind knocks me sideways: soon I wake up dreaming of CSS and thinking about it when doing other things; “seeing” solutions to problems the way a pianist might visualize melodies on a keyboard. I start to notice my wife Jan watching me with concern and am not sure if this is for me or her. All the same, confidence is high by the time I reach the curriculum’s first solo project, to design and build a tribute page to someone I admire. “I think I’m going to be good at this,” I hear myself declaim to the code gods one night.

Oh yeah? comes the reply. So far the freeCodeCamp course has involved learning basic syntax and ways to perform specific operations, like sizing and centering images, choosing and deploying rows of buttons, embedding links, managing fonts and so on. I thought I understood the fundamentals, but now I look at the blank code editor and something truly bizarre happens: my head empties in a way I have never experienced before. Stumped how to react I look again and—by some yet-to-be-explained quantum effect, I presume—my head empties still more. With the training wheels off I abruptly grasp that I know nothing and spend two whole sessions paralyzed, wondering where to start, feeling as adrift as I have ever felt in my life. So panicked am I by the sudden crash that I resort to radical measures and make my first good choice.

Down the phone from Texas, Quincy Larson of freeCodeCamp does his best to set me at ease, chuckling as he assures me my experience is typical. He forwards a funny graph entitled “Programming Confidence vs. Competence,” which shows the two commodities moving in opposite directions at first, with an early peak in confidence followed by a precipitous fall and slow creep back as genuine competence is acquired.

“I think any able mind can do it,” he tells me in response to my fear that I don’t have the right kind of brain. “I view learning to code as primarily a motivational issue.”

Then he delivers a truth I had never considered, one that explains code’s power and difficulty; its uniqueness.

“The thing that gets lost, and which I think is important to know, is that programming is never easy,” he says. “You’re never doing the same thing twice, because code is infinitely reproducible, so if you’ve already solved a problem and you encounter it again, you just use your old solution. This means that by definition you’re kind of always on this frontier where you’re out of your depth. And one of the things you have to learn is to accept that feeling—of being constantly wrong and not knowing.”

Which sounds like it could be a Buddhist precept. I’m thunderstruck.

“Well, constantly being wrong and out of your depth is not something people are used to accepting. But programmers have to,” he concludes.

Relieved in the way of a rabbit who’s outrun a fox to be flattened by a car, I follow Larson and freeCodeCamp’s advice to join an extraordinary web community called Stack Overflow, the fortieth-most-visited site in the world at the time of writing, where perplexed programmers canvass help from a global community of peers. A forbidding hierarchy of rules attaches to what kind of questions may be asked, and how—Lord save the blushing newbie who submits one that’s been asked before—but most beginner issues have been encountered and dealt with already, folding into a giant database of solutions ranked for popularity and indexed in perpetuity, relieving the need to take pride in hand and frame something new. The question of why so many skilled coders donate time and energy to the community in this way, with no evident benefit to themselves as individuals, will grow to fascinate me. If coders are remaking the world, why doesn’t the world look more like Stack Overflow?

At length, with copious help, the roadblocks fall away and I finish my tribute page to the late Apollo moonwalker Dr. Edgar Mitchell. My first web page is small, basic and likely to provoke unintended nostalgia for 1995 in anyone who was on the web then, but having been through the wringer I feel jazzed to have made it. Including wrong turns and becalmed interludes of despair, the learning process took several months. “So programming is hard but not impossible,” I find myself telling people. Until one of those people drops a bombshell. Their programmer friend says HTML and CSS constitute coding but not programming. Trying to hide my irritation, I tell them they’re wrong—they must have misunderstood what their friend said . . . only for them to come back insisting “No, he says programming is algorithms and there are no algorithms in HTML or CSS, you’re just moving stuff around like fridge magnets—that’s why you found them relatively easy.”

My head swims, first because I didn’t find any of this easy, then with cruel knowledge that even after months of toil I haven’t understood programming well enough to understand how profoundly I don’t understand it. Having always considered myself reasonably capable, code now exposes me as probably no smarter than an economist. It’s humiliating. I probably focused on the languages I did because they’re not algorithmic. Algorithms are where code and the new world encoded by it start to get wacky. There’s no avoiding them any longer.

Algorithm: In the popular imagination this has come to mean a giant, almost supra-human entity that crunches lots of data to deliver things we like (news, music, friends), makes decisions about stuff we want (loans, education, jobs) or is out to get us like a monster from a 1950s B-movie. In reality all computer programs consist of algorithms, which are simply rules for treating data, defined by the word if. The simplest algorithm would consist of the statement “if a is true, then do b; if it’s false, do c.” If/then/else: the binary yes-no, true-false logic of computing, dictated by the eccentric workings of a microprocessor and projected outward from there. If the customer orders size fifteen shoes, then display the message “Out of stock, Sasquatch”; else ask for a color preference. How scary can this be?
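Written out in Python, the shoe-shop rule above might be sketched as follows (the function name is an invention for illustration; the logic is just the if/else described):

def handle_shoe_order(size_requested):
    # if the condition holds, do one thing; else do the other
    if size_requested == 15:
        return "Out of stock, Sasquatch"
    else:
        return "Which color would you like?"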

I’m in luck, I think. At first. The third pillar of the web development trinity, JavaScript (JS), was written in the mid-1990s with the specific aim of bringing algorithms into a web browser. It has since become ubiquitous. But as I angle into freeCodeCamp’s JS course my mind simply glances off it—a novel sensation for me; different from the one I had with HTML.

The first impediment to my progress is the profoundest. After a life in the fluid province of evolved biology, trying to “think” my way into the awkward sequential modus of the Machine—to feel its logic—is as uncomfortable as anything I’ve ever tried to do. The problem isn’t algorithms as such. We use algorithms all the time in everyday life. If Jan finishes work early enough, then we’ll go see a movie; else we’ll stay home and watch Better Call Saul. But the computer expects every thought and action to be broken down into a sequence of individual instructions in a way that feels unnatural and is ahuman, this being a realm where there is no common sense or intuition to fall back on and everything, literally, is literal. What does this mean? If I say to a dinner guest, “Can you pass the salt, please?” they are probably moving before the question is finished, with multiple processes happening synchronously and no need for further elucidation. Had my guest been possessed by the spirit of a computer, however, my instruction would have sounded more like this:

“Uma: would you please access an image of a salt shaker from memory; scan the table for something similar; henceforth identify that object as “salt_shaker”; calculate whether salt_shaker is within reach; if no, then wonder why I asked you rather than Beyoncé to pass it; else, calculate whether it is nearer your left or right hand; assess whether there is any cutlery in that hand; if yes, then lay it on your plate; move your hand in the direction of salt_shaker; stop the hand over it; lower hand; grasp shaker; etc. . . .”

All while praying no bug impels my guest to hurl the shaker through a window or try to eat it. The further we travel up the stack toward higher-level languages like JavaScript, the more of this pedantry gets hidden. But not enough for my liking. I’m not sure I can think like this, or even want to, and any suggestion that I could learn to enjoy this process seems far-fetched right now. I feel like I’m trapped inside a game of Tetris.

The second assault on my equilibrium sounds petty but is real. Like many programming languages, JavaScript’s syntax is influenced by the utilitarian C, with cluttered flights of brackets, braces and parentheses everywhere and each statement, however short, ending messily in a semicolon. They tell me JS is improving rapidly, but to my present way of seeing it looks like a child barfed a bowl of SpaghettiOs onto a screen. In my daily life I spend a lot of time thinking about syntax, trying to make it simple and clear, always in service to context. Put simply, I hate looking at JavaScript, which can’t augur well for learning to write it.

And so I’m dumped back into what I’ve come to think of as the “coding mind” question: whether code is and always will be the natural habitat of a certain kind of mind, or even brain. Given the dramatic gender, race and class imbalances in the profession, allied to the power coders hold to reshape our lives to their own specification, this thought makes me nervous. It also implies that my rash dream of penetrating code culture was doomed from the start. Craving clarity either way, I approach someone who has thought deeply about the trials of programming. Gerald Weinberg’s book The Psychology of Computer Programming was first published as long ago as 1971 but is still considered the subject’s foundational text. Via email I lay bare the disappointment I’ve felt; my fear that an elliptical, lateral-tending mind like mine simply can’t do this. The speed and certainty of his response take me aback.

“No, there are many ways to approach programming,” he assures me by email. “The important thing for a programmer is to study your own mind and habits, then improve them as needed.” He directs me to a blog post of a few days previously, in which he echoes Quincy Larson by elaborating:

A computer is like a mirror of your mind that brightly reflects all your poorest thinking. To become a better programmer, you have to look in that mirror with clear eyes and see what it’s telling you about yourself. Armed with that information about yourself, you can then select the most useful external things to work on. Those things will be different for you than for anyone else, because your shortcomings and strengths will be unique to you, so advice from others will often miss the mark.

Hence the lack of agreement over the best languages to learn. Hence Jun Wong of Hacker Dojo, the remarkable cooperatively run Silicon Valley startup incubator, confiding to me that while he can’t predict whether a person will be a good programmer from speaking to them, he can venture what kind of coder they will or would be. My poorest thinking? Self-evidently I leap into things before thinking them through. Curious as to how other learners manage their discomfort, I find a slew of blogs and vlogs that combine to suggest coding is like watching The Wire: even future pros can experience multiple false starts. And this no longer surprises me. Over the months my sense has become less of adapting my brain to code’s abnormal demands than of trying to build a new brain to run in parallel with the original . . . which sheds light on the problem but gets me no closer to finding a programming language that will solve it. At a loss what else to do, I revisit my early advisers and find the same patchwork advice as before. Until one day a coding colleague of a friend, a skilled C++ programmer and former musician, smiles sphinx-like on hearing my tale of woe. “I think I might know someone who can help you,” he says.

1 Anyone with access to a computer can try this now: go to Google’s home page, right-click it anywhere and select “Inspect” from the resulting pop-up menu: this will open an inspection bar at the bottom of the screen. From the menu at the top of the bar, click “Inspector” to see HTML code appear immediately underneath, while CSS will show in a sidebar to the far right. Now click the mouse icon immediately to the left of the “Inspect” button and hover your own mouse over any element on the Google page to see the code pertaining to it. Click on the element you want to mess with, go to the CSS and look for keywords like “color” or “font” or “height” and the values bound to them by a colon. Double-click the value (say, a blue color given as “#003eaa”) and type an alternative of your choice (maybe type the word red or gold or aquamarine). The color of that element will change. Better yet, text intended to be “printed,” meaning visible on the page, appears in the HTML, usually colored white and always skewered between the sharp ends of angle brackets (as per >Do coders dream of numeric sheep?< above). Double-click the text and rewrite it to your heart’s content. The superlative Norwegian-run code education site W3Schools offers lists of HTML elements and CSS properties to watch out for and play with.

Holy Grail

Nicholas Tollervey had been one of those kids who wonder how their computer games work; whether it would be possible to customize them and make the machine do other cool stuff with a bit of code, the answer for him being yes. He was also a musician and took that path into adulthood, studying tuba at the Royal College of Music before noticing there were more astronauts in the UK than pro tubists—possibly more conjoined twins and pandas. So, he went back to his other early passions and now has degrees in music, philosophy, computing and education. I will learn that for reasons no one seems able to explain, the proportion of practicing musicians among coders is far higher than in the general population. Even so, Tollervey is no one’s idea of the classic coder cowboy.

With me in the US and him in the UK, we meet on Google Hangouts. I learn that Tollervey has been a pro software developer for almost two decades; has worked on projects for organizations like The Guardian, Freedom of the Press Foundation, NHS England, the Council of Europe and the BBC; is fluent in a range of software tools and languages including C# and JavaScript. Most important to me, he is an emeritus fellow of the Python Software Foundation (PSF), the US-based nonprofit that oversees development of the Python programming language. A few experienced developers have tried to steer me toward Python, so I explain my predicament and ask if the language might have anything different to offer me. He pauses while weighing his words.

“Well, I don’t want to get into religious wars here,” he chuckles, “so I should say that I enjoy JavaScript too. But yes, there is a reason people are suggesting Python.”

He tells me the story of Python’s emergence. How in the closing days of the 1980s, a Dutchman named Guido van Rossum decided to write a programming language that would be clear, concise and as easy to learn as possible, in which simplicity was paramount and transparency to other coders had the imperative force of a covenant. He named it Python after the British TV show Monty Python’s Flying Circus.

Van Rossum was then working at the renowned Centrum Wiskunde & Informatica in Amsterdam, famous for having nurtured the legendary Edsger Dijkstra, software’s cross between Albert Einstein and Jedi Master Yoda, and so a good base to work from. All the same, brilliant computing minds had dreamt of such a language for decades without success, even if the breadth and invention of their attempts turns out to be one of the great untold stories of the last eighty years, existing at the nexus of math, linguistics, philosophy, psychology, engineering, literature and neuroscience, as spectacular in its way as our first forays into space.

The lack of a definitive programming language turns out to be for a reason. Translating between the digital microcosmos and our analog human minds involves a fiendish tradeoff: make your language quicker and easier for the machine to process and it grows proportionally more alien to humans. Make it accessible and intuitive to us, however, and you commit the machine to lots of extra work—thereby handicapping the central processing unit (CPU) and introducing new layers of complication; of code to go wrong. Computerists use the metaphor of “The Stack” to describe this relationship. Lowest in the stack is machine code, the stuff that happens on a silicon chip, where tiny electrical switches called Logic Gates create and process the binary numbers we use to represent and ultimately manipulate the world. Just above that is Assembly Code, as used by early programmers at NASA and elsewhere, still present in the world but rarely engaged directly. At the very top are beginner languages like the child-friendly pictorial Scratch, where most of the machine’s weirdness is hidden behind a colorful cartoon interface.

So it is that, in a typically precise inversion of everyday logic, coders employ the soubriquet “low-level” to describe the abstruse languages dwelling “close to the machine”—down by the silicon—and “high-level” to denote the more accessible ones further up the stack. JavaScript, Ruby and Perl are high-level, with lots of shortcuts and user-friendly guardrails designed to prevent serious errors. The more bracing C was considered high-level when it emerged from Bell Labs in the 1970s but became mid- to low-level as the stack expanded above it.

Python is a high-level interpreted language, meaning it consists of two parts: the human interface—the symbolic language we use to express our intentions, with its rules and conventions and syntax—and an interpreter that translates our code for the machine. But its humane high-level visage does not make low-level code redundant: rather, the Python interpreter takes our work, which means nothing whatsoever to a microprocessor, and initiates a process of relaying it down the stack and into machine language. On the way down it will be converted, re-converted and processed several times—a process so labyrinthine and racked with exceptions, caveats and forks that a full accounting of it is hard to glean. Not all languages are interpreted: a lower-level language like C is compiled, meaning that our human source code is translated directly into executable machine code by a compiler, making it harder to use but efficient and therefore fast.

Computerists refer to this mind-bending arrangement as abstraction, an unassuming idea little understood outside programming, whose secret depths and full implications take many months to penetrate; even then will be replaced by nothing more comforting than a schizoid mix of wonder and fear. For now, I experience vertigo when trying to remind myself why this whole digital edifice doesn’t just go poof! and vanish in a mist of flickering digits at any given instant.
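One modest way to peer a single rung down is to ask Python itself to display the bytecode its interpreter executes on our behalf; a minimal sketch (the function is a made-up example):

import dis

def add_one(x):
    return x + 1

# Show the intermediate bytecode the CPython interpreter runs for add_one:
# one layer beneath our source code, still many layers above the silicon.
dis.dis(add_one)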

I am not alone in being intimidated by the stack. On a trip to New York I drop in on the writer-coder Paul Ford, son of an experimental poet and author of a superb and improbably entertaining, novella-sized Bloomberg essay on code culture titled “What Is Code?” (sample: “Back in the 1980s, while the Fortran programmers were off optimizing nuclear weapon yields, Lisp programmers were trying to get a robot to pick up a teddy bear . . .”). In a conference room at his web design firm, Postlight, he grimaces when asked if even seasoned coders can hold the entire stack in their heads.

“That stuff is not comfortable, is it?” he offers in sympathy with my dislocation. “A mature programmer can go from very high to very low in the stack and explain how the pieces work. But that’s maturity: there are really good software developers who, once you get below the level of what the web browser is doing, have no idea.”

I already have an intimation that answers to the questions I’m asking are buried somewhere in the stack, mysterious as it is to me right now—and that, like a digital echo of Mr. Kurtz in Conrad’s Heart of Darkness, I’m going to have to find my way down there, to where almost no one goes.

No one could have foreseen Python’s future impact when Van Rossum set out to make it. As in showbiz—and to about the same degree—most new language entrants fall away and are forgotten. Written over a couple of years, the earliest Python was slow to catch on. Through the 1990s the coming high-level offerings were the chaotic JavaScript and prismatic Perl, and Van Rossum tells of being at a conference in San Francisco at the turn of the millennium, posting an invitation to a Python meetup and wondering if anyone would show, being relieved when 5 out of 12,000 attendees did. Nonetheless, a small community of believers formed and began to extend the language by building libraries of open source code prewritten to accomplish specific tasks, available to all users as importable modules or packages.
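Leaning on such a library is a one-line affair once it is installed; a small sketch, assuming the third-party package requests has been added with pip, alongside the standard library’s math module:

import math        # ships with Python itself
import requests    # a community-written package, assumed installed via pip

print(math.sqrt(2))                                         # standard-library module at work
print(requests.get("https://www.python.org").status_code)   # open-source package at work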

Only as the new century accelerated in tandem with the microprocessors driving it did the sagacity of the Dutchman’s philosophic choices start to show. As software became the world, programs started to grow to a point where collaboration was unavoidable and transparency stopped being a luxury. Programmers would spend less time trying to decipher each other’s code if the language being used was refined and unambiguous. If the lone wolf curmudgeon coder still exists, he is vestigial, Tollervey tells me—his days are numbered.

Had Van Rossum foreseen this shift? Or was he simply trying to make a language he wanted to use; build a community he would like to join? Either way, Python is not encouraging of showboaters: the primacy of communication, collaboration, community is etched into its DNA. Where JavaScript prides itself on a plurality of ways to approach a given problem (and is often likened to a coder Wild West by non-adherents), Python sets an ideal of there being one obvious way to do most things. “Pythonistas” see this attenuated freedom as a small price to pay for the sake of lucidity and accord. One prominent clan member will tell me that “To describe something as ‘clever’ is not considered a compliment in the Python culture.” Making it easy to see how different personality types might be drawn to these very different languages, which appear to embody and express two distinct, and in many ways opposing, worldviews.

“Don’t let anyone tell you code is not political,” Tollervey concludes. “It contains all kinds of assumptions and ways of seeing, and the things we use it to make reflect these. You’ll sometimes hear inexperienced developers boast about how many lines of code they’ve written. Well, a more experienced professional will brag about how many lines of code they’ve removed. An expert is someone who’s able to say, ‘This code doesn’t need to be written—here’s a solution where we don’t need it.’ ”

Pythonistas admit that while their language isn’t best at much, it’s second best at almost everything. Improvements are floated via Python Enhancement Proposals, or PEPs, which anyone can suggest for the community to debate and action or not according to consensus. For most of the language’s life Van Rossum, as officially anointed Benevolent Dictator for Life (BDFL), broke any deadlocks with a casting vote. A seminal PEP, PEP8, the Python Style Guide, was written by him in 2001. Another, PEP20, The Zen of Python, was contributed by an early enthusiast named Tim Peters and consists of nineteen declarations of principle. Zen is built into the language itself, ready to be called like a priest in dark moments. It begins:

Beautiful is better than ugly.

Explicit is better than implicit.

Simple is better than complex.

Complex is better than complicated.

Further down is a playful nod to Van Rossum:

There should be one—and preferably only one—obvious way to do it.

Although that way may not be obvious at first unless you’re Dutch.

And my own favorite:

Now is better than never.

Although never is often better than *right* now.

A sentiment starkly at odds with the “move fast and break things” dogma coders at dysfunctional Big Tech firms have used to blow up so many functional predecessors, most especially those inclined to move slowly and fix things.
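True to the claim that the Zen is baked into the language itself, all nineteen aphorisms can be summoned from any Python prompt with a single line:

import this  # prints the complete Zen of Python (PEP 20) to the console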

Tollervey laughs at my surprise in finding these thoughts embedded in a programming language, but by the end of our conversation I am sold. We discuss potential learning projects and I mention the idea of building a mini app to scrape Twitter for mentions of an author’s books—my variation on a common beginner’s project. He agrees this is doable and offers help if I need it, and for the first time in weeks I feel a glimmer of optimism. He also floors me with a suggestion: if I really want to see code in action and get a sense of who coders are and what they do, I should consider enrolling in the annual PyCon jamboree, which will host 4,000 Pythonistas, including him, in Cleveland. And before I can sweat the ramifications of such a tumble into the unknown, I hear a voice, belatedly identifiable as mine, agreeing.

PyLadies and Code Freaks

Five a.m. The rail hub at Hopkins International Airport. Tired. Staring. How is it that while trains work the same everywhere, rail operators’ ticket machines all find fresh ways to mess with your mind? A young man next to me stands frowning at the card the machine just spat back for the third time. It turns out he’s here for his first PyCon, too. United by tiredness and irritation, we catch the train into Cleveland together.

Alex is from Seattle, a rank-and-file coder with one of the field’s several classic origin stories. When a post-high-school life in fast food proved all it wasn’t cracked up to be, he decided to try programming, and for several years drifted through languages including Ruby, JavaScript, Java and PHP without gaining much purchase, until he doubted his coding Right Stuff. Then he tried Python and liked it. These days he tends websites with the Python web framework Django.

As our train rattles through the outskirts of Cleveland toward a soft buttercup sun, this sounds like a story with a happy ending. But no, Alex says. The company he works for develops “AI” for use in fracking, an activity he doesn’t approve of, and alternative jobs at his level are hard to land in tech-heavy Seattle. Disillusioned, he is thinking of giving up, with PyCon a last roll of the job-seeking dice. We pull into Tower City Station and exchange numbers, with a promise to catch up at some point during the conference. I roll away hoping he doesn’t give up, because at the very least, he cares.

My tiredness reflects a significant life change. Cleveland turns out to be one of the more awkward journeys to make from the San Francisco Bay Area, to which Jan and I moved in the months prior to PyCon. For the next few years I will be where a large portion of the world’s code is written, including much that is most contentious. With the cities of San Francisco and Oakland too pricey to contemplate we crossed the Golden Gate into the relatively rural county of Marin, to find an offbeat perch in the Bay Area’s last remaining hippie enclave, a place where war stories from Woodstock and Monterey Pop abound and the only recognized crimes appear to be harshing mellows without a license and gluten smuggling. I quickly rue the way all music made after 1974 seems to ionize at the county border. And that, given the proximity of Oakland, the population is anomalously white, with half the townsfolk desperate to see this change and the other (often older) half to varying degrees not: twenty-first century America in tie-dye microcosm, and not uninteresting for that.

Another plus to Marin is being at the opposite end of the majestic San Francisco Bay from Silicon Valley, meaning there is almost no tech penetration—hence the lower prices—so when I need respite from functions and variables I only need step outside to hug a redwood or join an ecstatic dance group. At least that’s what I think until I meet my immediate neighbor, Sagar, a programming sensei of around my own age at a major Valley firm, who on initial meeting seems to fit the classic nerdy coder stereotype to an almost comic degree, but soon proves lightyears from it—a dynamic I will grow accustomed to and never stop being amused by.

The word windswept could have been coined for downtown Cleveland. As in other late capitals of mechanical industry, city burghers spared no expense in celebrating themselves with wide roads and long blocks and decorative open squares; with a legacy of grand civic buildings no one seems to enter or exit—relics of a time when power was projected outward rather than concealed in streams of numbers. For someone with no personal connection to Ohio I’ve spent a spooky amount of time in it, yet this is my first visit to post-industrialization’s complicated Rust Belt exemplar. It’s true what they say about Midwesterners being open and friendly by default, and I like the place instantly, even while wondering if Silicon Valley will be regarded similarly one day, as history implies it must, and what such a turn might look like when it comes.

Any such moment of reckoning feels infinitely remote inside the vast, subterranean Huntington Convention Center, which hosts Thursday night’s PyCon reception party. If I’d expected the beard-stroking formality of most professional conferences, what I find more resembles a first day back at Hogwarts. I text Nicholas Tollervey and move to the cavernous main exhibition hall, where Pythonistas hug and exchange warm greetings while they drink and eat and buzz excitedly around sponsors’ stands. To the fore I see the capacious berths of tech behemoths like Google and Facebook; behind me are a hive of startups whose tech-generic two-syllable names—Hosho!, Zulip!, Tivix!, Nexmo!—perennially suggest the work of either randomness or a committee of overtired toddlers. All, large and small, declare love for Python and an eagerness to recruit. At first I am flabbergasted to find recruitment pitches disproportionately directed at me, before realizing my age implies experience and/or influence. By the time I decide enjoying this unearned status may be a soupçon ethically louche, it’s too late: I swagger off to meet Tollervey as the proud if confused owner of three job offers. Oh to be a programmer in the twenty-first century.