'Boldly reactionary . . . What looks like feast, Carr argues, may be closer to famine' Sunday Times

'Chilling' The Economist

In this ground-breaking and compelling book, Nicholas Carr argues that not since Gutenberg invented printing has humanity been exposed to such a mind-altering technology. The Shallows draws on the latest research to show that the Net is literally re-wiring our brains, inducing only superficial understanding. As a consequence there are profound changes in the way we live and communicate, remember and socialise - even in our very conception of ourselves. By moving from the depths of thought to the shallows of distraction, the web, it seems, is actually fostering ignorance. The Shallows is not a manifesto for luddites, nor does it seek to turn back the clock. Rather it is a revelatory reminder of how far the Internet has become enmeshed in our daily existence and is affecting the way we think. This landmark book compels us all to look anew at our dependence on this all-pervasive technology. This 10th-anniversary edition includes a new afterword that brings the story up to date, with a deep examination of the cognitive and behavioural effects of smartphones and social media.
Publication year: 2010
“The thesis of The Shallows is simple and persuasive. Brain scientists have known for decades about the phenomenon of neuroplasticity: in simple terms, that the things we do have a physical effect on the structure of our brains. . . . Carr’s scope in this unceasingly interesting book is wider than just the synapse and the transistor. . . . What looks like feast, Carr argues, may be closer to famine.”
—Sam Leith, Sunday Times
“The most readable overview of the science and history of human cognition to date. . . . Carr draws some chilling inferences.”
—The Economist
“To his great credit, Carr is as even-handed as possible. He consistently emphasizes the fact that screen technologies are neither evil nor miraculous in their effects on the human mind: rather, for every talent lost or diminished, another will be gained or enhanced. What is certain, however, is that our minds will change. . . . The Shallows is a worthy illustration that books do, indeed, enable deep reflection.”
—Susan Greenfield, Literary Review
“An elegantly written cry of anguish. . . . Hair-raising.”
—John Harris, Guardian
“Carr straddles the book-dominated and web-dominated worlds and is at home in both. . . . Mild-mannered, never polemical, with nothing of the Luddite about him, Carr makes his points with wide-ranging erudition.”
—Christopher Caldwell, Financial Times
“A thought-provoking exploration of the Internet’s physical and cultural consequences, rendering highly technical material intelligible to the general reader.”
—2011 Pulitzer Prize Committee
“In his new book, The Shallows, Nicholas Carr has written a Silent Spring for the literary mind.”
—Michael Agger, Slate
“We are living through something of a backlash against the frenzy of attention dispersion, a backlash for which Carr’s book will become canonical.”
—Todd Gitlin, New Republic
“Carr is a great writer. . . . This is a must-read for any desk jockey concerned about the Web’s deleterious effects on the mind. Grade: A.”
—Newsweek
“Essential reading for our internet age.”
—New York Times Book Review
“This is a book to shake up the world."
—Ann Patchett, author of Bel Canto and The Dutch House
“Absorbing [and] disturbing. We all joke about how the Internet is turning us, and especially our kids, into fast-twitch airheads incapable of profound cogitation. It’s no joke, Mr. Carr insists, and he has me persuaded.”
—John Horgan, Wall Street Journal
“I have not only given this book to numerous friends, I actually changed my life in response to it.”
—Jonathan Safran Foer, author of Extremely Loud and Incredibly Close and Eating Animals
“Carr provides a deep, enlightening examination of how the Internet influences the brain and its neural pathways. . . . His fantastic investigation of the effect of the Internet on our neurological selves concludes with a very humanistic petition for balancing our human and computer interactions. . . . Highly recommended.”
—Library Journal, starred review
“Carr’s fresh, lucid, and engaging assessment of our infatuation with the Web is provocative and revelatory.”
—Booklist
“Carr wants us to think deeply about the effects of this new technology on our cultures, our brains, our social lives and our ways of thinking about knowledge. With masterful ease and winning style, he lays out ideas that will encourage readers to do just that. . . . The Shallows is a book everyone should read.”
—Anna Lena Phillips, American Scientist
“Required reading for anyone who wants a cogent, comprehensive, and thoroughly researched statement of the techno-fears that, in however inchoate a way, many of us have harbored for going on a few decades now.”
—Daniel Menaker, Barnes & Noble Review
“Nicholas Carr has written a deep book about shallow thinking.”
—Daniel J. Flynn, American Spectator
“If you care about your own ability to think and read deeply, please treat yourself to Carr’s book.”
—Carol Keeley, Ploughshares
“Nicholas Carr’s The Shallows is a deeply thoughtful, surprising exploration of our ‘frenzied’ psyches in the age of the Internet. Whether you do it in pixels or pages, read this book.”
—Tom Vanderbilt, author of Traffic
“Witty, ambitious, and immensely readable, The Shallows actually manages to describe the weird, new, artificial world in which we now live.”
—Dana Gioia, poet and former chairman of the National Endowment for the Arts
“Measured but alarming. . . . Carr brilliantly brings together numerous studies and experiments to support this astounding argument: ‘With the exception of alphabets and number systems, the Net may well be the single most powerful mind-altering technology that has ever come into general use.’”
—Will Buchanan, Christian Science Monitor
“Cogent, urgent and well worth reading.”
—Kirkus Reviews
“The Shallows certainly isn’t the first examination of this subject, but it’s more lucid, concise and pertinent than similar works. . . . An essential, accessible dispatch about how we think now.”
—Laura Miller, Salon
“The picture of our intellectual future, rendered thoroughly, convincingly, and often beautifully in Carr’s text, is bleak enough to give any serious mind some serious pause.”
—Patrick Tucker, The Futurist
“If you retain any residual aspirations for literary repartee, prefer the smell of a book to a mouse and, most important, enjoy the quiet meanderings within your own mind that can be triggered by a good bit of prose, you are the person to whom Nicholas Carr has addressed his riveting new book, The Shallows.”
—Robert Burton, San Francisco Chronicle
“The author of ‘Is Google Making Us Stupid?’ returns to his thesis at book-length—but can our web-truncated attention spans handle so much prose? With Carr at the wheel, the answer is a resounding ‘yes.’ . . . The Shallows is a guide for understanding—and even regaining control of—your brain on the internet.”
—Seed
“Carr is an excellent writer. One of those nonfiction writers in the league with people like Malcolm Gladwell and Dan Ariely who can teach and entertain at the same time.”
—Jim Randel, Huffington Post
“Outstanding . . . a shrewd, compelling overview of how an ever-changing, always growing technology has changed us.”
—BookPage
“Persuasive. . . . [Carr] cites enough academic research in The Shallows to give anyone pause about society’s full embrace of the Internet as an unadulterated force for progress.”
—Peter Burrows, BusinessWeek
“Another reason for book lovers not to throw in the towel quite yet is The Shallows, by Nicholas Carr, a quietly eloquent retort to those who claim that digital culture is harmless—who claim, in fact, that we’re getting smarter by the minute just because we can plug in a computer and allow ourselves to get lost in the funhouse of endless hyperlinks.”
—Julia Keller, Chicago Tribune
“[Carr] puts his finger on the dark irony of the tech age: In the search for unlimited information and connectivity, we have also provided ourselves with an infinite scope for distraction.”
—Leah McLaren, Globe and Mail
“Nicholas Carr does a wonderful job synthesizing the recent cognitive research. In doing so, he gently refutes the ideologists of progress and shows what is really at stake in the daily habits of our wired lives: the reconstitution of our minds. What emerges for the reader, inexorably, is the suspicion that we have well and truly screwed ourselves.”
—Matthew B. Crawford, author of Shop Class as Soulcraft
“A thought-provoking and intellectually courageous account of how the medium of the Internet is changing the way we think now and how future generations will or will not think. Few works could be more important.”
—Maryanne Wolf, director of the Tufts University Center for Reading and Language Research and author of Proust and the Squid
“Not long ago, Thomas Friedman declared our new electronic world, with its leveled competitive field, ‘flat.’ Now, Nicholas Carr, marshaling scientific evidence, looks at the inner world of our brains, finding the impact of new technology there much the same: flattened memories, imaginations, and thinking capacities. We need this book in a deep way.”
—Eric Brende, author of Better Off
“Nicholas Carr has written an important and timely book. See if you can stay off the Web long enough to read it!”
—Elizabeth Kolbert, author of The Sixth Extinction
ALSO BY NICHOLAS CARR
Utopia Is Creepy
The Glass Cage: How Our Computers Are Changing Us
The Big Switch: Rewiring the World, from Edison to Google
Does IT Matter?
Published by arrangement with W. W. Norton and Company, Inc., New York.
First published in hardback in Great Britain in 2010 by Atlantic Books, an imprint of Atlantic Books Ltd.
This updated paperback edition first published in Great Britain in 2020 by Atlantic Books, an imprint of Atlantic Books Ltd.
Copyright © Nicholas Carr, 2010, 2011, 2020
The moral right of Nicholas Carr to be identified as the author of this work has been asserted in accordance with the Copyright, Designs and Patents Act of 1988.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of both the copyright owner and the above publisher of this book.
Every effort has been made to trace or contact all copyright-holders. The publishers will be pleased to make good any omissions or rectify any mistakes brought to their attention at the earliest opportunity.
“The writing ball is a thing like me…”, from Gramophone, Film, Typewriter by Friedrich A. Kittler, translated by Geoffrey Winthrop-Young and Michael Wutz. Copyright © 1996 by the Board of Trustees of the Leland Stanford Jr. University for translation; © 1986 by Brinkman and Bose. All rights reserved. Used with the permission of Stanford University Press, www.sup.org.
The extract taken from ‘The House Was Quiet and the World Was Calm’, The Collected Poems of Wallace Stevens © The Estate of Wallace Stevens and reprinted by permission of Faber and Faber Ltd.
1 3 5 7 9 10 8 6 4 2
A CIP catalogue record for this book is available from the British Library.
Paperback ISBN: 978-1-83895-258-7
E-book ISBN: 978-1-84887-883-9
Printed in Great Britain
Atlantic Books
An imprint of Atlantic Books Ltd
Ormond House
26–27 Boswell Street
London
WC1N 3JZ
www.atlantic-books.co.uk
to my mother
and in memory of my father
Introduction to the Second Edition
Prologue  THE WATCHDOG AND THE THIEF
One  HAL AND ME
Two  THE VITAL PATHS
a digression  on what the brain thinks about when it thinks about itself
Three  TOOLS OF THE MIND
Four  THE DEEPENING PAGE
a digression  on lee de forest and his amazing audion
Five  A MEDIUM OF THE MOST GENERAL NATURE
Six  THE VERY IMAGE OF A BOOK
Seven  THE JUGGLER’S BRAIN
a digression  on the buoyancy of IQ scores
Eight  THE CHURCH OF GOOGLE
Nine  SEARCH, MEMORY
a digression  on the writing of this book
Ten  A THING LIKE ME
Epilogue  HUMAN ELEMENTS
Afterword to the Second Edition  THE MOST INTERESTING THING IN THE WORLD
Notes
Further Reading
Acknowledgments
Index
Welcome to The Shallows. When I wrote this book ten years ago, the prevailing view of the Internet was sunny, often ecstatically so. We reveled in the seemingly infinite bounties of the online world. We admired the wizards of Silicon Valley and trusted them to act in our best interest. We took it on faith that computer hardware and software would make our lives better, our minds sharper. In a 2010 Pew Research survey of some 400 prominent thinkers, more than eighty percent agreed that “by 2020, people’s use of the Internet [will have] enhanced human intelligence; as people are allowed unprecedented access to more information, they become smarter and make better choices.”1
The year 2020 has arrived. We’re not smarter. We’re not making better choices.
The Shallows explains why we were mistaken about the Net. When it comes to the quality of our thoughts and judgments, the amount of information a communication medium supplies is less important than the way the medium presents the information and the way, in turn, our minds take it in. The brain’s capacity is not unlimited. The passageway from perception to understanding is narrow. It takes patience and concentration to evaluate new information—to gauge its accuracy, to weigh its relevance and worth, to put it into context—and the Internet, by design, subverts patience and concentration. When the brain is overloaded by stimuli, as it usually is when we’re peering into a network-connected computer screen, attention splinters, thinking becomes superficial, and memory suffers. We become less reflective and more impulsive. Far from enhancing human intelligence, I argue, the Internet degrades it.
Much has changed in the decade since The Shallows came out. Smartphones have become our constant companions. Social media has insinuated itself into everything we do. The dark things that can happen when everyone’s connected have happened. Our faith in Silicon Valley has been broken, yet the big Internet companies wield more power than ever. This tenth-anniversary edition of The Shallows takes stock of the changes. It includes an extensive new afterword in which I examine the cognitive and cultural consequences of the rise of smartphones and social media, drawing on the large body of new research that has appeared since 2010. I have left the original text of the book largely unchanged. I’m biased, but I think The Shallows has aged well. To my eyes, it’s more relevant today than it was ten years ago. I hope you find it worthy of your attention.
—NICHOLAS CARR, MASSACHUSETTS, 2020
And in the midst of this wide quietness
A rosy sanctuary will I dress
With the wreath’d trellis of a working brain . . .
— JOHN KEATS, “Ode to Psyche”
In 1964, just as the Beatles were launching their invasion of America’s airwaves, Marshall McLuhan published Understanding Media: The Extensions of Man and transformed himself from an obscure academic into a star. Oracular, gnomic, and mind-bending, the book was a perfect product of the sixties, that now-distant decade of acid trips and moon shots, inner and outer voyaging. Understanding Media was at heart a prophecy, and what it prophesied was the dissolution of the linear mind. McLuhan declared that the “electric media” of the twentieth century—telephone, radio, movies, television—were breaking the tyranny of text over our thoughts and senses. Our isolated, fragmented selves, locked for centuries in the private reading of printed pages, were becoming whole again, merging into the global equivalent of a tribal village. We were approaching “the technological simulation of consciousness, when the creative process of knowing will be collectively and corporately extended to the whole of human society.”1
Even at the crest of its fame, Understanding Media was a book more talked about than read. Today it has become a cultural relic, consigned to media studies courses in universities. But McLuhan, as much a showman as a scholar, was a master at turning phrases, and one of them, sprung from the pages of the book, lives on as a popular saying: “The medium is the message.” What’s been forgotten in our repetition of this enigmatic aphorism is that McLuhan was not just acknowledging, and celebrating, the transformative power of new communication technologies. He was also sounding a warning about the threat the power poses—and the risk of being oblivious to that threat. “The electric technology is within the gates,” he wrote, “and we are numb, deaf, blind and mute about its encounter with the Gutenberg technology, on and through which the American way of life was formed.”2
McLuhan understood that whenever a new medium comes along, people naturally get caught up in the information—the “content”—it carries. They care about the news in the newspaper, the music on the radio, the shows on the TV, the words spoken by the person on the far end of the phone line. The technology of the medium, however astonishing it may be, disappears behind whatever flows through it—facts, entertainment, instruction, conversation. When people start debating (as they always do) whether the medium’s effects are good or bad, it’s the content they wrestle over. Enthusiasts celebrate it; skeptics decry it. The terms of the argument have been pretty much the same for every new informational medium, going back at least to the books that came off Gutenberg’s press. Enthusiasts, with good reason, praise the torrent of new content that the technology uncorks, seeing it as signaling a “democratization” of culture. Skeptics, with equally good reason, condemn the crassness of the content, viewing it as signaling a “dumbing down” of culture. One side’s abundant Eden is the other’s vast wasteland.
The Internet is the latest medium to spur this debate. The clash between Net enthusiasts and Net skeptics, carried out over the last two decades through dozens of books and articles and thousands of blog posts, video clips, and podcasts, has become as polarized as ever, with the former heralding a new golden age of access and participation and the latter bemoaning a new dark age of mediocrity and narcissism. The debate has been important—content does matter—but because it hinges on personal ideology and taste, it has gone down a cul-de-sac. The views have become extreme, the attacks personal. “Luddite!” sneers the enthusiast. “Philistine!” scoffs the skeptic. “Cassandra!” “Pollyanna!”
What both enthusiast and skeptic miss is what McLuhan saw: that in the long run a medium’s content matters less than the medium itself in influencing how we think and act. As our window onto the world, and onto ourselves, a popular medium molds what we see and how we see it—and eventually, if we use it enough, it changes who we are, as individuals and as a society. “The effects of technology do not occur at the level of opinions or concepts,” wrote McLuhan. Rather, they alter “patterns of perception steadily and without any resistance.”3 The showman exaggerates to make his point, but the point stands. Media work their magic, or their mischief, on the nervous system itself.
Our focus on a medium’s content can blind us to these deep effects. We’re too busy being dazzled or disturbed by the programming to notice what’s going on inside our heads. In the end, we come to pretend that the technology itself doesn’t matter. It’s how we use it that matters, we tell ourselves. The implication, comforting in its hubris, is that we’re in control. The technology is just a tool, inert until we pick it up and inert again once we set it aside.
McLuhan quoted a self-serving pronouncement by David Sarnoff, the media mogul who pioneered radio at RCA and television at NBC. In a speech at the University of Notre Dame in 1955, Sarnoff dismissed criticism of the mass media on which he had built his empire and his fortune. He turned the blame for any ill effects away from the technologies and onto the listeners and viewers: “We are too prone to make technological instruments the scapegoats for the sins of those who wield them. The products of modern science are not in themselves good or bad; it is the way they are used that determines their value.” McLuhan scoffed at the idea, chiding Sarnoff for speaking with “the voice of the current somnambulism.”4 Every new medium, McLuhan understood, changes us. “Our conventional response to all media, namely that it is how they are used that counts, is the numb stance of the technological idiot,” he wrote. The content of a medium is just “the juicy piece of meat carried by the burglar to distract the watchdog of the mind.”5
Not even McLuhan could have foreseen the feast that the Internet has laid before us: one course after another, each juicier than the last, with hardly a moment to catch our breath between bites. As networked computers have shrunk to the size of iPhones and Androids, the feast has become a movable one, available anytime, anywhere. It’s in our home, our office, our car, our classroom, our purse, our pocket. Even people who are wary of the Net’s ever-expanding influence rarely allow their concerns to get in the way of their use and enjoyment of the technology. The movie critic David Thomson once observed that “doubts can be rendered feeble in the face of the certainty of the medium.”6 He was talking about the cinema and how it projects its sensations and sensibilities not only onto the movie screen but onto us, the engrossed and compliant audience. His comment applies with even greater force to the Net. The computer screen bulldozes our doubts with its bounties and conveniences. It is so much our servant that it would seem churlish to notice that it is also our master.
“Dave, stop. Stop, will you? Stop, Dave. Will you stop?” So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in a famous and weirdly poignant scene toward the end of Stanley Kubrick’s 2001: A Space Odyssey. Bowman, having nearly been sent to a deep-space death by the malfunctioning machine, is calmly, coldly disconnecting the memory circuits that control its artificial brain. “Dave, my mind is going,” HAL says, forlornly. “I can feel it. I can feel it.”
I can feel it too. Over the last few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I feel it most strongly when I’m reading. I used to find it easy to immerse myself in a book or a lengthy article. My mind would get caught up in the twists of the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration starts to drift after a page or two. I get fidgety, lose the thread, begin looking for something else to do. I feel like I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
I think I know what’s going on. For well over a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web’s been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or the pithy quote I was after. I couldn’t begin to tally the hours or the gallons of gasoline the Net has saved me. I do most of my banking and a lot of my shopping online. I use my browser to pay my bills, schedule my appointments, book flights and hotel rooms, renew my driver’s license, send invitations and greeting cards. Even when I’m not working, I’m as likely as not to be foraging in the Web’s data thickets—reading and writing e-mails, scanning headlines and blog posts, following Facebook updates, watching video streams, downloading music, or just tripping lightly from link to link to link.
The Net has become my all-purpose medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich and easily searched store of data are many, and they’ve been widely described and duly applauded. “Google,” says Heather Pringle, a writer with Archaeology magazine, “is an astonishing boon to humanity, gathering up and concentrating information and ideas that were once scattered so broadly around the world that hardly anyone could profit from them.”1 Observes Wired’s Clive Thompson, “The perfect recall of silicon memory can be an enormous boon to thinking.”2
The boons are real. But they come at a price. As McLuhan suggested, media aren’t just channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. Whether I’m online or not, my mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
Maybe I’m an aberration, an outlier. But it doesn’t seem that way. When I mention my troubles with reading to friends, many say they’re suffering from similar afflictions. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some worry they’re becoming chronic scatterbrains. Several of the bloggers I follow have also mentioned the phenomenon. Scott Karp, who used to work for a magazine and now writes a blog about online media, confesses that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he writes. “What happened?” He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?”3
Bruce Friedman, who blogs about the use of computers in medicine, has also described how the Internet is altering his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he says.4 A pathologist on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online. “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it.”
Philip Davis, a doctoral student in communication at Cornell who contributes to the Society for Scholarly Publishing’s blog, recalls a time back in the 1990s when he showed a friend how to use a Web browser. He says he was “astonished” and “even irritated” when the woman paused to read the text on the sites she stumbled upon. “You’re not supposed to read web pages, just click on the hypertexted words!” he scolded her. Now, Davis writes, “I read a lot—or at least I should be reading a lot—only I don’t. I skim. I scroll. I have very little patience for long, drawn-out, nuanced arguments, even though I accuse others of painting the world too simply.”5
Karp, Friedman, and Davis—all well-educated men with a keenness for writing—seem fairly sanguine about the decay of their faculties for reading and concentrating. All things considered, they say, the benefits they get from using the Net—quick access to loads of information, potent searching and filtering tools, an easy way to share their opinions with a small but interested audience—make up for the loss of their ability to sit still and turn the pages of a book or a magazine. Friedman told me, in an e-mail, that he’s “never been more creative” than he has been recently, and he attributes that “to my blog and the ability to review/scan ‘tons’ of information on the web.” Karp has come to believe that reading lots of short, linked snippets online is a more efficient way to expand his mind than reading “250-page books,” though, he says, “we can’t yet recognize the superiority of this networked thinking process because we’re measuring it against our old linear thought process.”6 Muses Davis, “The Internet may have made me a less patient reader, but I think that in many ways, it has made me smarter. More connections to documents, artifacts, and people means more external influences on my thinking and thus on my writing.”7 All three know they’ve sacrificed something important, but they wouldn’t go back to the way things used to be.
For some people, the very idea of reading a book has come to seem old-fashioned, maybe even a little silly—like sewing your own shirts or butchering your own meat. “I don’t read books,” says Joe O’Shea, a former president of the student body at Florida State University and a 2008 recipient of a Rhodes Scholarship. “I go to Google, and I can absorb relevant information quickly.” O’Shea, a philosophy major, doesn’t see any reason to plow through chapters of text when it takes but a minute or two to cherry-pick the pertinent passages using Google Book Search. “Sitting down and going through a book from cover to cover doesn’t make sense,” he says. “It’s not a good use of my time, as I can get all the information I need faster through the Web.” As soon as you learn to be “a skilled hunter” online, he argues, books become superfluous.8
O’Shea seems more the rule than the exception. In 2008, a research and consulting outfit called nGenera released a study of the effects of Internet use on the young. The company interviewed some six thousand members of what it calls “Generation Net”—kids who have grown up using the Web. “Digital immersion,” wrote the lead researcher, “has even affected the way they absorb information. They don’t necessarily read a page from left to right and from top to bottom. They might instead skip around, scanning for pertinent information of interest.”9 In a talk at a recent Phi Beta Kappa meeting, Duke University professor Katherine Hayles confessed, “I can’t get my students to read whole books anymore.”10 Hayles teaches English; the students she’s talking about are students of literature.
People use the Internet in all sorts of ways. Some are eager, even compulsive adopters of the latest technologies. They keep accounts with a dozen or more online services and subscribe to scores of information feeds. They post and they comment, they text and they tweet. Others don’t much care about being on the cutting edge but nevertheless find themselves online most of the time, tapping away at their desktop, their laptop, or their phone. The Net has become essential to their work, school, or social lives, and often to all three. Still others log on only a few times a day—to check their e-mail, follow a story in the news, research a topic of interest, or do some shopping. And there are, of course, many people who don’t use the Internet at all, either because they can’t afford to or because they don’t want to. What’s clear, though, is that for society as a whole the Net has become, in just the twenty years since the software programmer Tim Berners-Lee wrote the code for the World Wide Web, the communication and information medium of choice. The scope of its use is unprecedented, even by the standards of the mass media of the twentieth century. The scope of its influence is equally broad. By choice or necessity, we’ve embraced the Net’s uniquely rapid-fire mode of collecting and dispensing information.
We seem to have arrived, as McLuhan said we would, at an important juncture in our intellectual and cultural history, a moment of transition between two very different modes of thinking. What we’re trading away in return for the riches of the Net—and only a curmudgeon would refuse to see the riches—is what Karp calls “our old linear thought process.” Calm, focused, undistracted, the linear mind is being pushed aside by a new kind of mind that wants and needs to take in and dole out information in short, disjointed, often overlapping bursts—the faster, the better. John Battelle, a onetime magazine editor and journalism professor who now runs an online advertising syndicate, has described the intellectual frisson he experiences when skittering across Web pages: “When I am performing bricolage in real time over the course of hours, I am ‘feeling’ my brain light up, I [am] ‘feeling’ like I’m getting smarter.”11 Most of us have experienced similar sensations while online. The feelings are intoxicating—so much so that they can distract us from the Net’s deeper cognitive consequences.
For the last five centuries, ever since Gutenberg’s printing press made book reading a popular pursuit, the linear, literary mind has been at the center of art, science, and society. As supple as it is subtle, it’s been the imaginative mind of the Renaissance, the rational mind of the Enlightenment, the inventive mind of the Industrial Revolution, even the subversive mind of Modernism. It may soon be yesterday’s mind.
THE HAL 9000 computer was born, or “made operational,” as HAL himself humbly put it, on January 12, 1992, in a mythical computer plant in Urbana, Illinois. I was born almost exactly thirty-three years earlier, in January of 1959, in another midwestern city, Cincinnati, Ohio. My life, like the lives of most Baby Boomers and Generation Xers, has unfolded like a two-act play. It opened with Analogue Youth and then, after a quick but thorough shuffling of the props, it entered Digital Adulthood.
When I summon up images from my early years, they seem at once comforting and alien, like stills from a G-rated David Lynch film. There’s the bulky mustard-yellow telephone affixed to the wall of our kitchen, with its rotary dial and long, coiled cord. There’s my dad fiddling with the rabbit ears on top of the TV, vainly trying to get rid of the snow obscuring the Reds game. There’s the rolled-up, dew-dampened morning newspaper lying in our gravel driveway. There’s the hi-fi console in the living room, a few record jackets and dust sleeves (some from my older siblings’ Beatles albums) scattered on the carpet around it. And downstairs, in the musty basement family room, there are the books on the bookshelves—lots of books—with their many-colored spines, each bearing a title and the name of a writer.
In 1977, the year Star Wars came out and the Apple Computer company was incorporated, I headed to New Hampshire to attend Dartmouth College. I didn’t know it when I applied, but Dartmouth had long been a leader in academic computing, playing a pivotal role in making the power of data-processing machines easily available to students and teachers. The college’s president, John Kemeny, was a respected computer scientist who in 1972 had written an influential book called Man and the Computer. He had also, a decade before that, been one of the inventors of BASIC, the first programming language to use common words and everyday syntax. Near the center of the school’s grounds, just behind the neo-Georgian Baker Library with its soaring bell tower, squatted the single-story Kiewit Computation Center, a drab, vaguely futuristic concrete building that housed the school’s pair of General Electric GE-635 mainframe computers. The mainframes ran the groundbreaking Dartmouth Time-Sharing System, an early type of network that allowed dozens of people to use the computers simultaneously. Time-sharing was the first manifestation of what we today call personal computing. It made possible, as Kemeny wrote in his book, “a true symbiotic relationship between man and computer.”12
I was an English major and went to great lengths to avoid math and science classes, but Kiewit occupied a strategic location on campus, midway between my dorm and Fraternity Row, and on weekend evenings I’d often spend an hour or two at a terminal in the public teletype room while waiting for the keg parties to get rolling. Usually, I’d fritter away the time playing one of the goofily primitive multiplayer games that the undergraduate programmers—“sysprogs,” they called themselves—had hacked together. But I did manage to teach myself how to use the system’s cumbersome word-processing program and even learned a few BASIC commands.
That was just a digital dalliance. For every hour I passed in Kiewit, I must have spent two dozen next door in Baker. I crammed for exams in the library’s cavernous reading room, looked up facts in the weighty volumes on the reference shelves, and worked part-time checking books in and out at the circulation desk. Most of my library time, though, went to wandering the long, narrow corridors of the stacks. Despite being surrounded by tens of thousands of books, I don’t remember feeling the anxiety that’s symptomatic of what we today call “information overload.” There was something calming in the reticence of all those books, their willingness to wait years, decades even, for the right reader to come along and pull them from their appointed slots. Take your time, the books whispered to me in their dusty voices. We’re not going anywhere.
It was in 1986, five years after I left Dartmouth, that computers entered my life in earnest. To my wife’s dismay, I spent nearly our entire savings, some $2,000, on one of Apple’s earliest Macintoshes—a Mac Plus decked out with a single megabyte of RAM, a 20-megabyte hard drive, and a tiny black-and-white screen. I still recall the excitement I felt as I unpacked the little beige machine. I set it on my desk, plugged in the keyboard and mouse, and flipped the power switch. It lit up, sounded a welcoming chime, and smiled at me as it went through the mysterious routines that brought it to life. I was smitten.
The Plus did double duty as both a home and a business computer. Every day, I lugged it into the offices of the management consulting firm where I worked as an editor. I used Microsoft Word to revise proposals, reports, and presentations, and sometimes I’d launch Excel to key in revisions to a consultant’s spreadsheet. Every evening, I carted it back home, where I used it to keep track of the family finances, write letters, play games (still goofy, but less primitive), and—most diverting of all—cobble together simple databases using the ingenious HyperCard application that back then came with every Mac. Created by Bill Atkinson, one of Apple’s most inventive programmers, HyperCard incorporated a hypertext system that anticipated the look and feel of the World Wide Web. Where on the Web you click links on pages, on HyperCard you clicked buttons on cards—but the idea, and its seductiveness, was the same.
The computer, I began to sense, was more than just a simple tool that did what you told it to do. It was a machine that, in subtle but unmistakable ways, exerted an influence over you. The more I used it, the more it altered the way I worked. At first I had found it impossible to edit anything on-screen. I’d print out a document, mark it up with a pencil, and type the revisions back into the digital version. Then I’d print it out again and take another pass with the pencil. Sometimes I’d go through the cycle a dozen times a day. But at some point—and abruptly—my editing routine changed. I found I could no longer write or revise anything on paper. I felt lost without the Delete key, the scrollbar, the cut and paste functions, the Undo command. I had to do all my editing on-screen. In using the word processor, I had become something of a word processor myself.
Bigger changes came after I bought a modem, sometime around 1990. Up to then, the Plus had been a self-contained machine, its functions limited to whatever software I installed on its hard drive. When hooked up to other computers through the modem, it took on a new identity and a new role. It was no longer just a high-tech Swiss Army knife. It was a communications medium, a device for finding, organizing, and sharing information. I tried all the online services—CompuServe, Prodigy, even Apple’s short-lived eWorld—but the one I stuck with was America Online. My original AOL subscription limited me to five hours online a week, and I would painstakingly parcel out the precious minutes to exchange e-mails with a small group of friends who also had AOL accounts, to follow the conversations on a few bulletin boards, and to read articles reprinted from newspapers and magazines. I actually grew fond of the sound of my modem connecting through the phone lines to the AOL servers. Listening to the bleeps and clangs was like overhearing a friendly argument between a couple of robots.
By the mid-nineties, I had become trapped, not unhappily, in the “upgrade cycle.” I retired the aging Plus in 1994, replacing it with a Macintosh Performa 550 with a color screen, a CD-ROM drive, a 500-megabyte hard drive, and what seemed at the time a miraculously fast 33-megahertz processor. The new computer required updated versions of most of the programs I used, and it let me run all sorts of new applications with the latest multimedia features. By the time I had installed all the new software, my hard drive was full. I had to go out and buy an external drive as a supplement. I added a Zip drive too—and then a CD burner. Within a couple of years, I’d bought another new desktop, with a much larger monitor and a much faster chip, as well as a portable model that I could use while traveling. My employer had, in the meantime, banished Macs in favor of Windows PCs, so I was using two different systems, one at work and one at home.
It was around this same time that I started hearing talk of something called the Internet, a mysterious “network of networks” that promised, according to people in the know, to “change everything.” A 1994 article in Wired declared my beloved AOL “suddenly obsolete.” A new invention, the “graphical browser,” promised a far more exciting digital experience: “By following the links—click, and the linked document appears—you can travel through the online world along paths of whim and intuition.”13 I was intrigued, and then I was hooked. By the end of 1995 I had installed the new Netscape browser on my work computer and was using it to explore the seemingly infinite pages of the World Wide Web. Soon I had an ISP account at home as well—and a much faster modem to go with it. I canceled my AOL service.
You know the rest of the story because it’s probably your story too. Ever-faster chips. Ever-quicker modems. DVDs and DVD burners. Gigabyte-sized hard drives. Yahoo and Amazon and eBay. MP3s. Streaming video. Broadband. Napster and Google. BlackBerrys and iPods. Wi-Fi networks. YouTube and Wikipedia. Blogging and microblogging. Smartphones, thumb drives, netbooks. Who could resist? Certainly not I.
When the Web went 2.0 around 2005, I went 2.0 with it. I became a social networker and a content generator. I registered a domain, roughtype.com, and launched a blog. It was exhilarating, at least for the first couple of years. I had been working as a freelance writer since the start of the decade, writing mainly about technology, and I knew that publishing an article or a book was a slow, involved, and often frustrating business. You slaved over a manuscript, sent it off to a publisher, and, assuming it wasn’t sent back with a rejection slip, went through rounds of editing, fact checking, and proofreading. The finished product wouldn’t appear until weeks or months later. If it was a book, you might have to wait more than a year to see it in print. Blogging junked the traditional publishing apparatus. You’d type something up, code a few links, hit the Publish button, and your work would be out there, immediately, for all the world to see. You’d also get something you rarely got with more formal writing: direct responses from readers, in the form of comments or, if the readers had their own blogs, links. It felt new and liberating.
Reading online felt new and liberating too. Hyperlinks and search engines delivered an endless supply of words to my screen, alongside pictures, sounds, and videos. As publishers tore down their paywalls, the flood of free content turned into a tidal wave. Headlines streamed around the clock through my Yahoo home page and my RSS feed reader. One click on a link led to a dozen or a hundred more. New e-mails popped into my in-box every minute or two. I registered for accounts with MySpace and Facebook, Digg and Twitter. I started letting my newspaper and magazine subscriptions lapse. Who needed them? By the time the print editions arrived, dew-dampened or otherwise, I felt like I’d already seen all the stories.
Sometime in 2007, a serpent of doubt slithered into my info-paradise. I began to notice that the Net was exerting a much stronger and broader influence over me than my old stand-alone PC ever had. It wasn’t just that I was spending so much time staring into a computer screen. It wasn’t just that so many of my habits and routines were changing as I became more accustomed to and dependent on the sites and services of the Net. The very way my brain worked seemed to be changing. It was then that I began worrying about my inability to pay attention to one thing for more than a couple of minutes. At first I’d figured that the problem was a symptom of middle-age mind rot. But my brain, I realized, wasn’t just drifting. It was hungry. It was demanding to be fed the way the Net fed it—and the more it was fed, the hungrier it became. Even when I was away from my computer, I yearned to check e-mail, click links, do some Googling. I wanted to be connected. Just as Microsoft Word had turned me into a flesh-and-blood word processor, the Internet, I sensed, was turning me into something like a high-speed data-processing machine, a human HAL.
I missed my old brain.