'A work of sheer brilliance, beauty and bravery' Andrew Sean Greer, author of Less

'Masterly... Her essays have a clarity and prescience that imply a sort of distant, retrospective view, like postcards sent from the near future' New York Times

We stare at our phones. We keep multiple tabs open. Our chats and conversations are full of the phrase "Did you see?" The feeling that we're living in the worst of times seems to be intensifying, alongside a desire to know precisely how bad things have gotten. Poet and essayist Elisa Gabbert's The Unreality of Memory consists of a series of lyrical and deeply researched meditations on what our culture of catastrophe has done to public discourse and our own inner lives. In these tender and prophetic essays, she focuses on our daily preoccupation and favorite pastime: desperate distraction from disaster by way of a desperate obsession with the disastrous. Moving from public trauma to personal tragedy, from the Titanic and Chernobyl to illness and loss, The Unreality of Memory alternately rips away the facade of our fascination with destruction and gently identifies itself with the age of rubbernecking. A balm, not a burr, Gabbert's essays are a hauntingly perceptive analysis of the anxiety intrinsic in our new, digital ways of being, and also a means of reconciling ourselves to this new world.

'One of those joyful books that send you to your notebook every page or so, desperate not to lose either the thought the author has deftly placed in your mind or the title of a work she has now compelled you to read.' Paris Review
Page count: 310
Publication year: 2020
ALSO BY ELISA GABBERT
ESSAYS
The Word Pretty
POETRY
L’Heure Bleue, or The Judy Poems
The Self Unstable
The French Exit
First published in paperback in the United States of America in 2020 by FSG Originals, Farrar, Straus and Giroux, 120 Broadway, New York 10271.
First published in hardback in Great Britain in 2020 by Atlantic
Books, an imprint of Atlantic Books Ltd.
Copyright © Elisa Gabbert, 2020
The moral right of Elisa Gabbert to be identified as the author of this work has been asserted by her in accordance with the Copyright, Designs and Patents Act of 1988.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of both the copyright owner and the above publisher of this book.
The picture acknowledgements on p.259 constitute an extension of this copyright page.
Every effort has been made to trace or contact all copyright holders. The publishers will be pleased to make good any omissions or rectify any mistakes brought to their attention at the earliest opportunity.
1 2 3 4 5 6 7 8 9
A CIP catalogue record for this book is available from the British Library.
Hardback ISBN: 978 1 83895 062 0
E-book ISBN: 978 1 83895 063 7
Printed in Great Britain
Atlantic Books
An Imprint of Atlantic Books Ltd
Ormond House
26–27 Boswell Street
London
WC1N 3JZ
www.atlantic-books.co.uk
For J
Let the disaster speak in you, even if it be by your forgetfulness or silence.
—MAURICE BLANCHOT, The Writing of the Disaster
With the inflation of apocalyptic rhetoric has come the increasing unreality of the apocalypse. A permanent modern scenario: apocalypse looms . . . and it doesn’t occur. And it still looms.
—SUSAN SONTAG, AIDS and Its Metaphors
PART ONE
MAGNIFICENT DESOLATION
DOOMSDAY PATTERN
THREATS
BIG AND SLOW
THE GREAT MORTALITY
PART TWO
THE LITTLE ROOM (OR, THE UNREALITY OF MEMORY)
VANITY PROJECT
WITCHES AND WHIPLASH
SLEEP NO MORE
PART THREE
TRUE CRIME
I’M SO TIRED
IN OUR MIDST
EPILOGUE: THE UNREALITY OF TIME
SELECTED BIBLIOGRAPHY
ACKNOWLEDGMENTS
A couple of years ago, distracting myself at work, I saw a link on Twitter to a YouTube video that caught my attention. It was a computer-animated re-creation of the sinking of the Titanic in real time, all two hours and forty minutes of it. I did not watch the whole video, but I skipped around and watched parts, interested especially in the few interior views, where you can watch the water level slowly rising at an angle—since the ship pitched forward as it sank—in the white-painted hallways of the lower decks, and later, in the ballroom and grand staircase, as wicker chairs bob around.
The strangest thing about the video is that it includes no people—no cartoon passengers. There is no violin music, no voice-over. The ship is lit up, glowing yellow in the night, but the only sound, apart from a few emergency flares and engine explosions, is of water sloshing into and against the ship. The overall impression is of near silence. It’s almost soothing.
This is true until the last few minutes of the video, when the half-submerged ship begins to groan and finally cracks in half. Only then, as the lights go out and the steam funnels collapse, do you hear the sound of people screaming, which continues for another thirty seconds after the ship has disappeared. A caption on the screen reads: “2:20—Titanic is gone. Rescue does not arrive for another hour and forty minutes.” A few lifeboats (empty) are seen floating on the calm black ocean, under a starry sky. Then, another caption: “2:21—Titanic is heard beneath the surface breaking apart and imploding as it falls to the sea floor.” The video ends on this disturbing note, with no framing narrative to lend a pseudo-happy ending.
At once, I was obsessed with the story of the Titanic. I rewatched the James Cameron movie (which I first saw in high school—still ridiculous, still gripping); I read a Beryl Bainbridge novel (Every Man for Himself) based on the night of the sinking, which felt like a novelization of the Cameron movie, though the book predates it, just; I read thousands of words on Wikipedia and what you might call fan sites, if you can be a fan of a disaster—lists of “facts” and conspiracy theories. I watched a documentary (Titanic’s Final Mystery) about a weird new theory of the root cause of the disaster: One scientist thinks that a sudden and extreme drop in temperature caused a mirage on the horizon that obscured the iceberg from the men in the lookout until they were nearly upon it. The same illusion could explain why a nearby ship, the SS Californian, did not see that the Titanic was clearly in distress. It is, of course, just a theory.
The Hollywood version of the narrative, which puts the blame on hubris, has a lot of pull—the Titanic sank because they dared to call it unsinkable. It’s the Icarus interpretation: Blinded by a foolhardy overconfidence, we flew too close to the sun, melting our wings, and so on. It’s the easiest explanation, appealing in its simplicity, its mythic aura.
When I ran out of freely available Titanic material, I moved on to other disasters. I had an overwhelming desire for disaster stories, of a particular flavor: I wanted stories about great technological feats meeting their untimely doom. I felt addicted to disbelief—to the catharsis of reality denying my expectations, or verifying my worst fears, in spectacular fashion. The obvious next stop was 9/11.
So far, 9/11 is the singular disaster of my lifetime. People who were in New York City at the time always comment on how “beautiful” and “perfect” that September morning was, with “infinite visibility”—pilots call those conditions “severe clear.” As I recall, it was a bright blue day in Houston too. I was driving from my apartment to the Rice University campus a couple of miles away when I heard radio reports of a plane hitting one of the Twin Towers. I continued driving to school, parked my car in the stadium lot, and went into the student center, where a few people were watching the news on TV with that air of disbelief that can appear almost casual.
The live footage of a massive steel skyscraper with smoke pluming from a hole in its side was shocking, but I felt it dully—shock in the form of incomprehension, maybe denial. I don’t remember truly feeling horror—that is, understanding—until people began to jump from the buildings. They were specks against the scale of the towers, filmed from a distance, but you knew what they were. They became known as the “jumpers”: people trapped in the upper floors of the buildings, above the planes’ impact and unable to get out, who were driven to such desperation from the extreme heat and lack of oxygen that they broke the thick windows with office furniture and jumped to the pavement a hundred stories below. Leslie E. Robertson, the lead structural engineer of the towers, later wrote that “the temperatures above the impact zones must have been unimaginable.” The people nearby, and still in the buildings, could hear the bodies landing.
An Associated Press photo dubbed “The Falling Man” captures one of these jumpers: a man “falling,” as if at ease, upside down and in parallel with the vertical grid of the tower. (It’s a trick of photography; other photos in the series show him tumbling haphazardly, out of control.) The photo was widely publicized at first but then met with vehement critique. Some people found this particular image too much to take, an insult to their senses. And though the jumps were witnessed by many, the New York City medical examiner’s office classifies all deaths from the 9/11 attacks as homicides. Of course, the deaths were forced, forced by suffering—but they were also voluntary. It seems akin to prisoners held in solitary confinement (or otherwise tortured) killing themselves—murder by suicide.
When I think of the jumpers, I think of two things. I think of images of women covering their mouths—a pure expression of horror. They were caught on film, watching the towers from the streets of Manhattan. I do this sometimes—hand up, mouth open—when I see or read something horrible, even when alone. What is it for? I think, too, of the documentary about Philippe Petit, who tightrope-walked between the tops of the towers in 1974. At the time, they were the second-tallest buildings in the world, having just been surpassed by the Sears Tower in Chicago. It was an exceptionally windy day (it’s always windy at 1,300 feet) and when a policeman threatened him from the roof of one building, Petit danced and pranced along the rope, to taunt him. This feels like one of the craziest things a man has ever done. For the jumpers, death was not a risk but a certainty; they jumped without thinking. It’s more horrible to contemplate than many of the other deaths because we know the jumpers were tortured. Death is more fathomable than torture.
A Discovery Channel documentary that I found on YouTube called Inside the Twin Towers provides a minute-by-minute account of the events on September 11, a mix of reenactments and interviews with survivors. One man who managed to escape from the North Tower—he was four floors below the impact—recounts a moment when he opened a door and saw “the deepest, the richest black” he had ever seen. He called into it. Instead of continuing down the hall to see if anyone was there, he retreated back to his office in fear. He says in the film, “If I had gone down the hallway and died, it would have been better than living with this knowledge of ‘Hey, you know what, when it came right down to it, I was a coward.’ And it was actually our two coworkers down that hallway, on the other side, that ended up dying on that day. And I often think now, Perhaps I should have continued down that hallway.”
This is a classic case of survivor’s guilt, sometimes known as concentration-camp syndrome: the sense that your survival is a moral error. Theodor Adorno, in an amendment to his somewhat misunderstood line about poetry after Auschwitz, wrote:
Perennial suffering has as much right to expression as a tortured man has to scream; hence it may have been wrong to say that after Auschwitz you could no longer write poems. But it is not wrong to raise the less cultural question whether after Auschwitz you can go on living—especially whether one who escaped by accident, one who by rights should have been killed, may go on living. His mere survival calls for the coldness, the basic principle of bourgeois subjectivity, without which there could have been no Auschwitz; this is the drastic guilt of him who was spared. By way of atonement he will be plagued by dreams such as that he is no longer living at all.
This syndrome, along with post-traumatic stress disorder, goes some way toward explaining why so many Holocaust survivors have committed suicide.
_______
There is survivor’s guilt, but there is also survivor’s elation, survivor’s thrill—a thrill felt only by those a little farther from disaster. The September 24, 2001, issue of The New Yorker included a symposium of responses to the attacks. A few were able to acknowledge the element of thrill in observation. Jonathan Franzen wrote:
Unless you were a very good person indeed, you were probably, like me, experiencing the collision of several incompatible worlds inside your head. Besides the horror and sadness of what you were watching, you might also have felt a childish disappointment over the disruption of your day, or a selfish worry about the impact on your finances, or admiration for an attack so brilliantly conceived and so flawlessly executed, or, worst of all, an awed appreciation of the visual spectacle it produced.
I find Franzen’s moral hierarchy here questionable, that “worst of all” most puzzling. Because to me, more than worry, or admiration (!), the most natural and undeniable of reactions would seem to be awe.
It’s the spectacle, I think, that makes a disaster a disaster. A disaster is not defined simply by damage or death count; deaths by smoking or car wrecks are not a disaster because they are meted out, predictable. A disaster must not only blindside us, but be witnessed, and re-witnessed, in public. The Challenger explosion killed only seven people, but like the Titanic, which killed more than 1,500, and like 9/11, which killed almost 3,000, the deaths were both highly publicized and completely unexpected. Disasters are news because they are news.
All three of these incidents forced people to watch huge man-made objects, monuments of engineering, fail catastrophically, being torn apart or exploding in the sky. These are events we rarely see except in movies. The destruction of the Challenger and the World Trade Center are now movies themselves, clips we can watch again and again. The ubiquity of cameras, which we now carry all the time in our pockets, makes disaster easier to witness and to reproduce; it may even create a kind of cultural demand for disasters. We also get to watch the reaction shots—both the special effects and the human drama.
Roger Angell’s version of survivor’s thrill in the same New Yorker issue is less chastising:
When the second tower came down, you cried out once again, seeing it on the tube at home, and hurried out onto the street to watch the writhing fresh cloud lift above the buildings to the south, down at the bottom of this amazing and untouchable city, but you were not surprised, even amid such shock, by what you found in yourself next and saw in the faces around you—a bump of excitement, a secret momentary glow. Something is happening and I’m still here.
Angell is saying this is not an aberration; it is the norm. It is one of the terrible parts of disaster, our complicity: the way we glamorize it and make it consumable; the way the news turns disasters into ready-made cinema; the way war movies, which mean to critique war, can really only glorify war.
We don’t talk about it now, but I always found the Twin Towers hideously ugly, in a way not explainable by their shape alone—they were long rectangular prisms, nothing more. Their basic boxiness was somehow an affront. I find the Empire State Building and the Chrysler Building beautiful. I find the Eiffel Tower beautiful. It must be their tapering sweep, the way they diminish as they ascend, their detail suggesting fragility. How could anyone ever have found the Twin Towers beautiful? They seemed designed only to represent sturdiness, like campus buildings in the brutalist tradition that were said to be riot-proof.
A friend, a New Yorker, disagrees. She tells me the buildings “did amazing things with the light.” Another, also from New York, says they were “sexy at night.” But all skyscrapers are sexy at night, from below if not from afar, by virtue of their sheer dizzying size, their sheer sheerness. They stand like massive shears, stabbed into the sky.
Despite their imposing, even ominous height, the towers fell in less than two hours; the Titanic took only a little longer to sink. But that happened gradually. When you watch a building collapse, it seems like it suddenly decides to collapse. It’s a building, and then it’s not a building, just a crumbling mass of debris. There is no transition between cohesion and debris. It is terrifying how quickly an ordered structure dissolves. Where does it all go? Buildings, like anything, are mostly empty space.
_______
In the vocabulary of disaster, the word “debris” is important—from the French debriser, to break down. A cherishable word, it sounds so light and delicate. But the World Trade Center produced nearly two million tons of it. The bits of paper falling around the city led some people to mistake the attack for a parade. In space flight, or even on high-speed jets, tiny bits of foreign object debris (FOD) can cause catastrophe. Space food is coated in gelatin to prevent crumbs, which in a weightless environment could work into vulnerable instruments or a pilot’s eye. Debris on the runway could get sucked into a jet engine and cause it to fail.
The Challenger explosion, like the sinking of the Titanic, is usually chalked up to hubris. But if hubris is overconfidence—“presumption toward the gods”—the explanation is unsatisfying. Engineers at NASA’s Marshall Space Flight Center knew that the O-ring seals, which helped contain hot gases in the rocket boosters, were poorly designed and could fail under certain conditions—conditions that were present on the morning of the launch, which was unusually cold. The O-rings were designated as “Criticality 1,” meaning their failure would have catastrophic results. But the engineers did not take action to ground all shuttle flights until the problem could be fixed. As the very first sentence in the official Report of the Presidential Commission on the Space Shuttle Challenger Accident puts it: “The Space Shuttle’s Solid Rocket Booster problem began with the faulty design of its joint and increased as both NASA and contractor management first failed to recognize it as a problem, then failed to fix it and finally treated it as an acceptable flight risk.” What shocks me most when I read about the space program is the magnitude of the risks. The Challenger exploding on live TV in front of 17 percent of Americans was unthinkable to most of those viewers, but not unthinkable to workers at NASA.
From what I understand, NASA has always embraced risk. In his memoir Spaceman, the astronaut Mike Massimino, who flew on two missions to service and repair the Hubble telescope, recounts the atmosphere at NASA after the space shuttle Columbia broke up on reentry in 2003:
When I walked in I saw Kevin Kregel in the hallway. He was standing there shaking his head. He looked up and saw me. “You know,” he said, “we’re all just playing Russian roulette, and you have to be grateful you weren’t the one who got the bullet.” I immediately thought about the two Columbia missions getting switched in the flight order, how it could have been us coming home that day. He was right. There was this tremendous grief and sadness, this devastated look on the faces of everyone who walked in. We’d lost seven members of our family. But underneath that sadness was a definite, and uncomfortable, sense of relief. That sounds perverse to say, but for some of us it’s the way it was. Space travel is dangerous. People die. It had been seventeen years since Challenger. We lost Apollo 1 on the launch pad nineteen years before that. It was time for something to happen and, like Kevin said, you were grateful that your number hadn’t come up.
The culture of risk at NASA is so great that in place of survivor’s guilt there is only survivor’s relief. But knowing the risks and doing it anyway must require some level of cognitive dissonance. This is apparent when Massimino writes that “like most accidents, Columbia was 100 percent preventable.” This is hindsight bias; only past disasters look 100 percent preventable. The Columbia shuttle broke apart due to damage inflicted on the wing when a large chunk of foam insulation flew into it during launch. This was observed on film, and the ground crew questioned whether it might have caused any damage. However, insulation regularly broke apart during launches and had never caused significant damage before. Further, NASA determined that even if the spacecraft had been damaged, which it had no way of verifying, there was nothing that the flight crew could do about it, so NASA officials didn’t even inform them of the possibility of the problem.
When Columbia came apart during reentry, disintegrating and raining down parts like a meteor shower over Texas and Louisiana, an investigation was launched. At first, no one believed that the foam could have done enough damage to cause the accident. It was “lighter than air.” Massimino writes, “We looked at the shuttle hitting these bits of foam like an eighteen-wheeler hitting a Styrofoam cooler on the highway.” Not until they actually reenacted the event by firing a chunk of foam at five hundred miles per hour toward a salvaged wing and saw the results did they accept it as the cause of the disaster. Anything going that fast has tremendous force. This was not like the failure of the O-ring; the risks of the insulation were not understood. Or, more properly, they were simply not seen—it’s basic, though unintuitive, physics. The same type of accident is 100 percent preventable now only because the disaster happened, triggering a shuttle redesign. When redesigns cost billions of dollars, if it isn’t broke, they don’t and probably can’t fix it.
The concept of hubris lets us off too easy. It allows us to blame past versions of ourselves, past paradigms, for faulty thinking that we’ve since overcome. But these scientists we might scoff at now were incredibly smart and incredibly well prepared. The number of things that didn’t go wrong on all the space missions is astounding. It’s easy to blame people for not thinking of everything, but how could they think of everything? How can we?
Not knowing the unknowable isn’t hubris. There is danger in thinking, “We were dumb then, but we’re smart now.” We were smart then, and we are dumb now—both are true. We do learn from the past, but we can’t learn from disasters we can’t even conceive of. While disasters widen our sense of the scope of the possible, there are limits. We can’t imagine all possible futures. Yet we call this hubris. Perhaps it’s comforting to believe that disasters are the result of some fixable “fatal flaw,” and not an inevitable part of the unfolding of history.
To say there are limits to technological progress—we can’t prepare ourselves completely for the unforeseen—is not to say that progress is impossible, but that progress is tightly coupled with disaster. As the French cultural theorist Paul Virilio famously said, “The invention of the ship was also the invention of the shipwreck.” Not until we experience new forms of disaster can we understand what it is we need to prevent. Overreliance on the explanatory power of hubris is itself a form of hubris, a meta-hubris. And without hubris pushing us, however blinkered, forward, would there be any progress at all? Don’t we need hubris to enable and justify advances in technology? NASA seems to take hubris in stride; they see occasional disaster as the fair cost of spaceflight.
In his “Letter from Birmingham Jail,” Martin Luther King, Jr., warned of “the strangely irrational notion that there is something in the very flow of time that will inevitably cure all ills.” You could say the same of technological progress; it is tempting to believe that progress occurs on a linear curve, that eventually all problems will be solved, and all accidents will be completely preventable. But there’s no reason to assume that the curve of progress is linear, that the climb is ever increasing.
I want to come back to the Titanic, and some common misconceptions. One is that there were not enough lifeboats on board for frivolous reasons—because proprietors felt they would look unattractive on deck, or because they were regarded as mere symbols, serving only to comfort nervous passengers on a ship designers believed was literally unsinkable. This isn’t the case. Rather, the thinking at the time was that the safest method of rescue, in the event of an emergency, was to ferry passengers back and forth between the sinking ship and a rescue ship. Because the Titanic would sink slowly, if at all, people would actually be safer on the ship, for some time, than in a lifeboat. Therefore, the lifeboats didn’t need to accommodate the entire capacity of the ship in one go.
So why did the Titanic sink so fast? The surprising truth is that if the ship had hit the iceberg head-on, instead of narrowly missing it with the bow and then scraping along its side, it would not have sunk. The ship was capable of sustaining major damage from an impact like an iceberg—it could have stayed afloat if four of its sixteen watertight compartments were flooded. But the iceberg tore into the ship in such a way that five compartments were damaged. This event was not, realistically, foreseeable; no iceberg in history had done that kind of damage to a ship, and none has done that kind of damage since. It was, in essence, a freak accident.
There are echoes of this in the World Trade Center’s collapse. It’s well-known that the buildings were designed to survive the impact of an airplane. However, the engineers were envisioning emergencies like a small, slow-flying plane hitting one of the towers by accident—in fact, a bomber flying in near-zero visibility had hit the Empire State Building in 1945—not a modern jet being flown purposely into a tower at top speed. Still, there was a false sense of security. After the first impact, the PA system in the building told people to remain at their desks when of course they should have been evacuating. Some building staff also told workers it would be safer to stay where they were.
Is this hubris, or something else? Disasters always feel like a thing of the past. We want to believe that better technology, better engineering will save us. That the more information we have, the safer we can make our technology. But we can never have all the information. In creating new technology to address known problems, we unavoidably create new problems, new unknowns. Progress changes the parameters of possibility. This is something we strive for—to innovate past the event horizon of what we can imagine. And with so much that is inaccessible, opaque, and in flux, we can’t even hold on to what we already know.
As they stepped out of the lunar module and began their moon walk, Neil Armstrong said to Buzz Aldrin, “Isn’t that something! Magnificent sight out there.” Aldrin’s cryptic, poetic response was “Magnificent desolation.” I think of this quote when I see footage of disasters. Especially after years of buffer, years of familiarity, have lessened the sting, it’s easy to see these events as, in their way, magnificent. Magnificent creations beget magnificent failures. It is awesome that we built them; it was awesome when they fell. Horror and awe are not incompatible; they are intertwined.
Is it perversity or courage that allows some people to admit to survivor’s thrill? On the afternoon of September 11, I remember meeting my then boyfriend on campus for lunch. He was a contrarian type, but his reaction still disturbed me—he was visibly giddy, buzzed by the news. It’s not that I don’t believe other people were excited, but no one else had revealed it. In 2005, before the levees broke in New Orleans, a friend of mine asked if I wasn’t just a little bit disappointed that Hurricane Katrina hadn’t turned out as bad as predicted. Just hours later, she regretted saying it.
Often, when something bad happens, I have a strange instinctual desire for things to get even worse—I think of a terrible outcome and then wish for it. I recognize the pattern, but I don’t understand it. It’s as though my mind is running simulations and can’t help but prefer the most dramatic option—as though, in that eventuality, I could enjoy it from the outside. Of course, my rational mind knows better; it knows I don’t want what I want. Still, I fear this part of me, the small but undeniable pull of disaster. It’s something we all must have inside us. Who can say it doesn’t have influence? This secret wish for the blowout ending?
2016
On May 31, 1945, U.S. Secretary of War Henry Stimson called a meeting of experts to advise President Harry Truman on the atomic bomb: Should we use it or not? J. Robert Oppenheimer, the scientist heading the Manhattan Project, was asked to explain the difference between the new bombs and the firebombs already in use. That spring, General Curtis LeMay had been firebombing Japan with napalm, a highly flammable and “sticky” mixture of gasoline and gelling agents. Almost a million people in sixty cities were “scorched, boiled, and baked to death,” in LeMay’s own words, in these napalm raids. It must have been hard to believe that the A-bomb could be dramatically more deadly—so what would it accomplish?
Oppenheimer’s response was that anything living within two-thirds of a mile of the atomic bomb’s blast site would be irradiated, and further, the appearance of the explosion would have its own impact. The meeting notes read: “It was pointed out that one atomic bomb on an arsenal would not be much different from the effect caused by any Air Corps strike of present dimensions. However, Dr. Oppenheimer stated that the visual effect of an atomic bombing would be tremendous.”
At the time, this was purely theoretical. But six weeks later, Oppenheimer was present for the Trinity test, the first detonation of a nuclear weapon, in the desert of New Mexico. On that early morning of July 16, 1945, after an incredibly bright explosion (witnesses without eye protection were temporarily blinded), the light turned white, then red, then purple. This “purple luminescence,” the effect of ionized atmosphere, smelled like a waterfall. The physicist Robert Serber said that “the grandeur and magnitude of the phenomenon were completely breathtaking.”
The people who worked on the bomb understood that some of its power was symbolic—that the difference between nuclear warfare and previous classes of weaponry was partly aesthetic. Stimson even worried that the power of the symbol might be lost if the bomb were dropped on an already devastated country. He wrote in his diary, “I was a little fearful that before we could get ready the Air Force might have Japan so thoroughly bombed out that the new weapon would not have a fair background to show its strength.” But Oppenheimer was right about the tremendous effect. The bombs the United States dropped on Hiroshima and Nagasaki felt qualitatively different, even if, in the end, the death toll didn’t match that of the firebombs. As Laurens van der Post, then a prisoner of war in Japan, said, there was “something supernatural” about the atomic blasts.
I’ve often heard that the residents of Hiroshima were warned about the bomb—that the military dropped leaflets on the city instructing them to evacuate. This is something of a myth. The warnings were vague and not specific to any particular city; LeMay had been dropping leaflets with lists of possible bomb targets for weeks. Although the people of Hiroshima were preparing for attack, they had expected more firebombing and were clearing out fire lanes. They heard air-raid sirens on the morning of August 6, but they heard those every morning. They were not prepared for an entirely new kind of weapon, and the new kind of terror it would bring. As M. Susan Lindee puts it in Suffering Made Real: American Science and the Survivors at Hiroshima, “They had been eating an orange, working in a garden, or reading a book. Minutes later they wandered, without feeling, past corpses, neighbors trapped in burning mounds of rubble, or children without skin.”
The Japanese word for the survivors of the bombings at Hiroshima and Nagasaki is hibakusha. This is not the word for “survivor.” It is usually translated as “bomb-affected people” or “explosion-affected persons”—a euphemism, almost politically correct. The Japanese avoid the more direct term seizonsha (“survivors”) because, as John Hersey writes in Hiroshima, “in its focus on being alive it might suggest some slight to the sacred dead.”
This sounds well intentioned, but for all its sensitivity toward the departed, the term in practice placed a stigma on the living, who were feared and considered unclean. The Wikipedia page for hibakusha shows a woman with black cross-hatchings on her back and arms—the pattern of the kimono she was wearing burned into her skin. The hibakusha were not inclined to identify themselves as such because it made them less employable and marriageable. There was little financial incentive either, since the Japanese government didn’t offer the victims health care or other compensation until 1957.
I read Hiroshima in junior high, and the detail I always remembered most clearly from Hersey’s account of the hibakusha was that their eyeballs melted. Those words, that image. I have remembered and re-remembered it so many times—their eyeballs melted—that I started to think it was a false memory, an invention of my imagination. It seems possible only as a metaphor, but it isn’t. On page 51:
On his way back with the water, he got lost on a detour around a fallen tree, and as he looked for his way through the woods, he heard a voice ask from the underbrush, “Have you anything to drink?” He saw a uniform. Thinking there was just one soldier, he approached with the water. When he had penetrated the bushes, he saw there were about twenty men, and they were all in exactly the same nightmarish state: their faces were wholly burned, their eyesockets were hollow, the fluid from their melted eyes had run down their cheeks. (They must have had their faces upturned when the bomb went off; perhaps they were anti-aircraft personnel.)
This passage informed my entire conception of war. For decades, I have found it difficult to accept that the bombs were necessary. The logical argument has trouble competing with the emotional impact of that etched-in detail.
Now, in its one-sidedness, the little yellow paperback with a red sun on the cover has the whiff of propaganda—but propaganda about what? Is it against nukes or war in general? Was the war necessary? Chillingly, I’ve had the same feeling, that I’m looking at propaganda, in Holocaust museums. How are we to compare these two horrors, if it’s even possible? Am I supposed to choose sides?
Reading about the Hiroshima and Nagasaki attacks, I see propaganda everywhere—Axis or Allies, pro- or anti-war. The persistent belief that the cities were warned—isn’t that American propaganda? A kind of victim-blaming, as in, they had their chance to escape? In the month before the attacks, Truman wrote in his diary (I’m almost touched that these men of war kept diaries):
Even if the Japs are savages, ruthless, merciless and fanatic, we as the leader of the world for the common welfare cannot drop this terrible bomb on the old Capitol or the new . . . The target will be a purely military one and we will issue a warning statement asking the Japs to surrender and save lives. I’m sure they will not do that, but we will have given them the chance.
This reads like rationalization, like self-propaganda: They deserve it, even if they don’t deserve it. We can’t do it, but we will. Later, after the bombing on August 6, Truman would say over the radio, “It is an awful responsibility that has come to us. Thank God it has come to us instead of our enemies, and we pray that He may guide us to use it in His ways and for His purposes.” When the journalist Wilfred Burchett visited Hiroshima in September 1945, he described the symptoms of acute radiation sickness (severe nausea, vomiting, and diarrhea; swollen, bleeding tissue; hair loss) and called it “atomic plague.” American scientists thought this was Japanese propaganda; they believed that if you were close enough to be irradiated, you’d be dead.
In 1980, The New York Review of Books published a letter to the editors and a response to that letter under the title “Was the Hiroshima Bomb Necessary?” In 1981, Paul Fussell wrote that it was “surely an unanswerable question.” This was in an essay first published in The New Republic
