Accelerate!

James Brooke-Smith

Description

The 1990s was the decade in which the Soviet Union collapsed and Francis Fukuyama declared the 'end of history'. Nelson Mandela was released from prison, Google was launched and scientists in Edinburgh cloned a sheep from a single cell. It was also a time in which the president of the United States discussed fellatio on network television and the world's most photographed woman died in a car crash in Paris. Radical pop band The KLF burned a million quid on a Scottish island, while the most-watched programme on TV was Baywatch. Anti-globalisation protestors in France attacked McDonald's restaurants and American survivalists stockpiled guns and tinned food in preparation for Y2K. For those who lived through it, the 1990s glow in the memory with a mixture of proximity and distance, familiarity and strangeness. It is the decade about which we know so much yet understand too little. Taking a kaleidoscopic view of the politics, social history, arts and popular culture of the era, James Brooke-Smith asks – what was the 1990s? A lost golden age of liberal optimism? A time of fin-de-siècle decadence? Or the seedbed for the discontents we face today?




First published 2022

The History Press

97 St George’s Place, Cheltenham, Gloucestershire, GL50 3QB

www.thehistorypress.co.uk

© James Brooke-Smith, 2022

The right of James Brooke-Smith to be identified as the Author of this work has been asserted in accordance with the Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the Publishers.

British Library Cataloguing in Publication Data.

A catalogue record for this book is available from the British Library.

ISBN 978 1 8039 9149 8

Typesetting and origination by The History Press

Printed and bound in Great Britain by TJ Books Limited, Padstow, Cornwall.

eBook converted by Geethik Technologies

Contents

Acknowledgements

1     Pre-Post-Everything

2     Surfing at the End of History

3     Desert Spectacular

4     Middle of the Road

5     Generation X, Y, Z

6     You Take Me Higher and Higher

7     Moore’s Law

8     Into the Void

9     Shock, Horror!

10   Lonely Planet

11   Crash

12   Prizes for All

13   Irrational Exuberance

14   False Apocalypse

15   The Falling Man

Notes

Select Bibliography

Select Filmography

List of Illustrations

Acknowledgements

Thank you to everyone who helped over the course of this book’s writing. Thanks to my research assistants, Jason Liboiron, Anthony Matarazzo, Kirsten Bussière, Alan Orr and Ryan Pepper. Thank you to Ryan, in particular, for help with last-minute fact-finding missions and copy-editing conundrums. A big thank you to Sally Holloway, my agent, and to Simon Wright and Mark Beynon, my editors at The History Press, for believing in the project and helping to whittle the manuscript into shape. But, most of all, a huge thank you to my family, Sara, Leo and Freya, for love and support and high jinks through the long days of lockdown. I love you.

1

Pre-Post-Everything

WRITING HISTORY by the decade is like trying to put a hairnet on an octopus. The more you try to squeeze the recalcitrant stuff of history into a neat ten-year slice, the more it oozes through the gaps. The forces that shape our lives run deeper and longer than any ten-year perspective can grasp. Dig deep enough into the causes of any historical event, and you find yourself slipping backwards through time in search of stable ground.

The decade is an arbitrary product of calendrical time, which stems from the accident of our having ten fingers and ten toes, the source of our decimal number system. Had our bodies evolved differently, bookshops might be stocked with volumes that divide the chaos of history into neat seven-, twelve- or fifteen-year chunks – all essentially arbitrary units.

Our penchant for histories of single decades is a relatively recent phenomenon, a product of the historical short-sightedness of the modern age and our desire to chop reality into bite-sized pieces. If you cast your mind back and try to remember the great decades of history, the further you go, the harder it becomes. The twentieth century is full of good, chunky decades. The ‘greed is good’ eighties, the anxious seventies, the swinging sixties, even the ‘low, dishonest’ thirties and the ‘roaring’ twenties. But once you get beyond, say, the 1890s – the era of the Oscar Wilde trial and the crumbling of the not-so-timeless Victorian verities – easily recognisable decades are hard to find. No one outside the academic conference circuit talks about the 1870s, let alone the 1470s.

And yet there’s no getting past the fact that we have ten fingers and ten toes. As a historical unit, the decade has a pleasing weight to it. It fits nicely in the palm of your hand. A single decade usually contains at least one or two changes of government, one or two transformative technological breakthroughs, some major new social trends, a few significant cultural movements and a handful of deaths of major historical figures. The decade may be an arbitrary construct, but it is, as the anthropologists say, good to think with.

This is particularly true of the 1990s, the subject of this book. The 1990s come ready packaged as a decade – not quite a ten-year slice, but near enough. The 1990s began in the autumn of 1989 with the fall of the Berlin Wall and the rapid, almost entirely bloodless collapse of the Soviet Union. And they ended on 11 September 2001 with the attacks on the World Trade Center in New York, a spectacular assault launched by a millenarian Islamic sect seeking to end the hegemony of America and the Christian West and replace it with a global caliphate. On the one hand, a tumbling wall and the end of the Cold War; on the other, collapsing towers and the beginning of the ‘global war on terror’.

My aim is to explore what happened in between these two epoch-making events. For many in the affluent West, the 1990s were a time of relative political stability and cultural optimism, perhaps the last such epoch in recent memory. This was the decade in which the American political scientist Francis Fukuyama declared that history had ended with the fall of communism and that liberal democracy would inevitably spread throughout the world. The election victories of Bill Clinton in the US and Tony Blair in the UK signalled the triumph of ‘third-way’ politics, which sought to replace the ideological divisions of the past with an inclusive vision of modernisation and progress. Political parties of both the left and right sought to occupy the political centre ground. This was a much-touted ‘post-ideological’ age, in which politics fused ever more intimately with the public relations and media industries. Class consciousness was passé; turnout at elections declined across much of the Western world.

And yet Harold Macmillan’s famous statement about the role of the unexpected in history still held true. In 1963, when the British prime minister was asked by a journalist what could possibly knock his reforming government’s plans off course, he replied, ‘Events, dear boy, events.’ Over the course of the 1990s, the events piled up as usual. Presidents Bush and Clinton deployed US forces overseas on no fewer than seven separate occasions, from the First Gulf War in 1991 via Somalia, Haiti, Bosnia, Iraq again in 1998, Sudan, Kosovo, and then the invasion of Afghanistan after 9/11. The 1990s saw the fall of dictators, from Pinochet in Chile to Suharto in Indonesia; the release of Nelson Mandela from prison and the end of apartheid in South Africa; the 1998 Good Friday agreement and end of the Troubles in Northern Ireland. Genocides in Rwanda and Kosovo signalled a grim return of the systematic mass slaughter that had characterised mid-twentieth-century European history. While Europe and North America enjoyed the dubious pleasures of consumerist anomie and postmodern weightlessness, history still raged elsewhere.

After an early period of recession, the 1990s witnessed one of the longest cycles of economic growth in modern times, roughly from 1992 until the collapse of the dot-com bubble in 2000 and the shock of the World Trade Center attacks in 2001. The long boom was fuelled by technological change and productivity increases, but in many countries, especially the US and the UK, the Anglo-Saxon centres of free-market economics, it was also driven by the deregulation of vast swathes of the economy, in particular banking, telecommunications and energy. In many ways, the 1990s were a slicker, more tasteful version of the 1980s, a period of full-throated free-market economics, but this time overseen by centrist politicians who wanted to channel some of the proceeds of growth into social programmes such as education and health.

The 1990s were also the great decade of globalisation, both as a geopolitical and as a cultural process. The European Union (EU) came into being in 1993, the North American Free Trade Agreement (NAFTA) in 1994 and the World Trade Organization (WTO) in 1995. The International Monetary Fund and the World Bank advised governments in the developing world on how to reform their economies along neoliberal lines, often threatening to punish those who did not by withholding credit. At the WTO meeting in Seattle and the IMF and G8 summits in Prague and Genoa, there were large demonstrations in opposition to the free-market model of globalisation. Loose coalitions of trade unionists, religious groups, anarchists and environmentalists protested the lack of democratic oversight, the erosion of workers’ rights and the heavy environmental cost of global capitalism. Nativist politicians in Europe and America started to make space in the public sphere for virulently anti-immigrant views, the likes of which had not been heard for decades. Islamic extremists used hyper-modern communication technologies to form decentralised networks and spread an anti-modern ideology of religious purity and violent jihad.

In the 1990s the future seemed to arrive on an almost daily basis with the rise of the internet, the spread of mobile phones, the first cloned mammal, Dolly the Sheep, and the mapping of the human genome, a pioneering feat of human ingenuity that was achieved ahead of schedule due to the exponential growth of microprocessor speeds. Pundits and theorists thrilled to the progressive possibilities of the new digital age: Manuel Castells analysed the emerging forms of the ‘network society’, Nicholas Negroponte extolled the human potentials of ‘being digital’, and cyber-theorists took up Donna Haraway’s techno-feminist vision of ‘A Cyborg Manifesto’. Before the Big Five tech corporations (Apple, Amazon, Google, Microsoft and Meta (Facebook)) monopolised the internet, cyberspace represented a new frontier of human possibility. Wired magazine, Mondo 2000 and online communities like The WELL (The Whole Earth ’Lectronic Link) espoused a libertarian hacker ethos that cast personal identity as a malleable construct and human societies as problems to be fixed with smart technology.

Many of the decade’s most exciting cultural forms emerged from the creative use of new digital technologies, from hip hop’s culture of sampling to the futuristic sounds of rave and techno music to Hollywood movies such as Jurassic Park, Toy Story and The Matrix. Videogames became both a mass-market industry, which rivalled movies and recorded music for market dominance, and, in some quarters at least, a critically recognised art form with its own canons of value and taste. This was also the era in which the underground went mainstream. Grunge, hip hop, Brit Pop, the indie cinema of David Lynch, Danny Boyle and Jane Campion, the edgy conceptualism of the Young British Artists, extreme sports: what in previous decades were the preserves of alternative scenes and hipster elites rushed into the overground cultural spaces of MTV, broadsheet newspapers, multiplex cinemas and publicly funded art galleries. On the one hand, this produced a sense of cultural insurgency as avant-garde forms seeped into public consciousness and disrupted the status quo; on the other, it spurred debates about ‘selling out’ and the dangers of co-option by the corporate media.

The question that faces the historian of the recent past is this: does a pattern emerge? Is there a figure in the carpet, a discernible order to this otherwise chaotic selection of events and trends? Or, to use a more historically apt metaphor, does a clear image emerge from the Magic Eye picture that is the 1990s? These optical illusions could be found everywhere from high street malls to popular kids’ magazines for a brief moment in the early 1990s. If you stared for long enough at the dense, staticky mess of dots and scribbles that comprised the image, a second, three-dimensional image would reveal itself. What looked like an error message from an industrial printer became a shimmering hologram of a spaceship or a fantasy landscape. The craze was short lived and yet the Magic Eye poster serves as a neat metaphor for the era, not only because it mixed a vaguely hippy-ish, Op Art visual style with the technophilia of the early internet age – a characteristically 1990s combination – but also because it was so difficult to see clearly. Often it took hours of cross-eyed staring before the holographic elves or sports car revealed themselves. And sometimes, nothing happened at all. Sometimes, in spite of the promise that a hidden vision lurked within, the surface remained stubbornly flat and meaningless.

At the beginning of the nineteenth century, Romantic historians sought to encapsulate the vast, churning forces of history in the form of single resonant images or heroic individuals. William Hazlitt wrote a series of biographical sketches of luminaries such as Jeremy Bentham and William Wilberforce, who he thought embodied the ‘spirit of the age’. Today, we are rightly sceptical of such ‘great man’ theories of history; we are more attuned to the impersonal forces that shape historical experience – the social structures and environmental conditions that affect our behaviour and configure our identities. Not even the greatest of statesmen, nor the most famous of historical personages, are present at all of the key moments of their era. We might call this the Zelig paradox, after the Woody Allen character who is miraculously present at all the major events of early twentieth-century history, but only at the cost of having no personality himself, of always blending in with the time and place in which he finds himself.

But distinct periods, even distinct decades, still have their own moods and atmospheres, their own local historical weather patterns. Whether or not they add up to a unitary ‘spirit of the age’, there are always shared points of reference, common experiences that anchor us in the stream of time. The fall of the Berlin Wall; the last days of British rule in Hong Kong; the end of apartheid in South Africa; the experience of logging onto the internet for the first time; the first mobile phone in your pocket; the thrill of being swept up in the insurgency of youth cultures like hip hop, grunge and rave – these are just some of the things that shaped what it was like to be alive in the 1990s.

The best way to capture this sense of history as both impersonal force and subjective experience is via culture. We can learn as much, if not more, about the 1990s from the history of the first-person shooter videogame or the beginnings of online pornography, as from yet another study of Bill Clinton’s presidency or the signing of the Maastricht Treaty. Or, better still, it is via the juxtaposition of these two kinds of things – world events and popular culture, CNN and MTV – that we can get a real feel for the age. Because this is how most of us live in the modern world, constantly jumping back and forth between news and entertainment, the serious and the silly, the high and the low.

If the 1990s can be defined at all, then they are surely the ‘pre-post-everything decade’, the cusp between the analogue twentieth century and the fully digital twenty-first. This was an era in which we still read physical newspapers, called our friends on a landline and travelled to bricks-and-mortar stores to buy the latest CDs, books and videotapes. But it was also a time in which there was a growing awareness of the vast new technological system that was powering into life, the radically transformative effects of which we could barely fathom at the time. This was the last era before all that was solid melted into the digital air. Before online trolls and flame wars. Before the great siloisation of politics and culture into algorithmic filter bubbles. Before file sharing and streaming services turned music, film and television into an endless torrent of cheap content. And before the fragmentation of mass media eroded the very idea of a shared popular culture with a productive tension between mainstream and avant-garde, centre and margins.

No doubt every modern era is fated to think it is undergoing a unique period of acceleration. As new technologies emerge and the forces of economic production advance, we feel as though everyday life itself is gathering pace. Nevertheless, in the 1990s the confluence of technological change, a prolonged economic boom, the rapid expansion and interconnection of the global economy and post-ideological ‘third-way’ politics sparked a sense that a new world was coming into being. Looking back from the present, it is impossible not to be struck by the sense of social and cultural effervescence that defined so much of the era. It was as though history’s source codes had been scrambled, the grand narratives of the twentieth century cast aside, but nothing of appropriate gravity put in their place. Into this vacuum rushed all manner of utopian dreams, millenarian fantasies, liberal triumphalisms and retro nostalgias.

Today, we are older if not necessarily wiser. We live in a post-9/11 world. We have watched as the blithe statements of American politicians about clean and quick military interventions have turned into decades-long conflicts in Afghanistan and Iraq. We have lived through the greatest financial crisis since the 1920s and the subsequent years of economic austerity. We have witnessed the return of authoritarian populism and the emergence of China as a superpower to rival the US. We have seen the digital utopianism of the early days of the internet give way to fears about surveillance capitalism, social-media-driven anxiety and attention obliteration. We have heard the warnings about global climate catastrophe go unheeded again and again.

One of the central historical debates about the 1990s hinges on whether we should regard the decade as a lost silver age (certainly not gold but better than the tin one we have now) of liberal tolerance and political consensus or as a seedbed for the civilisational discontents we face today. After much consideration and several years of diligent research, I can now finally reveal the answer: yes and no, both and neither. Yes, given the state of the world today, many of us would welcome a return to the rather bland political landscape of the 1990s. And yes again, it is possible to detect in that decade the emerging forms of some of our current discontents. But both of these positions are limited by their insistence on judging the 1990s solely in relation to where we are today. Perhaps it would be better to try to return to the 1990s some of their historical quiddity, to try to see the decade on its own terms. Even the very recent past is like a foreign country … and yet it only seems like yesterday.

2

Surfing at the End of History

WHEN SOMEONE asks what my favourite book was when I was young, I’m usually able to come up with an appropriate-sounding answer on the spot. ‘Ooh,’ I respond, ‘that’s a great question,’ while I perform a quick mental search for something that has just the right combination of intellectual credibility and juvenile plausibility. ‘It’s hard to choose just one example,’ I simper, ‘but it would probably have to be Albert Camus’s The Plague or Iris Murdoch’s The Bell.’ I know I’m on stable ground with a response like this, not only because these really were among my formative teenage reading experiences, but also because they sound so exactly like the kind of existentialist fiction that would capture an adolescent’s emerging moral imagination.

But while my answer has the ring of authenticity, it’s only a half-truth, an artful selection from the available facts, designed to showcase the precocious side of my teenage self. The truth is that I spent vastly more time as a kid watching television than reading books. I came of age as a morally self-aware human being in an atmosphere of ubiquitous television. There were moments when I got lost in books, but for the most part I spent the time when I wasn’t at school or hanging out with friends slumped in front of the TV. I was a binge watcher in the era before digital super-abundance. With only four channels available on analogue TV throughout most of my teens, I simply watched what was on, at great length. I watched American sitcoms, Australian soap operas, gritty crime dramas, sumptuous costume dramas, the lunchtime news, the Six O’Clock News, the Nine O’Clock News, investigative reporting, docudramas, classic movies, lawn bowls, horse racing, American football, those weird five-minute experimental slots that came on after Channel 4 News, rural affairs programming, game shows, more American sitcoms and, because it was the 1990s, I watched Baywatch.

Baywatch was the show of the 1990s, a high-gloss encapsulation of the Californian ideology – personal freedom, hedonistic lifestyles, softcore humanitarianism – which at its height reached a billion weekly viewers around the world. You’d struggle to find a better emblem of the age of globalisation than an ‘action drama’ shot on location at Will Rogers State Beach in Southern California that was broadcast in all of the world’s regional television markets and watched by 18 per cent of the earth’s population. There may have been a few slippages when the show’s American English was translated for foreign audiences – Alerte à Malibu! Mishmar Ha-Mifratz! – but the theme song was pure Esperanto, a joyous surge of energy and desire that was instantly comprehensible from Quito to Tehran. It was paired with what was reliably the best part of the show, the opening credit sequence, a delirious montage of pounding breakers, infinite beaches, chiselled torsos, spinning life floats, cleavage of almost geological proportions and repeated shots of lifeguards launching themselves into the surf from a variety of fast-moving watercraft. For at least a couple of minutes each week it really did seem, as the theme song assured us, that ‘it’s gonna be alright’.

Once you’d come down off the high of the credit sequence, though, it didn’t take long to realise that the actual content of the average Baywatch episode was pretty forgettable. The show was dropped by the US network TV giant NBC after its first series in 1990 due to lacklustre ratings, but continued until 2001 as a low-cost, high-volume title for the syndication market. As with so many commercial TV shows, Baywatch was designed as a kind of live-action wallpaper that would capture viewers’ attention for just long enough to guide them between what were, financially speaking, the main events: the frequent commercial breaks.

The narrative formula was simple. Each episode intertwined two main plots. Plot A usually involved some sort of action-adventure scenario. Hobie’s delinquent cousin surfs too close to the condemned San Dimas pier and has to be rescued. Partygoers on an illegal offshore casino come to blows, fall overboard and have to be rescued. An underwater photographer is attacked by what seems to be a sea monster and the crew investigate. As with many an exotically located American crime drama – Magnum, P.I., Hawaii Five-O and Miami Vice spring to mind – it’s hard to grasp why the frequency of natural disasters and criminal conspiracies hadn’t seriously undermined Malibu’s property values and driven the neighbourhood into disrepute. Plot B tended to be more personal, often revolving around an emotional issue in one or more of the crew’s private lives. Romance, parenthood, adolescence, divorce, body image, unexpected pregnancy, drug abuse, eating disorders, animal welfare – that kind of thing. This was part of the Baywatch formula: action adventure combined with a social conscience, all wrapped up in a glossy package that fatally undercut any serious intentions the show’s writers might have had.

In Britain, the show aired on Saturday afternoons around 5 p.m., in the sweet spot between whatever afternoon activity had taken place and the evening meal. I know this can’t actually have been the case, but in my memory I’m always watching Baywatch in the autumn or winter. The nights have drawn in, it’s cold outside and I’ve recently got back from some kind of bracing outdoor pursuit – school sports, perhaps, or a family hike, something that heightens the sense of warmth and comfort when you finally get back indoors. Instead of a log fire, I warm myself in front of the glowing TV screen, which transmits a diluted but still powerful simulacrum of the Southern Californian sunshine.

The novelist David Foster Wallace once compared the television set to an ‘overlit bathroom mirror before which the teenager monitors his biceps and determines his better profile’. The problem with TV, according to Foster Wallace, is that it produces false representations. It is a cracked mirror, which reflects not the world as it is, but our desires about how we’d like the world to be. If we stare into this distorted mirror for too long, we start to base our own sense of identity on the phantom projections that stare back at us. Soon, we’re in the fun house of late twentieth-century postmodern culture, where the boundary between simulation and reality is meaningless and our desires are no longer our own. ‘Who am I?’ and ‘What do I want?’ become irresolvable philosophical conundrums.

So, what exactly did I learn from watching Baywatch as a kid? How did it shape my emerging moral imagination and burgeoning sense of self? Albert Camus and Iris Murdoch’s insistence that man’s ultimate fate is to be free, that we are all faced with the stark reality of our own moral agency, has stayed with me for the long term. But what have I retained of Mitch, Hobie, Stephanie and CJ’s weekly struggles? Not a lot, if truth be told. And yet that probably misses the point of Baywatch, which was always more of an ambient than a narrative experience. Plot details unfolded, but nothing ever really happened. As you tuned in each week, it quickly became clear that whatever catastrophe befell the Baywatch community, its way of life would continue unimpeded, possibly forever. SoCal beach life seemed like one of the few steady states in the history of human civilisation, an enduring form of social organisation that could easily outlive the slings and arrows of weekly catastrophe. The viewer simply went with the flow of experience, like a surfer launching himself into the ocean’s current.

If Baywatch represents the ultimate form of self-enclosed televisual spectacle, then its polar opposite would surely be the images that were broadcast from the fall of the Berlin Wall in November 1989. This was history happening in real time, an epoch-making event fuelled by the collective bravery of ordinary people that was broadcast to billions of viewers around the world. Baywatch abounded with pseudo-events that unfolded with the unreal logic of serial television. The images from Berlin, by contrast, were palpably, authentically real. Crowds of people pouring through checkpoints. Strangers hugging each other and crying in the streets. Stony-faced border guards looking on in resignation. And the most memorable images of all: crowds of people atop the graffiti-covered wall, chipping away with hammers and chisels at a piece of infrastructure that not only marked the border between two halves of a divided country, but was also a psychic fault-line within twentieth-century history, the symbolic frontier between East and West, communism and capitalism, Them and Us. We watched the live broadcast at school as a kind of immersive history lesson, assured by our teacher that this event would shape all of our lives for years to come. A new era was being born before our eyes. Its final form was still unknown, but the old oppositions that had shaped politics for half a century were finished. The implication, of course, was that it was us in the West who were triumphant. Our side had won. We’d been right all along.

Contrary to Gil Scott-Heron’s famous line, the revolutions of 1989 were televised. The presence of TV crews on the streets of Berlin, Prague, Warsaw and Bucharest acted both as a shield against repressive police actions and as an accelerant that fanned the flames of resistance. In June of that year, the violent suppression of pro-democracy campaigners in Tiananmen Square in Beijing had played out in front of the world’s media. In East Germany a few months later, Erich Honecker’s deputies used the example of Tiananmen to dissuade their leader from using force against his own unruly citizens. The world was watching. And, more importantly, the citizens themselves were watching. After years of propaganda and surveillance, TV images of crowds massing in the streets offered glimpses of an alternate reality that had been hidden from view. They helped to break the spell of resignation and passivity that had held together a crumbling political system. In this case, television was not the distorting mirror that Foster Wallace warned against, but a powerful tool for revealing the truth about the world.

In Berlin, the trigger that finally caused the Wall to fall was a gaffe made at a televised press conference by Günter Schabowski, a prominent member of the East German Politburo. Schabowski had been wheeled out to announce a hastily drafted law that relaxed travel restrictions to the West, a move designed to end weeks of demonstrations and allow dissident troublemakers to leave the country. When he was asked by an Italian journalist when the decree would come into effect, Schabowski stumbled and said that, as far as he knew, it took effect immediately. Within the hour West German TV announced that the border was open and masses of East Germans were heading for the exits.

One of the defining moments of the Romanian Revolution, which unfolded the following month, December 1989, was the capture of the National Television Tower, a giant hulk of Brutalist concrete from which the Ceauşescu regime broadcast its propaganda. The first revolutionaries on air were Ion Caramitru and Mircea Dinescu, an actor and a poet, who declared the end of the regime and appealed for calm. Even if you don’t speak Romanian, you should watch the speech on YouTube. It is a wonderful slice of history in the making. But what is it that gives this footage such a clear stamp of authenticity? Is it the drab clothes that the revolutionaries wear – the rough shepherds’ jackets, the patterned sweaters, the pleated jeans? Is it the quality of the filmstock itself, the flat, grainy texture of poor-quality videotape that paradoxically marks these images as real? Or is it the timbre and rhythm of their voices as they deliver their hastily prepared speeches? It is possible to make out some of the key phrases in a garbled kind of way – populisce, momente, dictutora fugit, victorios! – but what comes across most clearly is the mixture of moral urgency and existential doubt with which they deliver those words, as though they can hardly believe this is really happening, can hardly believe that they are, at least for now, in the driving seat of history.

The historian Timothy Garton Ash has called the year 1989 ‘one of the best in European history’. Across Eastern Europe, popular movements deposed repressive governments and set in train the domino effect that would see the collapse of the Soviet Empire within a couple of years. But change came from above as well as below, from the party nomenklatura as well as the streets. The last days of communism were a multilateral tug-of-war between the communist old guard, enlightened elements within the party who sought to speed up the process of reform and pro-democracy dissidents who had been throwing stones from outside the party machine for decades.

There were also powerful global actors, who exerted influence from afar: the Catholic Church and its beloved Polish pope, Karol Wojtyła (John Paul II); Ronald Reagan and George H.W. Bush, successive ‘leaders of the free world’, who, after years of mounting tension, chose in the late 1980s to engage diplomatically with the USSR and move toward nuclear disarmament; and, perhaps most important of all, Mikhail Gorbachev, a new kind of affable and humane Soviet leader, who began the processes of perestroika, the restructuring of the Soviet political and economic system, and glasnost, a new openness and transparency to be achieved via the relaxation of constraints on the media. Gorbachev also formulated the Soviet Union’s new ‘Sinatra doctrine’ for the Eastern Bloc: each country could now do it its own way. In 1991, a group of ageing Soviet hardliners mounted a putsch while Gorbachev was away at his dacha on the Crimean peninsula, but the coup quickly fizzled out when the troops whose tanks encircled the Russian parliament building sided with the large crowds of protestors that gathered in the streets. This was, in the words of an American State Department official, ‘the last hurrah of the apparatchiks’. Before too long, Boris Yeltsin had climbed atop one of the tanks to rally the resistance and stake his own claim as the guardian of Russian democracy. By the end of the year the Soviet Empire was no more. On 25 December, the hammer and sickle of the Soviet flag was replaced atop the Kremlin by the horizontal white, blue and red flag of the Russian Federation.

The unprecedented speed and ease with which Soviet communism was dismantled lends the events of 1989–91 an almost magical aura. What had seemed for decades to be one of the great immovable forces in world politics exited the stage without the mass mobilisation of troops or significant bloodshed (there was fighting in the streets and numerous deaths in Romania, but the violence was relatively small scale and did not spread to neighbouring states). Unlike the events in Paris in 1789 and Moscow in 1917, the velvet revolutions of 1989 eschewed revolutionary ideology in favour of a more peaceable, humane demand for freedom from oppression. Many of the dissident leaders were drawn from the ranks of the intelligentsia, a class that had resisted Soviet rule by writing samizdat literature and embracing the ‘internal exile’ of reading and friendship. In Prague, the uprising began when students at the National School for the Dramatic Arts went on strike. In the following weeks, the Magic Lantern Theatre became the centre of operations for Václav Havel’s newly formed Civic Forum, a dissident pressure group which improvised press releases and policy statements as though they were workshopping a forthcoming play. For a brief moment, as with the National Television Tower in Bucharest, poets and artists were in charge of the signal, broadcasting their message of hope and freedom around the world.

It was in this heady atmosphere that the American political scientist Francis Fukuyama made his notorious argument about the ‘end of history’. Fukuyama originally made this bold – some might say foolhardy – claim in a specialist political science journal in the summer of 1989, a fact that only added to his intellectual glamour, as though he had somehow foreseen the events that were to unfold in Berlin later in the year. The full version of the argument was published in book form as The End of History and the Last Man in 1992, by which time his claims about the historical inevitability of the triumph of liberal democracy over all other political systems seemed even more plausible. Because that is, ultimately, what Fukuyama was proposing. As a matter of historical necessity, communism had to fail and liberal democracy had to win. The timing was up for grabs, but the outcome was inevitable. With the collapse of the Soviet Union, history was coming to a close.

Flat-footed critics mocked Fukuyama’s theory for its patent absurdity. They took him to task for projecting a future without historical events, a future in which there would be no more palace coups, popular uprisings, religious revivals or scientific discoveries. How can history end, they laughed, if time keeps rolling ever onwards? Surely something is bound to happen? But Fukuyama didn’t predict the end of historical events, the complete cessation of stuff happening. Instead, he argued that future events would unfold against a settled background of political theory. Fukuyama pointed out that over the course of history the number of different types of political organisation had been dwindling. Over the long term, humankind had been engaged in a giant collective experiment, testing out different forms of political organisation to see which most fully met its needs and desires. Hunter-gatherer bands, tribal councils, early tax-raising agrarian states, aristocracy, monarchy, theocracy: these governmental forms had all been tried on for size and rejected due to their inescapable deficiencies. After November 1989, it looked like there was only one game left in town. Fascism had been discredited by the middle of the twentieth century, and now communism was on its way to historical irrelevance. Free-market capitalism allied to liberal democracy was the framework within which history would unfold from here on. Fukuyama acknowledged that wars and politics and social movements – all of the strife and clash of billions of humans occupying the same small planet – would continue; but he insisted that the fundamental questions about political institutions had been settled, once and for all.

As it turned out, Fukuyama was an even stranger creature than most of his critics recognised. He was no run-of-the-mill Washington policy wonk, one of the standard-issue blazer-and-chino realists who could often be seen on cable news proclaiming American-style democracy to be the best available means of minimising inter-state conflict and maximising aggregate global happiness. Fukuyama was a singular being, an anti-bourgeois neoliberal who furnished a whole-cloth theory of human nature and civilisational history for the North Atlantic alliance. In the media hype that followed the publication of his theory, most commentators focused on the first part of his title, The End of History …, and debated the validity of his claim that liberal democracy would continue inexorably to spread throughout the world. Significantly less attention was paid to the second half, … and the Last Man, no doubt because it was an allusion to Hegel’s notoriously obscure philosophy of history.

In fact, Fukuyama was intervening in a long-standing debate between two of the heavyweights of nineteenth-century German political philosophy, Marx and Hegel. Of these two greats, claimed Fukuyama, it was Hegel who had got things right. The arc of history bent not towards a Marxist utopia, as the revolutionaries of 1917 had claimed; the final destination of man’s historical journey was not the triumph of the global proletariat and tractors for all. Instead, it was the bourgeois liberal nation state, a kind of permanent EU of the soul. Hegel claimed to have seen this historical destiny embodied in the all-conquering figure of Napoleon at the battle of Jena in 1806, whom he famously dubbed ‘the world spirit on horseback’. With the collapse of the USSR, the horses were back in the stable and the campaign was all but complete.

All of which makes The End of History and the Last Man not simply an expression of American triumphalism in the wake of communism’s demise, but also a rebuttal of postmodern relativism. Contrary to the post-1968 generation of continental philosophers, such as Michel Foucault and Jacques Derrida, who sought to undercut the great edifice of Western philosophy by stressing the relativity of truth and the illusion of selfhood, Fukuyama made a case for universal values. There is a single human nature, he claims. This human nature is the same across all historical periods, all geographical regions and all ethnic and cultural systems. Ethnicity and culture matter, but they are underpinned by a fundamental sameness that is rooted in our nature as human beings. And it just so happens that this universal human nature conforms almost exactly to the socio-political outlook of the United States of America. Liberal democracy and free-market economics are the rational end point of history because this is the only form of government that can satisfy the unchanging requirements of human nature. Stick that in your pipe and smoke it, Mister Fancy French Theorist.

Today, with the rise of political Islam, the success of authoritarian capitalism in China and Singapore, the spread of far-right nationalisms throughout Europe and North America and the apocalyptic threat of climate change, Fukuyama’s dream of the end of history seems like a curio from another age, an expression of liberal optimism that is almost unthinkable now. At the beginning of the 1990s, however, his theory echoed a new political orthodoxy that had been most succinctly articulated by Margaret Thatcher: ‘There is no alternative.’ In the politicised 1980s, this mantra had been not so much a descriptive statement as a performative one, which sought to create a new reality through the sheer power of naming it. Thatcher had used it as a rhetorical cudgel with which to undermine her enemies – the Labour Party, trade unions, left-wing local councils. By the early 1990s, however, TINA – the acronym lent a jovial, feminised air to the doctrine – had teeth. With the collapse of the Soviet Empire there was no actually existing alternative to capitalism. In the years immediately following the velvet revolutions, all of the former Soviet states converted to some form of market economy and representative democracy. The free-market ‘shock therapy’ that was administered to the sclerotic, centrally planned economies of former Soviet states was turbulent in the extreme, but at the time few commentators foresaw the rise of Putin and the klepto-oligarchic nationalist petro-state that Russia would later become.

On the other side of the world, China observed the events of 1989 and accelerated its own transition towards consumer capitalism. China’s market reforms had been launched as early as 1978 by Deng Xiaoping with the formation of ‘special economic zones’ that permitted foreign investment, but it wasn’t until its 14th Party Congress in 1992 that the Chinese Communist Party officially announced its commitment to what it called a ‘socialist market economy’. The great Chinese experiment in capitalism without democracy was underway. At the time, many Western observers assumed that eventually the Chinese state would be forced to introduce democratic elections to accompany its market reforms. They tend to be less sanguine today. But back in the heady days of the early 1990s it was still possible, if you squinted hard enough at the Rorschach blot of global affairs, to see a trend towards a steady state of democratic harmony.

In The End of History and the Last Man, Fukuyama eschewed partisan judgements and analysed liberal democracy in the abstract; on first principles, as it were. Beyond the confines of the printed page, though, Fukuyama placed himself firmly on the neoconservative wing of the post-1989 liberal consensus. He was associated with the influential right-wing think tank Project for the New American Century, under whose auspices he co-signed letters in 1998 and 2001 urging Presidents Clinton and Bush to invade Iraq in order to spread democracy in the Middle East. In this sense, Fukuyama was at the cutting edge of both theory and practice in the post-historical age. It was but a small step from his elegantly argued thesis to the high-minded belligerence of the Neocons: history as a rational process guided from above by targeted air strikes from American fighter planes. After the debacle of the Second Gulf War, Fukuyama retracted his support for the Neocon project, casting its acolytes as a self-appointed Leninist vanguard who thought themselves justified in using violence to do the necessary work of history. This is one of the great ironies of the late twentieth century: after the collapse of the Soviet Union, it was the free-market fundamentalists who kept alive the ancient dream of a universal historical destiny for mankind.

Fukuyama may have claimed that history ended in 1989, but he looked all the way back to Plato for his definition of human nature. There are three essential components in the Platonic soul: logos, eros and thymos. Most of us are familiar with the first two of these terms. The opposition between reason and passion is wired deep into Western culture, present everywhere from introductory philosophy lectures to popular psychology books to guiding metaphors in literature and film. Eros and logos are the Captain Kirk and Mr Spock of Western culture: diametrically opposed yet inextricably linked characters in the human story. Somewhat ironically, however, Plato’s third human component, thymos, is relatively under-represented in the general culture. Rarely outside of academic debate do we hear of the ‘thymotic drive’, let alone its variant forms of megalothymia (the desire to be recognised as superior to others) and isothymia (the desire to be recognised as everyone’s equal). It’s ironic because thymos relates to the human need for recognition, our inherent desire to be acknowledged and respected by others.

In Plato’s Republic, thymos is usually translated as ‘spiritedness’. It encompasses the emotions of pride, shame, honour and resentment that underpin ethical judgements and are the cornerstones of social life. Within Plato’s ideal republic the thymotic aspect of human nature is aligned with the military class, just as logos is associated with the political class and eros with the productive class of merchants, craftsmen and farmers. The concept echoes throughout the history of Western philosophy. Machiavelli speaks of man’s desire for ‘glory’, Hobbes of our ‘vainglory’, and Rousseau of our ‘amour-propre’. Hegel describes the spark of perversity that causes a man to fight to the death simply in order to win prestige. Thymos denotes, then, the choleric, gingerish, sprightly, up-and-at-’em part of the human soul that is at one moment so charming and the next such a pain in the ass.

As with all three of the component parts of human nature, thymos must be properly channelled and controlled in order to ensure social harmony. Unchecked it leads to domination and exploitation. The tyrant does not simply demand more power and wealth than his subjects, but also a grotesquely excessive tribute of recognition. Hence the giant palaces and bad public art erected by the likes of Joseph Stalin and Nicolae Ceauşescu to monopolise their subjects’ attention. But Fukuyama also recognises the presence of thymos in the plight of social minorities who seek recognition in the eyes of the state. The campaigns for Black, Indigenous, LGBTQ+, disabled, or women’s rights all emanate from the thymotic desire for recognition by our peers. This is why, according to Fukuyama, liberal democracy constitutes the rational end point of human history, as it is the only form of government that grants equal recognition to all citizens. Perhaps that is also why the TV images from the velvet revolutions of 1989 are so affecting, even at a distance of thousands of miles and over thirty years. When they poured into the streets and showed themselves before the TV cameras, the citizens of Berlin and Bucharest were acting in recognition of one another’s inherent dignity.

The challenge for liberal democracy, warned Fukuyama, lay in satisfying the public’s thymotic urges over the long term. After all, the forms of recognition on offer in your average democratic society tend to be of a fairly milquetoast variety. Basic human rights and the opportunity to cast a vote every four or five years constitute a decidedly minimalist conception of civic life. If this were a gym membership, it would be the silver package. The gold and platinum packages, which offer attractive perks such as mass-name recognition and boosted self-importance, are reserved for the likes of statesmen, oligarchs, artists and celebrities. This is one of the reasons why Fukuyama welcomed economic inequality as essential to the smooth functioning of liberal societies, as those individuals who exhibit megalothymic tendencies are able to win their peers’ recognition by becoming entrepreneurs and CEOs.

But what about the rest of us? How can the great mass of people in bourgeois liberal democracies express their thymotic urges? The opportunities for genuine, widescreen thymotic glory have been steadily diminishing over time. This is due to the inexorable triumph of what the sociologist Max Weber called the ‘iron cage’ of bureaucratic rationality. The history of the modern world is the history of the gradual bureaucratisation and professionalisation of what were once irregular and spontaneous activities. In ancient Rome, thymotic man might join an army to fight against the barbarian hordes. In Victorian Britain, he might help to annex a distant colony. In the golden age of aviation, he might don goggles and scarf and take to the skies in a balsa wood contraption. And in the post-historical 1990s? He went surfing. Or bungee jumping, sky diving, base jumping, zorbing, tombstoning, free climbing, deep-water soloing, hang gliding, kite surfing or any number of other extreme sports that surged in popularity during the decade. In many ways, extreme sports are the ideal outlet for humankind’s inherent need for acts of vainglory, as they combine public spectacle with the potential for death, or at least serious injury. In their most glorious moments, extreme sports stars exist outside historical time, suspended, as they fall, twirl, plunge and soar, in another mode of existence, another dimension of time that is attuned to the primal truth of man’s capacity for perverse acts of self-annihilating glory.

It was in the 1990s that extreme sports crossed over from the margins to the mainstream and became mass media spectacles. The first sign of the public appetite for extreme sports was the founding of the X Games in 1995 by the American broadcaster ESPN. Heavily sponsored by Mountain Dew, a carbonated soft drink brand owned by the mighty PepsiCo, the X Games showcased sports such as skateboarding, bungee jumping, rollerblading, sky surfing and other feats of hyper-cool daredevilry. The ‘X’ in the title was meant to refer to the extreme nature of the sports, but it was also pointed out that it could just as easily refer to the highly desirable ‘Generation X’ youth market, a demographic with money to burn but also a deep distrust of traditional advertising, which ESPN and PepsiCo sought to reach. The second was the sudden ubiquity around 1997 of Red Bull, a caffeine- and taurine-based energy drink, which was marketed via a series of air races, snowboarding competitions and water-skiing challenges. At a certain point, it became impossible to determine whether Red Bull was an energy drink with dubious health benefits or an international extreme sports competition, the Champions League of cunning stunts.

The extreme sports lifestyle came with its own sartorial style: combat trousers, retro-branded sportswear, wrap-around shades, bleached hair, ubiquitous tattoos, multiple piercings. The look was part Special Forces commando, part New York break dancer, part tribal initiate. This was the sartorial embodiment of a philosophical ideal of how to live: intensely. Extreme sports were about dialling experience up to eleven. They were about reaching ever new heights of ‘peak experience’, a series of jumps, waves, climbs and plunges that wrung the very last drops of consciousness from the otherwise jaded human sensorium. This was the only way to feel genuinely alive in the era of godless, post-ideological, high-tech, bureaucratised, late-stage consumer capitalism.

Of course, many of these sports were invented long before the 1990s. The Polynesians were surfing the South Pacific waves for centuries before Captain James Cook arrived in the 1770s on HMS Resolution and recorded the practice in his diaries. The modern European taste for mountaineering was born around the same time and evinced the same fascination with the primal and authentic in nature. Shortly after Cook watched the surfers of the Sandwich Islands, Wordsworth, Shelley and the rest of the ‘visionary crew’ were scrambling up Mount Snowdon and Mont Blanc to enjoy the sublime vistas – both outside and inside the mind – that were visible from the summit. But the version of extreme sports that was popularised in the 1990s was a much more pumped-up, adrenal, madcap affair than the contemplative nature worship of the Romantic poets.

For the origins of this intensified version of extreme sports, we need only look back to the late 1970s and the formation of the Dangerous Sports Club, a group of posh daredevils who wanted to inject some aristocratic glamour into the otherwise drab undergraduate life of Oxford University. The group’s main activity was an annual downhill ski race in St Moritz, during which club members would wear black tie, drink copious amounts of champagne and race a variety of improbable contraptions down the slopes, including a grand piano and a four-poster bed. On 1 April 1979, three club members invented bungee jumping when they leapt off the Clifton Suspension Bridge in Bristol with only a length of elasticated ship’s rope attached to their waists. David Kirke, the club’s charismatic leader, claimed that their aim was to invent a sport ‘so beautiful, so absolutely beyond bureaucracy and so totally dependent on using one’s own faculties, that it was a work of art within an infinite frame’.

But maybe in the long run this kind of ersatz glory won’t be enough to satisfy the human soul. When asked in 1923 by a New York Times reporter why he wanted to climb Mount Everest, George Mallory famously answered, ‘Because it’s there.’ He wanted to do it for the sheer, unnecessary, existential grandeur of the thing. And yet, as Wade Davis makes clear in Into the Silence, his magnificent account of Mallory’s fatal expedition, he also did it to redeem the catastrophic failures of Western civilisation that had precipitated the First World War, which he had experienced first hand as a British army officer at the Battle of the Somme. The attempt on Everest was born out of the ruins of nineteenth-century imperialism and its high-flown civilisational ideals. The aim was both individual heroism and civilisational redemption.

So why do today’s extreme sports stars do it? Is there a wider civilisational goal, a broader moral purpose, in the name of which they perform their feats of death-defying glory? Or do they do it simply because it is there, as proof of man’s radical freedom in the face of inevitable death? Maybe for some, but there is another, infinitely more depressing possibility – which is that they do it because it’s their job. What was once an acte gratuit, performed for the simple delight of being alive and free, has become part of the global extreme sports leisure economy. Don’t get me wrong. I’m not advocating a return to empire as a way to redeem the emptiness of post-historical experience. But the tricks and stunts of the X Games crowd, while undoubtedly ‘radical’ in the extreme, can start to appear somewhat lacking in purpose. After you’ve watched the fifth round of the BMX half pipe at the X Games, the whole thing can seem kind of routine.

This is precisely one of the possible post-historical futures Fukuyama foresaw for liberal democracy: that it would become unbearably dull. Eventually, society would enter a terminal state of stability and affluence, in which the thymotic part of man’s soul would dwindle almost to nothing. Many centuries after history’s first culmination, human civilisation would slide into a further dystopian end state, the last suburb of the human soul, populated by what C. S. Lewis called ‘men without chests’. The X Games will continue, but only as a purely formalist spectacle. While the crowds will still adore the heroic performers as they somersault through the air, the magic will be gone.

But Fukuyama also imagines another, diametrically opposite future for bourgeois liberal man. There exists the possibility, he claims, that one day, far into the post-historical future, the engine of history will be kick-started back into life, not because some more advanced alternative to liberal democracy has been discovered, but because the terminally bored Last Man is simply fed up and wants some action. It is possible that, over the course of time, some small constituency of the post-historical citizenry will grow tired of having their identities weakly recognised by the liberal state and will elect instead to pursue their own perverse glory at someone else’s expense. A few of the last men will seek to become first men once more and re-enter the stream of history.

This is the existential choice that faces Patrick Swayze’s demon-eyed surf god, Bodhi, in the 1991 movie Point Break. Directed by Kathryn Bigelow and developed with her then husband, James Cameron, Point Break updated the high-concept 1980s action movie for the extreme sports and alternative rock-loving 1990s youth market. Think Top Gun with surfboards, or Days of Thunder with bleached-blond hair. The film depicts a scuzzier, more intense side of Californian surf culture than the family-friendly Baywatch, a world in which meth-dealing ‘surf Nazis’ share the waves with koan-spouting hippies. It centres on the short but intense bromance between Swayze’s Bodhi and Keanu Reeves’s Special Agent Johnny Utah, a former college football star and now hotshot FBI agent, who goes undercover in Malibu’s surfing sub-culture in order to track a daring band of bank robbers, the Ex-Presidents (so named for their highly cinematic use of US president-themed face masks during bank raids).

Like Top Gun before it, Point Break positively brims with homoerotic significance. Kathryn Bigelow, a female director in the men’s worlds of movie-making and surfing, is wise to the unspoken subtexts of hyper-macho cultures. In the movie, Swayze and Reeves cement their friendship by sharing a woman. ‘What’s mine is yours,’ drawls Swayze when his protégé gets together with his old girlfriend. They also spend a large portion of the film locked in various kinds of violent embrace, as they tussle on the beach, in the city streets, or while free-falling from the sky. The crowning moment of homoerotic self-awareness comes when the identity of the bank robbers is discovered as one man, Reeves’s FBI agent, recognises the naked buttocks of another man, James LeGros’s bank robber surf dude, on a CCTV tape. What could be clearer than that?

If Baywatch is set at the end of history, a terminal beachscape where nothing ever really happens and the surf’s always up, then Point Break depicts the moment when the Last Man finally tires of his gilded cage and reaches out for something new. In Point Break, bank robbery is a logical extension of extreme sports, an intensification of the radical, anti-establishment ethos and a refinement of the qualities of courage, guile and agility that are required both for catching a gnarly tube and pulling off a daring heist. Bodhi and his gang are existential daredevils, Nietzschean super-men for the extreme sports generation, who live by their own moral law. As they barrel along the LA freeway on their way to a final heist, Bodhi schools Johnny in his system of ethics: ‘See, we exist on a higher plane, you and I. We make our own rules. Why be a servant of the law, Johnny … when you can be its master?’ With his loyal entourage of adrenaline junkies, biker outlaws and surf Nazis, Bodhi could be the leader of a paramilitary group. He’s another in a long line of dangerous white males in 1990s Hollywood cinema, which includes Michael Douglas’s angry commuter in Falling Down, Edward Norton’s angry white nationalist in American History X and Brad Pitt’s underground cult leader in Fight Club. These men are frustrated by consumerism, multiculturalism and feminism – all of the post-historical liberal virtues – and yearn instead for a bloodier, more thrilling version of history to break out all over again. In twenty years’ time, they’ll be hanging out on alt-right web platforms and voting for Trump.

The atmosphere of thymotic glory reaches its highest pitch of intensity in the film’s final scene, which takes place a year after Bodhi evaded capture during one last corpse-strewn heist. Instead of the eternal Californian sun, the scene unfolds in lashing rain as the legendary ‘fifty-year storm’ surges into Australia’s Bells Beach. As a bedraggled line of surfers make their way back up from the beach, Johnny Utah heads into the teeth of the storm. At the water’s edge, a lone figure stares into the horizon, like Ahab aboard the Pequod scanning the waters for the white whale. After a few lines of mournful dialogue – ‘I found a passport of yours in Sumatra, missed you by about a week in Fiji’ – the two elemental figures wrestle in the surf, until Special Agent Utah finally has his man in cuffs. Now we hear a helicopter overhead and police officers start pouring on to the storm-tossed beach. Bodhi’s number is up. A man can only live outside the law for so long. But at the last moment, a spark of recognition flashes across Johnny’s face as he sees the look of animal terror in Bodhi’s eyes. Bodhi begs to be granted one last wave before he is taken, one last rush before he spends the rest of his life in a cage. Johnny relents. He undoes the cuffs. Bodhi paddles out to meet his destiny in the sublime depths of the ocean. Johnny turns and walks, huddled against the lashing rain, back up towards the cliffs. As he passes, a late-arriving policeman says hopefully, ‘We’ll get him when he comes back in.’ Then the film concludes with a final line that is just as resonant, in its own way, as Humphrey Bogart’s at the end of Casablanca: ‘He’s not coming back.’

Universal human nature, the necessary arc of history, the quest for glory in the face of death – we’ve travelled a long way from events on the ground in Central Europe in 1989. It’s funny how things can get abstract so quickly, when all you set out to do is capture the mood of a particular moment in time.