Kieran Levis tells the stories of some of the most innovative businesses of recent times to explain how a few succeeded - when so many failed - in creating entirely new markets and dominating them. He shows how Amazon and Google rose from nothing to enormous heights, whilst IBM, Kodak and AOL plummeted from them; how Nokia and Sky bounced from near-bankruptcy to global leadership; and charts the incredible rise and fall and rise again of Apple. Told with clarity, wit and pace, these dramatic stories reveal what it was about a few winners that enabled them to hold onto their prizes, whilst the absence of these qualities crippled the losers.
Year of publication: 2009
WINNERS & LOSERS
Creators and Casualties of the Age of the Internet
KIERAN LEVIS
Atlantic Books
LONDON
First published in hardback and export trade paperback in Great Britain in 2009 by Atlantic Books, an imprint of Grove Atlantic Ltd.
This electronic edition published in 2009 by Atlantic Books, an imprint of Grove Atlantic Ltd.
Copyright © Kieran Levis, 2009
The moral right of Kieran Levis to be identified as the author of this work has been asserted by him in accordance with the Copyright, Designs and Patents Act 1988.
The author and publisher gratefully acknowledge the following for permission to quote from copyrighted material: The Crack-Up by F. Scott Fitzgerald, published by New Directions Publishing Corp © 1945, reprinted by permission of New Directions Publishing Corp; The Future of Management by Gary Hamel, published by Harvard Business School Press © 2007, reproduced by permission of Harvard Business School Press; 'In Front of Your Nose' from Collected Essays, Journalism and Letters of George Orwell, Vol 4 © George Orwell, reproduced by permission of Bill Hamilton as the Literary Executor of the Estate of the Late Sonia Brownell Orwell and Secker & Warburg Ltd; Capitalism, Socialism, Democracy by Joseph Schumpeter, published by Harper Brothers © 1942, reproduced by permission of Taylor and Francis.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of both the copyright owner and the above publisher of this book.
Every effort has been made to trace or contact all copyright-holders. The publishers will be pleased to make good any omissions or rectify any mistakes brought to their attention at the earliest opportunity.
A CIP catalogue record for this book is available from the British Library.
978 184887 313 1
Atlantic Books
An imprint of Grove Atlantic Ltd
Ormond House
26–27 Boswell Street
London WC1N 3JZ
www.atlantic-books.co.uk
To Angela, my ever-fixéd mark
Introduction

MARKET CREATION

1 Prodigious Partners
Apple
Sony
2 Capabilities and Vision
3 e-Merchants
Amazon
Webvan
4 Propositions and Discipline
5 Shooting Stars
Netscape
AOL
6 What It Takes
7 Network Models
eBay
Google

THE BIGGER PICTURE

8 New Markets and Networks
9 The Disruptive PC
IBM
Encyclopædia Britannica
10 Creative Destruction
11 Wireless Winners
BSkyB
Nokia
12 Missing the Big Picture
13 The Right Stuff

Postscript
Sources and Bibliography
Acknowledgements
Index
Jeff Skoll was pitching a business plan to a partner at the Mayfield Fund, one of the top venture capital firms in Silicon Valley. Unlike most dot.coms in 1997, AuctionWeb was actually making money. In fact it had margins of 85 per cent and its sales were growing at more than 30 per cent every month. But the man behind the desk was baffled. 'Let me get this right, people are going to buy and sell antiques online? I gotta go.'
A few weeks later, Jeff's partner Pierre went to see two of the partners at Benchmark Capital, who were surprised he hadn't come with a slick presentation. He couldn't even give them a demonstration, because AuctionWeb's computer system was down, yet again, but they were intrigued and wanted to know more. They had no idea how big this business would become, but they started to get it. In June 1997 Benchmark placed what turned out to be the most profitable investment ever made by a venture capital firm. For $5 million it bought 21 per cent of the shares in a business that was now called eBay. Two years later its stake was worth more than $1 billion.
Nobody got eBay at first. It wasn't cool or sexy like Amazon or Yahoo. Its users were not the digerati but the kinds of people who went to yard sales. Pierre and Jeff were even having difficulty persuading programmers to come and work at such a strange business.
eBay had only become a business by accident two years earlier, when Pierre Omidyar had been toying with several website projects. He had always thought it would be cool if there were a level playing field, where people could trade with each other on equal terms. He started AuctionWeb so that computer buffs like him could buy and sell bits of old equipment to each other, with no idea of making money from it. Anyone in what he called the community could post items for sale, and anyone else could bid for them. The first item sold was a broken laser pointer, posted by Pierre himself, for which he was amazed to receive $14.
AuctionWeb was not Pierre's most important project and he paid little attention to it until his internet service provider told him that there was too much traffic on the site, and he would have to start paying the business rate of $250 a month. So in February 1996 Pierre asked the community if they'd mind contributing a percentage of the value of each sale they made. In March he was amazed to receive hundreds of cheques adding up to $1,000. By May they amounted to $5,000, and he had to hire someone to open the envelopes and take the cheques to the bank. In June, when $10,000 arrived, Pierre decided to give up his day job, and asked his friend Jeff to become his partner.
While Pierre concentrated on making the rather wobbly computer system more reliable, Jeff wrote the business plan, but he couldn't quite believe the numbers he was coming up with. Every month, without them doing a thing, revenues kept growing. It wasn't so much computer paraphernalia now – toys and dolls were selling like hot cakes. Then antiques, stamps and coins took over. There was apparently no end to the markets where eBay could make things work better.
Although nobody realized it, even in 1997, eBay was benefiting from a combination of what economists call network effects and what engineers call positive feedback loops – the more people who used it, the more attractive it became to others. It would continue to grow exponentially for years to come, like a giant snowball rolling down a mountain, gathering more and more buyers and sellers. By 1997 the total value of sales was on course for $100 million, and eBay's revenues were $4.3 million. Ten years later total sales reached $53 billion and eBay's share was $6 billion.
If eBay did not have the air of a winner in the mid-1990s, Kodak did not look like a loser. It was the biggest photographic company in the world, with revenues of $14 billion and a healthy balance sheet. It was one of only two companies from Forbes' list of the top hundred in 1917 that was not only still there but had out-performed the stock market index. It was a leader in research and development, with one of the best-known, most trusted brands in the world.
Its founder, George Eastman, had effectively created the industry a hundred years earlier, with his invention of roll film and the Brownie camera. He had also dreamed up a marketing message that had stood the test of time: 'You press the button, we do the rest'. One of his shrewdest insights was that there might be more money to be made from the production and processing of film than from selling cameras. Kodak became the only company that was successful at doing both, though film and printing made much more money. The high levels of research required in this industry created substantial barriers to entry, there were few global players, and fat margins financed investment in new areas.
Kodak had not rested on its laurels and had been a serious innovator, most notably in digital photography and imaging. In the late 1990s it was registering a thousand new patents a year in imaging technology and distribution. As it looked with bemused delight at the role photographic images were now playing on the World Wide Web, it concluded that a new, $200 billion 'info-imaging industry' was emerging and that Kodak was at the heart of it. In its annual report for 2000 Kodak's CEO, Daniel Carp, noted that 80 billion photos had been taken in the previous year, and 100 billion prints made. Growth was particularly buoyant outside the US and Europe. 'This is a very smart time to be in the picture business,' he declared.
It was certainly an interesting time. Kodak knew that digital photography was likely to replace film eventually and expected film sales to start to decline from around 2004. In 2000, only 8 million of the 400 million amateur photographers in the world had digital cameras, and most were not comparable with the best conventional models. A gradual transition was an entirely reasonable expectation, and Kodak was as prepared for it as anyone.
The pace and extent of change, however, took everybody by surprise. Consumers discovered that digital photography was a great deal easier, and a big part of the attraction was not having to mess about with film and prints. Despite an economic downturn, sales of digital cameras in the US rose sharply – 6.9 million in 2001 and 9.4 million in 2002. In 2003 digital sales overtook those of conventional cameras, reaching 12.4 million. The same crossover occurred worldwide in 2005, when over 42 million digital cameras were sold.
Kodak's film business's sales started to dip in 2001. At first its management attributed this to the gloom following the NASDAQ crash, and September 11. Even in 2003 Mr Carp insisted that 'our traditional film business is sound, as digital imaging continues to evolve'. But revenues from film plummeted, overall earnings fell by half, and in each of the next three years the company lost money.
Kodak could not be accused of hiding its head in the sand. It eagerly embraced a 'digital transformation' strategy, coming up with a raft of new products and services aimed at businesses and consumers. It particularly targeted the booming new market for digital cameras and in 2003 sold nearly 2 million of them, achieving the second-largest share of the American market. By 2005, it was the largest supplier in the world, earning revenues of $5 billion.
But cameras had always been a low-margin business for Kodak, and this market was to prove even tougher. New entrants flooded in – consumer electronics companies, personal computer suppliers and, astonishingly, even mobile phone makers. It seemed as though almost anybody could make these things – or easily outsource their manufacture in the Far East. The basic digital camera, a novelty a few years earlier, was becoming a commodity. The biggest supplier by volume was soon Nokia, churning out more than 12 million phones that could take pictures in 2005. These were not, of course, serious cameras, but they turned out to be good enough for many consumers. Apart from the top end of the market, digital cameras were going the way of much of the consumer electronics industry – rapid commoditization and wafer-thin margins. Although the Kodak brand provided some differentiation, the company made no money at all from its new consumer digital imaging business. In 2006 it closed it down and laid off 27,000 people, 42 per cent of its workforce.
Kodak had suffered a double disaster. The main new market into which it had plunged so boldly had turned out to be 'a crappy business', in the words of its new CEO, Antonio Perez. And the old market that had provided such reliable streams of income for so many years was disappearing faster than anyone had imagined. None of this was Kodak's fault. Nothing it could have done could have prevented the switch to digital – or turned digital cameras into a business as profitable as film. In a few brief years its world had been turned upside down.
This book describes how a handful of businesses in the last thirty years were born and rose to enormous heights – and how others fell from them. None of the new businesses would have made sense either to customers or to investors much before 1980. It was not clear at first how many of them made money. Few of them actually sold products – and those who did sub-contracted their manufacture to others. Many of them provided information or services free, or at ridiculously low prices. Yet nearly all of them mushroomed from nothing to revenues of billions in just a few years.
They did this in a new economy – globally integrated, electronically networked and ferociously competitive. Thousands of new technologies emerged, millions of new businesses sprang up, but most failed to find a market or were quickly overtaken by others. Established businesses, accustomed to stability and continuity, found this brave new world distinctly uncomfortable, with unfamiliar competitors snapping at their heels and disruptive technologies and bizarre business models threatening to marginalize them. Illustrious names like IBM, AT&T, GEC and Encyclopædia Britannica came humiliatingly close to extinction. It seemed as though business life had become a brutal Darwinian struggle where no company had any security of tenure.
Yet a select few went from rags to undreamt-of riches in just a few years. In 1984 a married couple, moonlighting from their day jobs, started assembling networking equipment in their living room, and called their part-time business Cisco; for a few giddy weeks in 2000, Cisco was the most highly valued company in the world. In 1990 Nokia was an unwieldy 115-year-old Finnish conglomerate with 187 businesses, in deep financial trouble; ten years later it was the global leader of a new mobile telephone industry. Google only became a business in 1998 and earned virtually nothing for the next two years, but by 2007 it was grossing $16.6 billion.
How on earth did they do it? What was it about the winners described here that enabled them not just to define a new way of doing business but to dominate the markets they had created? What qualities did they have in common? And how did they manage to stay on top of the heap? What does it take to become the long-term leader of a new industry in today's economy? How were they able to fight off waves of challengers, when so many others were toppled?
The answers to these questions lie in the stories of some notable recent successes and failures. These show how and why a small number of very different organizations were so successful, and the unusual attributes they had in common, and why the absence of these qualities crippled those who failed. This does not pretend to be science, but these are not questions that rigorous, quantitative analysis can ever answer – it played little part in the birth of most of the businesses described here. Much of business, like much of life, is inherently unpredictable, untidy and uncertain. Managers can reduce the uncertainty but never eliminate it. In particular, as all these stories demonstrate, they can never know or control the future. All of the outstanding successes defied the conventional wisdom of experts – and enjoyed a considerable amount of luck.
Mortality, however, is unavoidable – not many businesses last more than a few years, and only a tiny number as long as a human life. In Joseph Schumpeter's sobering words, 'all successful businessmen are standing on ground that is crumbling beneath their feet.' The process of creative destruction he identified, whereby businesses are constantly challenged and eventually displaced by the innovations of others, has been going on for centuries, normally slowly and imperceptibly. In the age of the Internet the process has been speeded up and intensified. Globally integrated markets and the revolution in information and communications technology have led to vastly more innovation in new products, services and business models, but also to much more competition and disruption. Some recent financial innovations led to a crisis in 2008 that shook the global economy to its foundations. Creative destruction always produces more losers than winners, though in the long run most consumers are modest beneficiaries.
Winners and Losers aims to describe and explain, not to prescribe or predict. Most of the describing is done in the twelve detailed stories that make up the bulk of the book, most of the explaining in the chapters of ideas and arguments that follow each pair of stories.
In many languages, history and story share the same word. Human beings have been telling each other stories since prehistoric times, and sometimes learning lessons from them. We are still fascinated by Homer's heroes thousands of years after he brought them to life, because they are both extraordinary and recognizably human. Five hundred years ago Machiavelli showed how international politics really worked by describing how men like Cesare Borgia won power and exercised it. Peter Drucker and Alfred Chandler took a not dissimilar approach to describing the business corporations of the twentieth century. Their works are still widely read today because they explain so much.
The point of the stories here is to help us to understand how these markets work and, above all, why a few succeeded where so many failed. Ideas are particularly important in the new economy – knowledge and innovation are its lifeblood. All the new businesses described here started out with an idea that hardly anyone took seriously at the time, yet that was the seed from which not just a business but an entire market grew. Concepts like creative destruction, disruptive technologies and positive feedback are crucial to understanding the dynamics of these markets, but few executives are familiar with them, let alone general readers.
This book does not pretend to show business executives how to become winners themselves – there are more than enough of those already. It is intended for anyone who is interested in understanding how markets are created and disrupted, and why some winners seem to take all the prizes. Some ideas add less to our understanding than others. Catchy phrases like first mover advantage, winner takes all, content is king, get big fast and Web 2.0 contain a grain of truth, but have led some seriously astray. Change is confusing and it is tempting to clutch at soundbites for simple explanations. Unfortunately, they invariably oversimplify and encourage the belief that there is one big idea that can explain everything, that if we get one key thing right everything else will fall into place. These stories show that this is never the case – but that some patterns are almost universal.
The fundamental problem all businesses face is how to adapt to apparently sudden changes in their environment. Making sense of them means taking a broader view than most businesses or business books do. Business is not a passion-free realm, separate from the rest of human life, and the humanities can shed as much light on it as economics, mathematics and evolution.
History offers not so much clear-cut lessons as perspective. Without some knowledge of the past, it is impossible to understand the present, let alone speculate sensibly about the future. History shows us that change is constant, though not always obvious, and rarely foreseeable. Though history never really repeats itself, as Mark Twain remarked, it does often rhyme. A globally integrated economy of sorts existed in the decades leading up to 1914, and steamships, railways and the telegraph in the nineteenth century shrank distances every bit as much as the telephone, the jet plane and the Internet in the twentieth. The fact that two world wars and the Great Depression shattered that era of peace and prosperity is a grim reminder that progress is far from inevitable.
Traditional economics, with its fixation on theoretical states of equilibrium, has its limitations in explaining how markets are created and disrupted, but it does provide 'a box of tools', a framework of near certainties: if costs and prices fall, sales will increase; successful business pioneers will attract competitors; competitive markets will eventually see prices fall close to marginal cost.
It is the economic consequences of technological innovation, particularly dramatic changes in costs, which transform markets, industries and societies. The biggest single factor in the growth of a global economy over the last two centuries has been the massive fall in the cost of transporting, first, people and goods and, latterly, information. Since the flourishing of the Internet, the marginal cost of transmitting information has fallen close to zero. The significance of Moore's Law (that the number of circuits that can be packed on to a chip doubles every eighteen months) is not just that computers get ever more powerful, but that they become ever cheaper and more affordable by ever more people.
In the late 1990s some economists thought that digitization was creating a 'new economy' where old rules no longer applied. Capitalism is of course constantly mutating – that, as Schumpeter showed, is its nature. In the 1980s it went through two revolutions, one technological, one political-economic, leading to a radically different business landscape from the nationally protected, manufacturing-dominated economy of most of the twentieth century. The most striking differences are the abundance of new technologies and business models and the greater value attached to intangible assets like knowledge, brands and human capital. Less obvious is the ever larger role that networks of different kinds play in making this new business landscape more complex, more interconnected and more volatile, and how feedback loops speed up both growth and decline. eBay and Microsoft enjoy lucrative network monopolies, but most businesses find lasting competitive advantage more elusive than ever before.
To understand the transience of success, the tragic end that awaits so many businesses, and the limits to what we can really know, philosophy and literature may be our best guides. Isaiah Berlin has shown that in human affairs there are no universally applicable theories or formulae, no single right answer to complex questions. Shakespeare has more to tell us about the slings and arrows of outrageous fortune than any business book, and Homer more about hubris.
Some would say that trying to explain lasting business success is itself hubristic. Books like In Search of Excellence, Built to Last and Good to Great have been the targets of much academic scorn, partly because they relied on stories, but more tellingly because many of their winners later floundered. That, however, only goes to show that no-one is immune from the ravages of age and creative destruction, and does not make the quest futile or the conclusions entirely wrong. There is wisdom in all these books, notwithstanding Jim Collins's conviction that his research for Good to Great was scientifically rigorous and that he was discovering 'the enduring physics of great organizations'. Promises of 'timeless, universal answers that can be applied by any organization' are doomed to disappoint – if there were such answers, everyone could be a winner. Exceptional organizations have exceptional attributes and assets, the foundations of their competitive advantage. And even they cannot hold on to it for ever.
Winners and Losers aims simply to identify the qualities shared by all the market creators examined who went on to establish industry leadership that endured for more than a few years. (Most managed a decade or so.) These do not constitute a formula for lasting success, but the conditions necessary for it. External factors, notably customers and competitors, are crucial. And success, especially the lasting sort, is always relative.
For some people the problem is not so much business books as business itself, and the whole idea of winners and losers – one of the things they find so distasteful about capitalism. The term is used here mainly to describe businesses that had enormous wins or losses – in some cases both. Most of the losers described here either had reasonable hopes of hitting the jackpot or, like Kodak, had previously been long-term winners. Many of its workers were tragic losers from creative destruction – when an industry is wiped out, entire communities can find themselves on the scrapheap of history.
In a constantly changing competitive economy, nobody can be a winner for ever – incumbents are always challenged, and eventually displaced, by new forms of competition. None of the winners described here possessed all their attributes all of the time – many of them they learned, often from making mistakes, and some they forgot. Their stories show that contemporary capitalism is more diverse than its more sweeping critics might think, and that imagination, idealism and the search for excellence sometimes play a bigger role than avarice in the birth of businesses. In scarcely any of the new companies described here was financial gain the main goal of the founders – in some cases making shed-loads of money came as a delightful bonus.
There is no shortage of nasty businesses to confirm the suspicions of convinced anti-capitalists. Many companies as they get bigger put short-term financial performance above all other considerations, and rely more on locking customers in than on earning their lasting loyalty. Big companies may execute with chilling efficiency, but they frequently lack humanity and open-mindedness. Success and incumbency easily breed self-satisfaction, arrogance and hubris, but markets eventually bite back. In an economy where the skills and knowledge of employees and close relationships with customers and suppliers are critical to success, it does not pay to rest on one's laurels or to treat any stakeholder cavalierly.
The exceptional companies who establish lasting industry leadership are generally good at, among other things, nurturing human capital, cherishing customers and building mutually beneficial relationships with suppliers. The greatest single contribution to surviving the gales of creative destruction is the ability to continue to do better what the business already does well – enhancing organizational capabilities and learning new ones. This is what inspires the people who work in consistently innovative companies, and delights customers – and ultimately therefore shareholders.
Self-improvement and consideration for others do not of course guarantee success – unfortunately nothing does. But the long-term costs of complacency, of squeezing the last penny out of customers and suppliers, and the last ounce of effort out of employees, sometimes outweigh the short-term gains.
There is no overriding key to success in new markets, apart perhaps from luck. The combination of attributes required is highly unusual, which is why very few become big winners. Lasting competitive advantage and industry leadership call for an even more demanding set of qualities. A few organizations manage to master all of them, for a time at least. This book is mainly about them.
The book is organized around the stories. Each odd-numbered chapter, from 1 to 11, contains a pair of detailed stories and is followed by a generally shorter, even-numbered chapter that explains the ideas the stories highlight, and why qualities like distinctive capabilities and disciplined entrepreneurialism are critical to the success of all market creators.
The first half of the book concentrates on how new markets are born and on the success factors for those who created them. In between the stories of Apple and Sony, Amazon and Webvan, AOL and Netscape, eBay and Google, we examine, in chapters 2, 4 and 6, the eight attributes shared by all market creators.
The second half looks at the bigger picture. Chapter 8, 'New Markets and Networks', considers what makes new markets different from mature ones and how networks, both physical and virtual, make recent ones even more different. Chapter 9, 'The Disruptive PC', describes how IBM and Encyclopædia Britannica, Inc. were almost destroyed by the PC revolution and the failure of their leaders to understand the challenges they faced. Chapter 10, 'Creative Destruction', discusses the constant evolution of the business landscape and the ways in which markets are disrupted by new technologies, new forms of competition and other discontinuities. Chapter 11, 'Wireless Winners', tells the stories of two masters of creative destruction, BSkyB and Nokia. Chapter 12, 'Missing the Big Picture', considers why it is so difficult to recognize, understand and come to terms with radical change.
Finally Chapter 13, 'The Right Stuff', considers the really big question – what does it take to achieve lasting competitive advantage in a new industry. It examines the eight qualities shared by those organizations who held on to long-term industry leadership – significantly different ones from those for market creators.
Our protagonists are organizations, but they are made up of people and share several qualities with Homer's heroes – audacity, ingenuity, determination – and often hubris, wishful thinking and refusal to face uncomfortable truths. They are also remarkably diverse. The values and culture of an aggressive, hard-selling company like Dell or Rupert Murdoch's buccaneering BSkyB could scarcely be more different from those of the idealistic founders of eBay or the high-minded mathematicians at Google, intent on doing no evil and organizing the world's knowledge. Yet these four shared some very rare qualities, as did all the outright winners. These attributes, and other patterns found in many of these cases, cannot be shoehorned into anything resembling a scientific theory. There is never a single explanation for success or failure.
The criteria for selecting subjects were that they could genuinely be said to have created a new market (or tried to) or that they lost leadership to a new form of competition, and that there were several independent, reliable sources of information about them – companies' own accounts are invariably selective and bland. Successes do not lack chroniclers, but failures tend to be glossed over. Fortunately the early years of Netscape, Webvan and AOL have been almost as well documented as those of Amazon, Google and Nokia. The failures of heroes are particularly instructive, and I have rather unkindly concentrated on the unhappier episodes in the mostly illustrious histories of Apple, IBM and Sony.
Of the twelve organizations described in depth, five – Amazon, BSkyB, eBay, Google and Nokia – succeeded both in creating new markets and in establishing long-term leadership of an industry or sector. Three – Apple, IBM and Sony – were both winners and losers: each created several new markets and made themselves lasting leaders of most of them, but they also knew bitter defeats. Four – AOL, Encyclopædia Britannica, Netscape and Webvan – were ultimately losers. The stories of many other organizations are told more briefly.
Microsoft, one of the outstanding winners of the era, often through invading markets others had created, is not treated comprehensively, but plays an important supporting role in several of the company stories, and not always as the villain. There are few out-and-out villains here – none of these ventures was entirely without merit. And none of the heroes is flawless or immortal.
Most of the markets discussed took off during the 1990s, and with the benefit of hindsight we can make better judgements than were possible at the time. In a few, very recent, cases judgement needs to be particularly provisional. The stories of Google and the iPod thread in Apple's go up to about 2007, because it was in this decade that these markets took shape. The others concentrate mainly on earlier periods, where more perspective is possible.
The stories are not assessments of the current competitive positions of the companies profiled. Indeed, many of the winners are now mature, and inevitably have lost some of the qualities that made them great market creators. None of them is invulnerable to new kinds of competition or to the perils of rigid thinking, and we can expect several of them to lose leadership over the next few years. Dell appeared to have done so at the time of writing, and others have had setbacks. They will not be the only ones – that is the nature of creative destruction.
Apple and Sony are the most talented market creators of modern times and their stories make an intriguing contrast. Like Google, which resembles them in other respects, each was founded by two talented partners, passionate about technology. Apple was inspired by Sony in its early days, but many years later became a formidable competitor to it. These stories concentrate on these companies' failures, but their achievements were truly heroic.
Apple started with a friendship that became a business. The business took off like a rocket, but the friendship sadly died. The two young men who founded Apple Computer first met in 1972, when Stephen Wozniak, invariably known as Woz, was twenty, and Steven Jobs was sixteen. They had grown up in the area south of San Francisco that was to become known as Silicon Valley. Both were outsiders, with few friends or shared interests with their schoolmates and limited social skills. Woz had already dropped out of college and Jobs was to do so later.
They were fascinated by electronic circuitry, in Woz's case to the point of obsession. From an early age he had shown an extraordinary talent for designing gadgets, and a taste for juvenile practical jokes. One of their wheezes was a device for making free (i.e. illegal) phone calls, which Woz produced and Jobs sold.
Early in 1976, mainly to impress other members of the Homebrew club of amateur electronics enthusiasts, Woz designed and built the circuitry for what was to become the first Apple computer. It was a characteristically maverick solo achievement, based on a microprocessor whose chief merit was that Woz could obtain it for $20. Like all his designs it was stunningly simple and elegant, so it could be manufactured easily and cheaply, but economics were not uppermost in Woz's mind. He proudly gave away the schematics to other members of the club.
Steve Jobs turned out to be equally brilliant, but in very different ways. His genius was for superb design and for developing and selling ideas, for getting others to share his extraordinary, egotistical vision, which some later called his 'reality distortion field'. Self-taught, he would become the most brilliant, intuitive marketer of his generation. His boundless self-belief and energy bulldozed the unworldly, reclusive Woz into turning his invention into a business.
Without each of them it would never have happened. Wozniak single-handedly designed the first two Apple computers, but it was Jobs's drive, vision and hustling that pushed the business forward. They both liked to attempt the seemingly impossible. Woz delighted in leaving everything to the last minute and then hurling himself into frenzied bursts of creative activity, going without sleep for days on end. Yet he produced astonishingly stylish designs with fewer parts than anyone else had previously imagined. Jobs revelled in chivvying others into pursuing his dreams of perfection, in never taking no for an answer, not least from his partner. It took months of pressure to convince Woz to commit himself to the business and even longer before he would give up his modest day job at Hewlett Packard. His family also had reservations about his going into partnership with this bumptious, manipulative and none too scrupulous youngster.
Jobs justified himself by winning their first big order. Their original idea was to make a hundred boards at $25 each and sell them to their fellow hobbyists for $50. Much to their amazement the Byte Shop offered them $25,000 for fifty machines. This was the spur that made it all suddenly much more than just a sideline. Jobs's problem then was to persuade sceptical suppliers to give them thirty days' credit for the necessary parts, and to organize 'manufacturing'. When they were thrown out of Woz's family house, they decamped to Jobs's parents' home. Running out of time and money, he got his pregnant sister and friends to help with the assembly and, working day and night, they managed to fulfil the order on time. When Jobs proudly took the first ones to the Byte Shop, he discovered that the proprietor had expected fully assembled computers, with keyboards and software, not just boards. Somehow Jobs managed to bamboozle him into making payment in full.
This early success enabled Jobs to hire more people, but he soon had to borrow more money, and the business existed largely hand to mouth. In their first year, 1977, they only sold 150 machines. While Woz's enhancements, like a colour display that he integrated into the microprocessor, were major innovations that paved the way for a whole PC on a single board, they did not do much for immediate sales.
Jobs realized that if Apple was to survive as a business, let alone advance, it needed a new model – and to be a serious, properly financed business. If their new machine, the Apple II, was to get beyond the tiny hobbyist market, it would have to be a professionally produced and marketed product. The machine itself would need to be completely self-contained, with its own operating system, power supply and keyboard, and housed in an attractive case. As he put it subsequently, 'The real jump of the Apple II was that it was a finished product. It was the first computer that you could buy that wasn't a kit… You didn't have to be a hardware hobbyist with the Apple II. That's what the Apple was all about.'
At this stage, Jobs knew little about marketing, but he had a nose for who did. He wanted Regis McKenna's agency, which was handling a stylish advertising campaign for Intel, the creator of the microprocessor. McKenna was the top marketing strategist in the Valley and would normally not consider a tiny, start-up outfit as a client.
Undeterred, Jobs made dozens of calls until he finally got them a meeting with the great man, but when McKenna suggested that Woz avoid making a magazine article too technical, Woz told him he wasn't having any PR guy mess with his copy. Jobs managed to smooth things over, but it took a lot more phone calls to persuade McKenna to take the account.
What swung it was McKenna's conviction that the future of electronics lay in applications aimed at non-technical customers, not raw technology. He had campaigned hard within Intel for the potential of the microprocessor and a shift towards selling devices as products.
McKenna was to play a big part in Apple's early development, as was another experienced businessman he introduced to Woz and Jobs. Don Valentine was a tough venture capitalist, who had been an executive at leading semiconductor companies, and who would later fund Cisco, the maker of Internet infrastructure. Initially Valentine, like McKenna, was none too impressed by these two naive, arrogant, scruffy kids, now working out of a garage. They did not look remotely like the kind of management team VCs liked. As he put it to McKenna, 'Why did you send me these renegades from the human race?' They did not even know what a business plan was and had no strategy at all. But he felt they had something interesting and told them that he would consider investing if they brought in someone with marketing experience. After a week of Jobs making three or four calls a day, Valentine introduced him to a former colleague, Mike Markkula.
Markkula had been a marketing manager at Intel who had picked up several million dollars in founders' stock. He had retired, aged thirty-three, and was enjoying a relaxed, civilized lifestyle which he did not intend to change too much. He had, however, long been convinced that there was soon going to be a mass-market breakthrough with microprocessors. The Apple II, he decided, could be that breakthrough, and a fascinating sideline: not only would he mentor the two Steves, but he would guarantee Apple a loan, reducing the immediate need for seed capital. In return he would take a third of the equity. However, Markkula was not interested in running the company. He would be the chairman and advise on marketing. He persuaded another old friend from the semiconductor industry, Mike Scott, to become president and effective CEO.
Scott was a tough, hands-on manager of manufacturing operations, someone who made things happen. Jobs was always ambivalent towards him but knew that he was not yet ready for the top job. As Scott owned much less stock than the other three principals, the dynamics were always rather strange. Jobs, as heir apparent, felt free to get involved in any aspect of the business and nibbled constantly at Scott's authority. His main role was evangelizing and achieving distribution, which he did brilliantly, ensuring that the Apple II had what was then the best distribution system in the young PC industry. Tensions, however, were frequent.
Scott respected Jobs's vision and intellect: 'The great thing about Jobs was that you always understood where you stood with him. He never said what you wanted to hear. His positions were well thought out, and he always told you where he was coming from. It could be stressful, but the trick was not to take it personally.' However, it was also clear that 'he cannot run anything. He doesn't know how to manage people. After you get something started he causes lots of waves. He likes to fly around like a hummingbird at ninety miles an hour. He needs to be sat on.'
This was the view of most people who worked with Jobs – apart from those who simply adored or loathed him. Although Jobs's vision and charisma were inspirational, he was cordially disliked by many engineers for his arrogance, insensitivity and passing off of other people's ideas as his own. On the other hand, his intuition and business judgement were often excellent. Scott thought that the ninety-day warranty standard in the electronics industry would be good enough for personal computers. Jobs argued passionately that a year would be needed to build trust with customers. After bursting into tears, he finally won the argument.
Initially, though, Scott's biggest problem was Woz. The Apple II could sell by the thousand, but they were totally dependent on Woz to complete its designs. 'Woz was very, very creative, but it came in spurts. And he would cover himself during the in-between times. He'd never say that he wasn't making progress on a project. Instead, he'd say everything was coming along just fine, and meanwhile he would be waiting for the little light in his brain to switch on and save him.'
Woz was the idol of the technological buffs, particularly the many talented engineers who joined Apple, and created, virtually singlehanded, the first two Apple computers. He was not just a brilliant designer of elegant circuitry – he also wrote a new programming language for the Apple II. Knowing nothing about disk drives, he designed a completely new approach to the interface: ignoring the work of dozens of IBM engineers on synchronicity, Woz dispensed with the problem by simply holding data in a cache. His overall design for the Apple II, with an unheard-of 62 chips on the board, was applauded by the entire industry as an engineering masterpiece. California magazine proclaimed him 'King of the Nerds'.
The techie view of Jobs, shared to some extent by Woz and by many commentators repelled by his often appalling behaviour, was that he was a parasite on the real inventors, his main contribution to the Apple II being its elegant design and casing. With the benefit of hindsight, it is clear that his role was infinitely more important. Jobs imprinted his personality on Apple and its products like a Hollywood producer, with monumental ego and screaming tantrums to match. He gave the company its stylish, hip, rebellious image. His passion for great design and his perfectionism can be seen in virtually every Apple product in which he had a hand. He was obsessed with how customers would see the product and feel about it. The engineers who did more of the supposedly real work resented his appropriation of their creations, but there seems little doubt that, in his irritating gadfly way, he inspired Apple to produce 'insanely great' products.
McKenna carefully burnished the image of Jobs as the personification of Apple and created a climate of approval in the press and among opinion-formers. Apple's audience became young professionals more interested in how easy the computers were to use than their technical wizardry. They loved Apple's coolness and identified with Jobs.
However, the person who actually built the organization, created the administrative, manufacturing and financial infrastructure, and ensured that products got out of the door was Mike Scott, who imposed a modicum of order on Apple's often chaotic culture. Valentine, never lavish in his praise, called it one of the best pieces of execution he'd ever witnessed.
Launched in 1977, the Apple II was the company's greatest commercial success until the arrival of the iPod more than twenty years later. By 1978 it had become the leading personal computer in a market until then made up mostly of hobbyists. In its first year Apple sold 8,800, in the next, 35,000, and by 1980, 70,000, taking Apple's revenues to $118 million. Eventually it sold more than 2 million and was Apple's cash cow for most of the 1980s.
Its take-off was considerably helped by the appearance in 1979 of VisiCalc, the first spreadsheet program, produced by a small software developer. This gave the first, tentative business customers a reason for buying one of these strange new toys. Spreadsheets were not just aids to preparing budgets and business plans, but dynamic models that could almost instantly show the effect of changes which previously would have taken hours of laborious calculations. VisiCalc was also the first serious personal computer software designed for non-engineers. There were soon hundreds of independent software developers producing programs to complement the Apple II.
Another organization was to have an even more decisive influence on Apple's future – and on how we all use computers. In 1979 Jobs visited Xerox's legendary research centre, PARC, and what he saw there 'blew his mind'. PARC had developed some concepts for personal computing – a 'graphical user interface', icons and the mouse – that we now all take for granted. These were far too strange for Xerox's management to pursue, but Jobs saw at once that they could transform how people used computers. Apple signed a licensing agreement with PARC and recruited several of its people.
The concepts were first incorporated into Apple's most ambitious new project, the Lisa, which Jobs championed personally. No fewer than three rival projects began in 1979, when Apple spent $7 million on research and development, bumping this up to $21 million in 1980, and $38 million in 1981. Jobs, however, had eyes only for Lisa, and loudly disparaged other teams, notably the Apple II team.
The Apple III was rushed out hurriedly in 1980. Intended as the successor to the Apple II, it was severely limited by the need to use the same Motorola processor and to be able to run all the software used on Apple II. There were big conflicts between marketing and engineering and major compromises on quality and reliability. It was Apple's first failure.
Fortunately, it attracted comparatively little public attention. The triumphant success of the Apple II and the enormous media interest in the young company led directly to an early stock market flotation. Valentine had previously organized some private placements of capital and sold his own stake for a large profit in August 1980. The Initial Public Offering (IPO) in December 1980 was the most oversubscribed in twenty years. In August the stock had been valued at $5.44, but on 12 December it opened at $22 and closed the day at $29. The company, which had only been incorporated three years earlier, was valued at more than a billion dollars. Over a hundred of the now 1,000 employees became millionaires overnight.
The excitement leading up to the IPO, and the euphoria after it, went to many people's heads and contributed directly to some of Apple's subsequent misfortunes. Its timing had dictated the rushed launch of the Apple III earlier in 1980. It was quickly followed by some disastrous decisions, and the loss of Woz and Scott.
Most of the new millionaires satisfied themselves with buying a Mercedes or a Porsche, but Woz was one of those who decided on a plane. In February 1981, before he had obtained his full pilot's licence, he crashed it and was in a coma for a week. He did not come back to work for nearly three years and never produced another computer for Apple.
It was also clear by now that the Apple III had been a disaster and Scott decided that heads must roll. He was probably right to conclude that Apple had acquired too many people in its headlong expansion and had got into sloppy ways, but his handling of the purge grated badly. He first demanded eighty dismissals, including the head of engineering. His colleagues baulked at this but finally agreed on forty. Black Wednesday caused almost universal outrage – Apple was not supposed to be that kind of company and Scott became intensely unpopular. A month later, while he was away on a long weekend break, Markkula, Jobs and other senior managers took what was probably a worse decision – to fire Scott himself. Markkula took over as nominal CEO, but much of the power was now with the 25-year-old chairman. The children had revolted against the grown-ups and turned their backs on disciplined management.
This kind of coup was to become a pattern at Apple for the next sixteen years. Every subsequent leader, from Jobs himself in 1985 to Amelio in 1997, was ousted in similar fashion following plots by their closest colleagues. Most of the firings were justified, but they never produced better leaders. On each occasion Apple failed to find a truly professional CEO with a coherent strategy for the company and the ability to get it to pull together. Scott may not have been the man to lead Apple to the next stage but neither were any of his successors until the return of the older, wiser Jobs in 1997.
Many years later, McKenna acknowledged that the defenestration of Scott had been a terrible mistake:
Mike Scott was a tough and demanding boss. But he was also intent on putting a systematic decision-making process into place at Apple. And Apple got rid of him. Looking back, he was Apple's last chance to institute some kind of order. After that, the culture became so overwhelming that even the toughest manager would come in to shake things up – and instead find himself two months later lounging on a beanbag… The mistake everyone makes is assuming that Apple is a real company. But it is not. It never has been.
Markkula and Jobs, however, were so pumped up by the IPO and all the adulation the company was receiving that they took another terrible decision. They decided that all these third-party developers were parasites making money out of Apple's brilliant innovations. Instead of recognizing that the relationship was genuinely symbiotic, they decided to discourage the parasites. Apple would from now on try to produce everything it could itself, starting with its own spreadsheet program, and moving on to disk drives and keyboards.
Fortunately Apple was not consistent about this, and later actively recruited developers for the Mac, but it gradually lost much of the army of independent developers that had played such a big part in making the Apple II so attractive to customers and was one of its most valuable assets. (A much larger army would soon do the same for the PC.) Trying to do everything in-house added yet more to Apple's rising costs, since it could never be the most efficient producer of every component. Most dangerously of all, it reinforced the belief that nobody outside the company could do anything better than Apple.
For its first four years, Apple had faced little serious competition. All the other personal computer makers made machines that for non-technical professional users were not real alternatives to the Apple II. This brief and entirely atypical period fostered the dangerous illusion of invincibility.
The arrival of IBM in the PC market in 1981, with its enormous financial and marketing muscle, changed everything. Apple cheekily ran a full-page ad in the Wall Street Journal saying, 'Welcome IBM. Seriously.' Privately it scoffed at the ordinariness of the new PC. Bill Gates was visiting Apple when the PC was announced and remarked that 'they didn't seem to care. It took them a year to realize what had happened.'
It was IBM who opened up the corporate market. It sold 240,000 PCs in 1982, and in 1983, with the more powerful XT in its portfolio, 800,000. This meant it had displaced Apple as the leader, though in reality the two companies were addressing different markets. IBM had the best sales force, the greatest reputation, the closest relationships with its customers of any company in any industry, not just computing. It was at the peak of its powers – nobody then foresaw its abrupt decline a few years later. All that Apple had were some nice products and early adopter customers. Its zaniness and anti-corporate attitudes, while good for its image among the young at heart, positively repelled organizations dubious about the very idea of purchasing PCs at all. IBM represented security and reliability.
For Apple, IBM represented the bad old corporate world that it was going to topple. As its own sales topped a billion dollars in 1983 and profits reached $79 million, it was still riding on the crest of a wave. It had not the slightest doubt that its brilliantly designed products would prevail over dull mediocrity and conformism and that its main weapon against Big Brother would be the flagship product it had been working on since 1979, the Lisa.
The Lisa was undoubtedly an immensely innovative machine, the precursor to the powerful workstations that Silicon Graphics and Apollo were soon to produce – but for engineers, not businessmen. The problem with the Lisa was that it was like a Ferrari in a market of Model-T Fords. Computer buffs and aesthetes loved it, but for corporations it was too unusual and, at a minimum price of $10,000, far too expensive. It flopped badly and Apple's profits in the quarter to September 1983 slumped from $25 million to $5 million.
Fortunately it seemed that a new champion was at hand, and one that was now Jobs's own baby.
The Mac had initially been developed on the fringes of Apple. Its original creator was Jef Raskin, another technical genius, whom even Woz acknowledged as a peer. Raskin had long dreamed of a computer as easy to use and as inexpensive as a Swiss Army penknife, a People's Computer. Sadly for Raskin, Jobs, who had previously disdained the Mac, became a convert after he had been pushed off the Lisa team. The Mac, he realized, could be the Apple II of the 1980s, a 'friendly' computer. He worked his way into the development team and soon worked Raskin out.
His vision for the Mac was more ambitious and owed not a little to his earlier dreams for the Lisa. The Mac 'would make a dent in the universe' and be acclaimed as a superb piece of design. However egotistically Jobs may have behaved, there is no doubt that it was his leadership that made the Mac the product it became, more powerful than Raskin had intended (though not enough) and incorporating the first mouse. His silly slogan, 'it's better to be a pirate than join the navy', struck a chord with most of the team and working on the Mac was the most intense, traumatic and exhilarating experience most of them ever had. They thought of themselves, with some justification, as artists. John Sculley remarked later, 'It was almost as if there were magnetic fields, some spiritual force, mesmerizing people. Their eyes were just dazed. Excitement showed on everyone's face. It was nearly a cult experience.'
The Macintosh was a truly outstanding creation, worthy of its appearance in the Museum of Modern Art. It was announced with one of the most arresting television commercials ever seen, made by Ridley Scott, the director of Blade Runner. Shown only once, in January 1984 in the middle of the Super Bowl, it depicted a thinly disguised IBM as the soulless, totalitarian Big Brother of Orwell's novel, proclaiming 'the unification of thought' to the massed ranks of grey, sleepwalking prisoners. Suddenly a dashing young woman runs through their ranks, throws a hammer at the screen and smashes it. 'On January 24, Apple Computer will introduce Macintosh. 1984 won't be like 1984.' Quite what this meant was not clear to everyone who saw the ad, but it certainly caught their attention, was replayed on other programmes, and seen by 43 million people.
Two days later at the official launch, Jobs added his own dramatic flourishes:
It is now 1984. It appears that IBM wants it all. Dealers, initially welcoming IBM with open arms, now fear an IBM-dominated and controlled future. They are increasingly turning back to Apple as the only force that can ensure their future freedom.
Will Big Blue dominate the entire computer industry, the entire information age? Was George Orwell right?
To massed shouts of 'No', he unveiled the Mac.
The Macintosh was completely different from any PC that most people had ever seen. It embodied ideas that visionaries like Douglas Engelbart had expressed in the 1960s for a machine 'for the augmentation of man's intellect', complete with the now familiar mouse and windows. The Mac was the first real product inspired by the prototypes that the engineers at Xerox's PARC centre had been developing for years. Its graphical operating system was breathtakingly intuitive and user-friendly. It was to be another eleven years before ordinary PCs had a desktop interface that was comparable, when Windows 95 finally arrived, incorporating many features from the Mac operating system. Everyone who played with the Mac fell in love with it – including Bill Gates.
Unfortunately, not nearly enough people bought it. Sales for the year were 250,000, only half of the target Jobs had boldly set, and a sixth of the number of PCs IBM sold in 1984. For the corporate market, the Mac was too expensive, initially not powerful enough, with woefully inadequate memory, no hard disk and a feeble software library. Most of these flaws had been pointed out before, but Jobs was adamant that it was ready. As his lieutenant, Bill Atkinson, acknowledged later, 'In our efforts to change the world we were a little arrogant and unwilling to listen.'
Over the next two years, the problems were gradually fixed and subsequent versions of the Mac attracted passionate devotion. However, the cult that developed around it pointed to a strategic weakness for a product aimed at knowledge workers: it was simply too different, too quirky to appeal to a mass market quickly. It was a great product for early adopter consumers who could afford it and for niches like desktop publishing and education, but not remotely appropriate for taking on IBM and winning corporate customers. It seemed to IT managers more like a toy than a real man's machine.
Apple took a long time to recognize both this and the fact that selling to businesses was about a lot more than having great products. By 1984 there were more than a hundred manufacturers of clone PCs, most of them slashing their prices every few months, as Moore's Law kicked in. The vast majority never made a profit and quickly disappeared. Soon, not only was IBM itself to be eclipsed by Compaq, Hitachi and the rest of the pack, but Apple was up against a vast new, low-cost industry. Its chief adversary, however, became the owner of the operating system on all these new machines, Microsoft. The ever-growing number of people and organizations using first DOS, then its successor, Windows, and the number of developers producing software for the new standard, represented a tide of network effects that became irresistible.
