Be prepared for the arrival of automated decision making

Once thought of as science fiction, cognitive systems are already being used by major corporations to assist in providing wealth advice and in medical treatment. The use of cognitive analytics and artificial intelligence (AI) systems is set to accelerate, with the expectation that they will be considered 'mainstream' within the next 5–10 years. They will change the way we as individuals interact with data and systems, and the way we run our businesses.

This book prepares business users for the era of cognitive analytics and artificial intelligence. Building on established texts and commentary, it specifically prepares you in terms of expectations, the impact on personal roles, and responsibilities. It focuses on the specific impact on key industries (retail, financial services, utilities and media) and also on key professions (such as accounting, operational management, supply chain and risk management).

* Shows you how users interact with the system in natural language
* Explains how cognitive analytics/AI can source 'big data'
* Provides a road map for implementation
* Gets you up to speed now, before you get left behind

If you're a decision maker or budget holder within the corporate context, this invaluable book helps you gain an advantage from the deployment of cognitive analytics tools.
Cover
Title Page
Acknowledgements
Preamble: Wellington and Waterloo
Introduction
Notes
Prologue: What Do We Mean by Work?
Summary
Introduction
Slavery or Freedom?
The Rise of Industrialisation
Gen Z and the Flat White Society
The Impact of Unemployment
Replacing the Need to Work
Conclusion
Notes
CHAPTER 1: Introduction to Analytics
Summary
Introduction
Business Intelligence
Advanced Analytics
Prescriptive Analytics
Business Rules
Cognitive Analytics
The Accuracy of Analytical Outputs
Conclusion
Notes
CHAPTER 2: Artificial Intelligence
Summary
Introduction
The Turing Test
The Dartmouth Event
Post-Dartmouth, the AI Winter, and Singularity
Springtime for AI?
How Does AI Work?
Can Computers Be Creative?
Conclusion
Notes
CHAPTER 3: The Impact of AI on Leading-Edge Industries
Summary
Introduction
Financial Services
Automobiles
Media, Entertainment, and Telecom
Retail
Conclusion
Notes
CHAPTER 4: The Impact of AI on Second-Mover Industries
Summary
Introduction
Construction
Utilities
Public Services
Agriculture
Technology Industry
Conclusion
Notes
CHAPTER 5: The Impact of AI on Professions
Summary
Introduction
Work and Professions
The Importance of Competences
The Moravec Paradox and Why It Threatens Professionals
Management
Office of Finance
Legal Profession
Sales and Marketing
Retailers
Commercial Media
Transportation
Engineers and the Built Environment
Medical Profession
Data Centres
Entrepreneurs
Conclusion
Notes
CHAPTER 6: Risk and Regulation
Summary
Introduction
What Is Risk?
Technology and System Failures
Data Security and Privacy
Employee Error and Fraud
Inadequate or Failed Procedures, Systems, and Policies
Reputational Risk
External Risk
Financial Risk
AI and the Future of Compliance
Roles, RegTech, and Forgiving the Machine
Conclusion
Notes
CHAPTER 7: Implementation Road Maps
Summary
Introduction
New Thinking on Employee Training
Robotics and Process Automation
Implementation Frameworks
Is Big Bang Transformation Possible?
Conclusion
Notes
CHAPTER 8: New Business Models
Summary
Introduction
Augment or Automate?
Issues of Place and Time
Contextual Insight
Wordplay and Communication
New Business Models for New Markets
Conclusion
Notes
CHAPTER 9: Coping with the Future
Summary
Introduction
Existing Roles in AI
Future Roles in AI
AI Education
Personal Capabilities for Success
Can Computers Innovate?
Living with Robots
Elderly Healthcare and Robots
Taking Instructions and Advice from Computers
Rules for Robots
Conclusion
Notes
CHAPTER 10: Strategies for Personal Reinvention
Summary
Introduction
The Need for Personal Reinvention
How Easy Is It to Change?
The Importance of Events and Conferences
The Freedom of Franchises – from Employee to Owner
Can We Cope with Doing Nothing?
Third-Age Thinking
Conclusion
Notes
APPENDIX A: Implementation Flowcharts
APPENDIX B: Jobs Most Affected by Artificial Intelligence
APPENDIX C: List of Professional AI Organisations
APPENDIX D: List of Tables
APPENDIX E: List of Figures
Index
End User License Agreement
Prologue
TABLE 1 Murray's table of needs
TABLE 2 Religious belief by generational cohort
TABLE 3 Founder age of US $1 billion VC-backed private companies
Chapter 1
TABLE 4 Typical capabilities used in advanced analytics
TABLE 5 Uses of advanced analytics
TABLE 6 Key skills and capabilities of BPM practitioners
Chapter 3
TABLE 7 The Luddites200 Organising Forum's objections to technological change
TABLE 8 The five-stage process of implementing AI in autos
TABLE 9 AI extract, Gartner Hype Cycle report
Chapter 4
TABLE 10 2015 market for commercial drones in US$ billion
TABLE 11 Costs to Britain of workplace injury and new cases of work-related ill health by industry, 2015/16
Chapter 5
TABLE 12 Core competences in the workplace
TABLE 13 The new role of the manager
TABLE 14 The new role of the CFO
TABLE 15 The new role of the lawyer
TABLE 16 The new role of sales
TABLE 17 The new role of marketing
TABLE 18 The new role of the retailer
TABLE 19 The new role of the creative artist
TABLE 20 The new role of the publisher
TABLE 21 Future competences for pilots
TABLE 22 Future competences for building engineers
TABLE 23 Future competences for building planners
TABLE 24 Future competences for general practitioners
TABLE 25 Future competences for dentists
TABLE 26 Future competences for entrepreneurs
Chapter 7
TABLE 27 Comparison between unattended and attended automation
TABLE 28 Pros and cons of big bang implementation
Chapter 8
TABLE 29 Components of an intelligent business model
TABLE 30 Key drivers of economic growth in India, Brazil, and China
Chapter 9
TABLE 31 AI: reasons to be hopeful, or not
Chapter 10
TABLE 32 Size of self-help marketplace
Prologue
FIGURE 1 Maslow for a new age.
Chapter 1
FIGURE 2 The road to artificial intelligence.
Chapter 2
FIGURE 3 The AI ecosystem.
Chapter 3
FIGURE 4 Segmentation of the banking industry.
FIGURE 5 Segmentation of the insurance industry.
Chapter 4
FIGURE 6 Intelligent and integrated construction industry.
Chapter 5
FIGURE 7 Defragmentation of publishing.
Chapter 8
FIGURE 8 Intelligent business models.
Chapter 10
FIGURE 9 Maslow for the Third Age.
List of Flowcharts
FLOWCHART 1 Implementation road map.
FLOWCHART 2 Are you worried about the future?
FLOWCHART 3 Personal reinvention.
FLOWCHART 4 Applying your skills to industry.
FLOWCHART 5 Implementing an AI project.
FLOWCHART 6 Managing risk in an AI implementation.
FLOWCHART 7 Building an AI implementation team.
FLOWCHART 8 Managing the benefits.
FLOWCHART 9 Impact on professions.
FLOWCHART 10 Are you a member of a professional organisation?
Founded in 1807, John Wiley & Sons is the oldest independent publishing company in the United States. With offices in North America, Europe, Australia and Asia, Wiley is globally committed to developing and marketing print and electronic products and services for our customers' professional and personal knowledge and understanding.
The Wiley Finance series contains books written specifically for finance and investment professionals as well as sophisticated individual investors and their financial advisors. Book topics range from portfolio management to e-commerce, risk management, financial engineering, valuation and financial instrument analysis, as well as much more.
For a list of available titles, visit our website at www.WileyFinance.com.
TONY BOOBIER
This edition first published 2018
© 2018 John Wiley & Sons, Ltd
Registered office
John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, United Kingdom
For details of our global editorial offices, for customer services and for information about how to apply for permission to reuse the copyright material in this book please see our website at www.wiley.com.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by the UK Copyright, Designs and Patents Act 1988, without the prior permission of the publisher.
Wiley publishes in a variety of print and electronic formats and by print-on-demand. Some material included with standard print versions of this book may not be included in e-books or in print-on-demand. If this book refers to media such as a CD or DVD that is not included in the version you purchased, you may download this material at http://booksupport.wiley.com. For more information about Wiley products, visit www.wiley.com.
Designations used by companies to distinguish their products are often claimed as trademarks. All brand names and product names used in this book are trade names, service marks, trademarks or registered trademarks of their respective owners. The publisher is not associated with any product or vendor mentioned in this book.
Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. It is sold on the understanding that the publisher is not engaged in rendering professional services and neither the publisher nor the author shall be liable for damages arising herefrom. If professional advice or other expert assistance is required, the services of a competent professional should be sought.
Library of Congress Cataloging-in-Publication Data
Names: Boobier, Tony, 1956– author.
Title: Advanced analytics and AI : impact, implementation, and the future of work / by Tony Boobier.
Description: Chichester, West Sussex, United Kingdom : John Wiley & Sons, 2018. | Series: Wiley finance series | Includes bibliographical references and index. |
Identifiers: LCCN 2018003398 (print) | LCCN 2018005453 (ebook) | ISBN 9781119390923 (pdf) | ISBN 9781119390930 (epub) | ISBN 9781119390305 (cloth)
Subjects: LCSH: Management—Statistical methods. | Artificial intelligence—Industrial applications.
Classification: LCC HD30.215 (ebook) | LCC HD30.215 .B66 2018 (print) | DDC 658.0072/7—dc23
LC record available at https://lccn.loc.gov/2018003398
Cover Design: Wiley
Cover Images: blurred people © blurAZ/Shutterstock;
hand touch © whiteMocca/Shutterstock;
hand touch © monsitj/iStock
I owe an enormous debt of gratitude to family, friends, colleagues, acquaintances and even strangers who were willing to share their views over the past few years on this most complex and interesting of subjects. It seems everyone has a point of view, which is a good thing.
Thanks also to the staff of Wiley who have produced this book, especially to Thomas Hykiel as the original commissioning editor and subsequently Gemma Valler who brought this project to a conclusion.
I'm especially grateful to my wife Michelle not only for her support but also for her observations and advice, leaving me in no doubt as to the meaning of ‘better half’.
This book is especially written for my grandchildren who will live with the consequences of all these changes.
Let's start with a true story about the Battle of Waterloo, which was fought on Sunday 18 June 1815.
Facing each other were the French emperor Napoleon Bonaparte, who for more than a decade had dominated European and global affairs, and Arthur Wellesley, the Duke of Wellington, who had made his military name during the Peninsular Campaign of the Napoleonic Wars and ultimately rose to become one of Britain's leading statesmen and politicians.
Waterloo is located about 15 kilometres south of Brussels. On that day, the French army comprised about 69,000 men and faced an opposing force of about 67,000 troops, although this number was to swell to over 100,000 with the arrival of Prussian allies before the end of the day. By nightfall, Wellington emerged as the victor, but nearly 50,000 from both sides were dead or wounded. According to Wellington, this was ‘the nearest-run thing you ever saw in your life'.
There are many explanations for his success. One that resonates is the evidence that he had been in the area during the summer of 1814, having taken a wide diversion from his route from London to Paris, where he was to take up his new role as British ambassador to the court of Louis XVIII. Rather than taking the more direct route from Dover to Calais, he sailed on HMS Griffon to the Belgian port of Bergen op Zoom, accompanied by 'Slender Billy', the 23-year-old Prince William.
He spent two weeks touring the Lowlands, and the valley south of Brussels seemingly caught his attention. There's a suggestion that he stayed at the inn La Belle Alliance, a location that was to play a part in the eventual battle.
At that time there was no hint on the horizon that he would ever fight his old adversary Napoleon, and perhaps his visit was simply the old habit of a retired soldier. During the battle he was so aware of the terrain that he was able to deploy his troops to the greatest effect. During the fighting he took care to allocate particular regiments to protect key defence points, such as Hougoumont. Without these insights, some argue that Wellington's success would have been uncertain.
Two hundred years later, perhaps there is still a lesson to be learned from this encounter.
Whilst we shouldn't think of the introduction of AI to business as a battle, there are definitely significant challenges ahead, and how well we humans respond to that environment will depend significantly on how prepared we are. As it was for Wellington, understanding the terrain may not be enough in itself, but it will provide a useful indicator of what might happen and what we should do about it.
This book can't provide all the answers, or even all the questions. Perhaps, at best, all it will give us is some sort of compass in a sea of data and analytics that will provide guidance as to how the world of work will evolve. But in uncertain oceans, isn't a compass still useful?
It seems that almost every time we pick up a newspaper or read an online article, there is some reference to AI. It's difficult not to reflect on how it may – or may not – change the way we live and how we will work in the future. As we read the articles, we can become either excited or confused (or perhaps both) about what AI really means, why it's happening, and what will be the consequence.
The articles tend to be either quirky or technical. On the one hand, they suggest how AI can help choose the best and quickest route, keep the elderly from feeling alone, and assist with the best retail choice. On the other hand, technical articles also imply that beneath the covers are numerous algorithms of a complexity that normally gifted humans cannot possibly understand – and that this topic is best left to expert academics and mathematicians with deep statistical insights.
These experts seem at face value to be the people whom we will have to trust to create some sort of compass or road map for all our futures, yet how much do they understand your world or your work?
AI is much more than a means of providing clever satellite navigation or a novelty tool for aiding personal decisions. It is a concept that potentially goes right to the core of how we will work and even how we will exist in the future. As individuals, we should not only feel that we have the right to know more about what it actually involves, but also that we should become contributors to the discussion. Through greater understanding we become more empowered to enter into the debate about the future, rather than leaving it to others. But beyond simple empowerment, don't we also have a duty to become part of the discussion about our future – that is, your future?
This isn't the first book about AI and certainly won't be the last. But readers who don't have deep technical, academic qualifications or experience in computer science or advanced mathematics increasingly need to understand what is actually going on, how it will affect them going forward, how best to prepare, and what they can do about it.
It's important to be realistic about the time frame involved. It wouldn't be to anyone's benefit to worry unduly today about a technology that won't be in full implementation for another quarter or half a century, but many suspect it will happen much sooner than that. In many places there is evidence of it already beginning to happen. Industries, professions, and individuals need to be prepared, or to start to become prepared.
A recent paper, 'Future Progress in Artificial Intelligence: A Survey of Expert Opinion', surveyed 550 experts on the likely timescale for the development of AI.
In the paper, 11% of the group of eminent scientists surveyed said that we will understand the architecture of the brain sufficiently to create a machine simulation of human thought within 10 years. Of these, 5% suggested that machines will also be able to simulate learning and every other aspect of human intelligence within 10 years. They also predicted that a machine could reach the same level of understanding and capability as a Nobel Prize-winning researcher by 2045.
Of that group, even the most conservative thinkers indicated that they believe there is a ‘one-in-two’ chance that high level AI ‘will be developed around 2040–2050, rising to a nine-in-ten chance by 2075’.1 Who can really be sure?
It's impossible to make predictions about timing with certainty. Some people might have doubts about implementation timelines proposed by academic experts. On the other hand, businesses that operate in demanding and cutthroat climates are continually looking for competitive advantage, which invariably comes from appropriate technological advances. The drive for competitive advantage, most probably through cost cutting, will force the development timetable. Doing so effectively requires business practitioners to better understand technology, and technologists to have a greater grasp of business pains and opportunities.
As market conditions increasingly accelerate the pace of change, there is a real possibility – or more like a probability – that some professions within certain industries will be using some forms of AI within the next 10 years; that is, by the mid-2020s. Whilst many organisations remain obliged to manage their progress in terms of a series of short-term goals, in strategic terms this date is just around the proverbial corner, and they need to start working towards it now.
Even if the more conservative, longer-term view (that we will not see AI until 2040) is taken, the shift to AI will almost certainly occur within the lifespan of the careers of graduates and interns joining industry today. In their book The Future of the Professions, Richard and Daniel Susskind make the case that professionals (especially those between the ages of 25 and 40) need a better understanding of the potential paradigm shift arising from the influence of technology on the way they work, suggesting that 'professions will be damaged incrementally'.2
This is not an issue that will only affect individuals working at that time. Those still working today, who will have finished their full- or part-time employment within a decade, will find their daily personal affairs being increasingly influenced by AI in terms of services provided to them.
The issue therefore may not be what and when, but rather how. The problem may not be one of crystallising what we mean by AI, or of conceptualising what we can do with it, but rather of how it can be effectively and sensibly deployed.
Some of these same issues have already occurred due to the adoption of advanced analytics (i.e. predictive and prescriptive analytics), so we will attempt to consider the question of implementation from a practical point of view. Although the implementation time frame of one decade or even three is not absolute, this book makes the brave assumption that AI in the form of advanced analytics will eventually be with us in one form or another. Regardless of the period of time involved, the book proposes that there are a series of incremental building blocks and an optimum implementation route that should be followed. If organisations are to take advantage of AI within a single decade, then the journey to change needs to start immediately.
Some industries are more likely to be affected by AI than others: those that involve much repetitive decision-making, have extensive back-office functions, or are not specifically customer facing are particularly suited to AI implementation. They will respond and implement at different speeds but changes as a result of AI will lead to an environment of knowledge sharing. It is entirely feasible that we will see the sharing and cloning of complementary technologies used in quite diverse markets, such as consumer goods, retail, financial services, medicine, and manufacturing. Effective transfer of technologies and capabilities from one industry to another may ultimately become one of the most critical types of innovation going forward.
Manufacturing will increasingly and rapidly embrace robotics driven by super-advanced, or cognitive, analytics. But to what degree should specialist professionals, such as dentists, surgeons, and publishers, or even many parts of the creative-arts sector, feel threatened?
There will also be immense cultural issues for the workforce to cope with. To what degree will our traditional understanding of the meaning of work change? The book will consider who will suffer (or benefit) the most. Will it be the blue-collar workers, whose role will become partly or fully automated? Will it be knowledge workers, who find that their most valuable personal commodity – knowledge – has become devalued and replaced by super search engines operating in natural language? Alternatively, will it be the business leader, whose authority, based on experience and judgement, will be undermined by systems offering viewpoints on the probability of success of any given decision?
In any event, how will business leaders even be able to lead unless they have personal experience? The very nature of leadership will need to change, and we will look at that as well. What can any – or all – of these groups do to prepare themselves?
Location may also be a key driver for change. In some growing markets, such as Asia and Latin America, new AI technologies could become the first resort for providing services where there has been a massive existing or potential market unsupported by adequate professional talent. The consequence of this could be that relatively immature marketplaces could start to leapfrog established practices to satisfy market need. What might be the implications of creating a new global world order, in terms of the use of machine learning?
We will also think about the impact of change through AI on existing business models. Traditionally, the way of doing work has been relatively linear in nature: one thing happens, and then another thing happens. Will the use of AI herald a change to that modus operandi, and if so, then how? What also will be the impact on traditional views of operational risk (risks of failure of systems, processes, people, or from external events) – especially if the decisions are being made by computers in an automated way?
One of the key enablers for change rests with professional institutions in whose domain is vested the awarding of professional qualifications. Many of these institutions are already struggling with the concept of big data and analytics as they try to convince their members that these trends are more than a fad or hype. In the near future an even greater burden will fall on their shoulders to carry the flag for AI and for new ways of working.
The choice of whether to do this is hardly negotiable, insofar as the younger members of these institutions will, on the whole, increasingly adopt what are described as liquid skills, which reflect a new way of learning, to broaden their personal capabilities. Increasingly, many younger professionals see the ultimate goal of personal development and upskilling as the ability to go solo in the world of work and to earn a crust through value creation rather than a regular paycheck. To what degree will this affect professional institutions, and how will AI help – or hinder – this aspiration?
This book is not about the deepest technical details of technology and mathematics – although we will touch on these to give context and raise awareness – but rather aims to help individuals understand the impact on their business environment and their careers. As far as practically possible, it will help practitioners start to ‘future proof’ their careers against changes that are already beginning to happen, might occur in under a decade, and almost certainly will occur afterwards.
AI is not a subject without potential controversy. Not only are there technical and professional issues to contend with, but there are also some ethical aspects to consider as well. At a broader level, readers will gain a level of insight that allows them to contribute to the wider discussion in a more informed way.
Beyond this, the book aims to help employers supported by professional institutions start to ensure that their employees and their leaders have the right skills to cope with a world of work that is transforming rapidly and radically.
Overall the focus is on raising awareness in individuals, professional organisations, and employers about a future world of work that will be with us sooner or later. My guess is sooner – and that there is no time to lose.
1. Muller, Vincent C. and Bostrom, Nick (2016). 'Future Progress in Artificial Intelligence: A Survey of Expert Opinion', 553–571. Synthese Library, Springer.
2. Susskind, Richard and Susskind, Daniel (2015). The Future of the Professions. Oxford University Press.
This chapter sets the scene for a new work ethos in a data-fuelled business environment. It considers the evolution of work, taking into account the relationship between employer and employee; the origin and development of the work ethic; and the different motives of the individual in the workplace, especially the young entrepreneur and aging employee. Beyond this, it reflects on the future validity of Maslow's hierarchy of needs and suggests new prioritisations.
The writer H.G. Wells (1866–1946) was no fool. Although he anticipated a journey to the moon in 1901, his writing was more in the nature of scientific romance. He wrote of time machines, war of the worlds, and the invisible man, but beyond all this speculation he thought hard about the impact of change on society. He even imagined a future society whose members at some stage had taken divergent paths: a hedonistic society called the Eloi, focused on leisure and self-fulfilment, and a manual underclass that he called the Morlocks. The Morlocks had regressed into a darker world, even to the point of working underground to ensure that the Eloi would have luxury. It's a dark tale from Wells's The Time Machine, about a world many centuries into the future.
Who knows whether Wells will be right or wrong? As we will see later in this book, science fiction writers seem to have an uncanny knack of anticipating the future. We'll never really know whether this is because they put ideas into the minds of man, whether they have some divine inspiration, or whether it's purely coincidental. A professional colleague of mine who describes himself as a futurist tells me it's the easiest job in the world. After all, he says, who today will be around later to say whether the predictions are right or not?
As we consider the whole issue of the influence and application of technology, and specifically artificial intelligence, on work, then we need not only to look forward but also backward. What is this concept of work anyway?
There's no real doubt that the meaning of work has continually changed. By way of example, contrast the child working in what William Blake termed the ‘dark Satanic Mills’ of Victorian England, where there was a constant risk of losing a finger (or worse) in the cotton loom, with those working in the relative safety of the so-called flat white economy of London's Shoreditch today. The flat white economy is a term that references the most popular type of coffee ordered by start-up entrepreneurs, whose idea of working is to forsake a regular salary in favour of the prospect of creating (and ultimately selling) an innovative technological gold mine.
A few decades ago, the ambition of most university graduates was to survive the so-called milk round (an expression used by prospective employers who visit multiple universities – like a milkman delivers milk from house to house – to seek out the best talent). The milk round still exists, but finding a steady job with a linear career path is not the most important thing for some of today's grads. Entrepreneurship informs the zeitgeist of the moment. I recently fell into conversation with a young Canadian woman in her early twenties, working as a guide at the Design Museum in London. On enquiring, I discovered that it was only a temporary job for her, as she was looking to join a suitable start-up in London. What was more interesting to me was that she had quit her job at a leading technology corporation in the United States, forgoing its regular paycheck, to travel overseas and seek her fortune – an ambition, perhaps, indicative of the times.
Entrepreneurship isn't confined to bright young things. Increasingly, major corporations are offloading skill and experience in favour of youth and new thinking – even if cost cutting is probably part of the real agenda. Older workers of both genders shouldn't take it personally, even if it may slightly hurt their pride. They too may respond by finding new market opportunities, attaching themselves to start-ups, or even starting something themselves.
For that older generation, the world of work has changed as well. More and more they have needed to understand the impact of change and adapt accordingly. They are like the proverbial old dogs learning new tricks.
There's also a sense of regaining the balance between work and play. For many younger people in the workplace, the division between the two has narrowed, or possibly even disappeared. The expression working from home has entered into our vocabulary. At the same time, office-based workers find themselves still working excessive hours, making leisure time something to be grabbed rather than something to which they are entitled. With so many of the big jobs located in the city, regardless of the country, and with city accommodation and commuting so costly, it's really not surprising that the focus of workers is on career advancement and salary improvement. But won't automation and AI undermine that way of thinking, and if so, then how?
How did we find ourselves here? And more importantly, what will this new age of work bring?
Let's start with slavery. It's an unattractive and disturbing subject. For many ancient cultures, the concept of slavery did not exist. Men apparently did the hunting and women did the rest – which at least seems to suggest some division of labour from the outset. (In honesty, it's a bit uncertain and all we can really do is speculate.) But alongside, and perhaps as a result of, creating divisions of labour, civilisations seem to have created an environment for servitude, and the idea of slavery had established itself by the time of the ancient Greeks and Romans. Sir Moses Finley, professor emeritus of ancient history at Cambridge University, identifies the five key slave societies as being ancient Rome, ancient Greece, the United States, the Caribbean, and Brazil.1 It's a complex and controversial subject, and Finley makes the point that conditions for slaves were entirely dependent on the owner's disposition, which might be kind, cruel, or indifferent.
There are not many religious arguments in slavery's defence. Finley says that even many early Christians were slave owners, but that how slaves were treated and ultimately looked after was perhaps also a matter of the disposition of the owner. Sometime in the mid-first century the Roman writer Columella wrote about the treatment of slaves, recommending the stick as well as the carrot. Overall there was a general consensus among Romans about the virtues and financial benefits of a balanced approach to servitude (on the part of the owners).
Slavery did not disappear with the fall of Rome. The word itself derives from Slav, the name of the Eastern European peoples who were widely enslaved in the early medieval period. The Latin word for slave, servus, is the basis for the term serf, which combines the idea of servitude with the right of the individual to have some degree of control over property, if not necessarily ownership of it.
The Roman way of life was to be increasingly undermined by ancient Rome's two-level society. Some historians suggest that it was the moral ‘flabbiness’ of the ruling class that ultimately resulted in Rome falling to the Germanic hordes in AD 410.
The other side of slavery's coin is freedom, a notion which the ancient Greeks recognised as they consulted the Pythia, the priestesses at the temple known as the Oracle at Delphi in upper central Greece. On the walls of the temple there were definitions of the four elements of freedom:
Representation in legal matters
Freedom from arrest and seizure
The right to do as one wished
The right to go where one wished.
It follows that one definition of slavery in ancient Greece can be stated by laying out the opposite of these values – for example, that the slave is represented by the master, that the slave must do what the master orders, and so on.
Two thousand years on, the expression freedom seems to have taken on a new set of values. Franklin Roosevelt in 1941 spoke of a world founded on four freedoms:
Freedom of speech and expression.
Freedom of worship.
Freedom from want.
Freedom from fear.
Some suggest that the final two of these freedoms – want and fear – have in particular driven the notion of work as we know it. In a consumer-driven society there is a desire not only to feed the family but also to keep up with peers. The notion of fear perhaps might be best represented by the anxiety of not being in employment and therefore being unable to buy those essential things, be they for survival or enjoyment. To what degree are we fearful about not being in work and not having an income, and how will that fear show itself in a future technological age?
Perhaps slavery is somehow linked to a struggle of classes and hierarchies, as Karl Marx suggested was the case in his Communist Manifesto. He wrote, ‘The history of . . . society is the history of class struggle’, oppressor and oppressed, ‘in constant opposition to each other’.
Yet at the same time it has more often than not been possible for a servant to become a master, especially in a meritocracy. Learning and education seem to be key enablers or catalysts that allow this to happen, but they are frequently coupled with a bit of good fortune, and, from time to time, a helping hand.
The notion of work therefore seems to be unavoidably attached to servitude, through which we gain some form of freedom by not being in need or in fear. The now infamous phrase Arbeit macht frei (Work sets you free), forever to be associated with a sinister regime, comes from the title of an 1873 novel by German philologist Lorenz Diefenbach, in which gamblers and fraudsters find the path to virtue through labour.2
The opposite of work is leisure. There appears to be a time and place for some downtime of sorts. Few people would begrudge the leisure of others – perhaps provided that the leisure has some degree of moderation and is not flaunted. After all, isn't leisure the reward for work? If we work to earn money for essentials, then isn't leisure one of the ways in which we choose to spend any surplus? And at the end of the day, how do we define work anyway? Maybe the work of a musician or a writer is as hard as that of a miner, albeit a quite different kind of labour. The rock musician David Lee Roth summarizes it like this: ‘Money can't buy you happiness, but it can buy you a yacht big enough to pull up right alongside it’.
Perhaps working is not optional but essential. After all, as St Paul put it nearly 2,000 years ago, 'If any would not work, neither should he eat'.
Our generation stands in the shadow of the great industrial age. We compare the era of big data with the industrial ages of steam, hydrocarbon, and electricity. The great industrialists, such as Arkwright, Brunel, Carnegie, and Ford, to name but a few, were not only entrepreneurial but also had an ability to make changes happen at scale, even at the price of wringing every drop of sweat from their employees. Some industrialists even recognised the social impact on their employees and created special small communities for them.
Bournville, a small village south of Birmingham in the United Kingdom, was created by the chocolate-making Cadbury family in the 1890s, not only to ensure that their workforce was optimally placed close to the factory, but also to provide facilities such as parkland for health and fitness. The Cadbury family is not unique. Port Sunlight, south of Liverpool, was created by Lever Brothers (now part of Unilever) in 1888 to house its workers and was named after its most profitable product, Sunlight soap.
But even if these worker villages appear to have been created out of altruism, fundamentally they were founded on what we might describe as the work ethic. That ethic has its origins in Reformation Germany: in 1517 Martin Luther challenged the Roman Catholic hierarchy by nailing his Ninety-Five Theses to the church door in Wittenberg, a work in which he poured contempt on the 'lazy' comfort of the Catholic Church. According to the Bible, God demands work in atonement for original sin – brought about by Adam's eating of the forbidden fruit in the Garden of Eden – and Luther made no secret of that.
Luther had created a new type of religion that combined worship with hard work in demonstrating devotion to God. This was a ‘business model’ that was to be further reinforced by John Calvin. Calvin was a French theologian who lived at the time of the Protestant Reformation and who, like other Reformers, understood work and service to be a means by which believers expressed their gratitude to God for the redemption of Christ. Beyond this, he implied that economic success was a visible sign of God's grace – an idea ultimately taken further by Max Weber, the German sociologist. Weber wrote The Protestant Ethic and the Spirit of Capitalism in 1904, suggesting in it that the Protestant work ethic was one of the underlying (but unplanned) origins of capitalism and the growth of prosperity. Weber's ‘spirit of capitalism’ is said to consist of a set of values which comprise the spirit of hard work and progress.
What Weber argued was, in effect:
That religious doctrine compelled an individual to work hard, but in doing so he or she could become wealthy.
That the purchasing of luxuries was sinful, as were charitable donations (as they encouraged laziness on the part of those receiving the benefit).
That the best way to reconcile these differences was through investment, which was a nascent form of capitalism.
As we consider the challenges of work not only today but going forward, we often fail to recognise that the underlying driver of hard work might appear to be seated not only in a very traditional approach to servitude, but also in the deep religious beliefs that have become ingrained in our work psyche.
The mood for change was an international movement. Benjamin Franklin, Thomas Carlyle, and John Stuart Mill, amongst others, all had something to say about the rise of capitalism and industrialism. Mill especially ‘looked forward beyond (this) stage of Protestant-driven industrialisation to a New Age where concerns for quality, not quantity would be paramount’.3
Leap forward more than half a century. In the interim, the world had suffered World War I, during which the generals increasingly turned to industry to supply massive amounts of munitions, and World War II, which Peter Drucker described as an 'industrial war'. Both of these major events, especially the latter, created the context for a new view of corporations in terms of how work itself functioned. At General Motors, Drucker gained a greater understanding not only of how work was organised but also of the functions of management. The lessons of the 'industrial' World War II taught many in management about chains of command, hierarchy, and the impact of scale.
Throughout that time, the work ethic remained sound and true. In 1934 General Motors recruited the consultant James 'Mac' McKinsey, who had formerly been a professor of accounting at the University of Chicago and who formed the McKinsey Company in 1926 at the age of 37. At that time he was the highest-paid consultant in the United States, at US$500 per day. Within three years he had died as a result of illness brought on by the pressures of work. It's said he was at the office six days per week, brought his work home on Sundays, and was consumed by his responsibilities. He is seen as an embodiment of the Calvinistic work ethic that we have been describing.
Today, McKinsey & Company is a very well-known and well-respected firm, and the work ethic instilled by James McKinsey seems not to have changed substantially. A 2005 newspaper article in The Guardian, discussing McKinsey's advice to the UK Prime Minister Tony Blair, reminded readers that at McKinsey 'hours are long, expectations high and failure not acceptable'.4 There's no doubt that McKinsey's employees – who are called 'members' (McKinsey calls itself 'The Firm') – are motivated not only by financial reward but by the trust bestowed on them by their clients and the recognition of their peers. For them, work seems to have taken on a meaning beyond drudgery. Some might even say that it is a form of religion.
What makes us want to work, anyway? Abraham Maslow, an American psychologist who was Jewish, was curious about this very topic, and found some enlightenment in the experiences of Holocaust survivors. He wanted to understand what motivated some to survive while others just gave up. He recognised a link between motivation and psychological development. From this he concluded that, in the workplace, employees worked better if they experienced a feeling of self-worth: in other words, if employees felt as if they were making meaningful contributions.
His book Maslow on Management was influenced by the work of Henry Murray, who had previously identified what he believed to be the 20 needs of all people, which he explained in his book Explorations in Personality. These needs were categorised into five key groups by Murray: ambition, materialism, power, affection, and information (see Table 1).
TABLE 1 Murray's table of needs.
Source: K. Cherry, Murray's Theory of Psychogenic Needs, Verywell (1 January 2015). http://psychology.about.com/od/theoriesofpersonality/a/psychogenic.htm (accessed 4 May 2015)
Ambition: Achievement; Exhibition (the ability to shock others); Recognition (gaining status, displaying achievement)
Materialism: Acquisition; Construction; Order (making things organised); Retention (keeping things)
Power: Abasement (apologising and confessing); Autonomy (independence); Aggression; Blame avoidance; Deference (cooperation and obedience); Dominance
Affection: Nurturance (caring for others); Play; Rejection; Succorance (being protected by others)
Information: Exposition (educating others); Cognizance (seeking knowledge and asking questions)
Maslow refined the work by Murray. He identified five human desires, in what has come to be known as his ‘hierarchy of needs’, which are (in ascending order): physiological (i.e. hunger and thirst), safety, love, esteem, and self-actualisation. The satisfaction of a need lower in the order allows for the pursuit of the next higher one. The highest of these needs, self-actualisation, is described as the fulfilment of the talent of the individual, as expressed by creativity. It is often accompanied by a quest for spiritual enlightenment and a desire to positively transform society.
How do these needs translate to the workplace and, more importantly, to the work ethic? Is it really possible for a worker doing a mind-dulling, repetitive job to be creative and obtain a level of spiritual fulfilment? How might this also apply to positions of responsibility in the workplace? Frederick Herzberg, professor of management at the University of Utah, proposed that 'job enrichment', that is, enlarging the job to give the employee greater autonomy, was one way forward. In his 1959 book The Motivation to Work, Herzberg identified what we now understand to be the key drivers of satisfaction in the workplace – the factors that spurred individuals on to be motivated about their jobs – and how employers might get the most from their human assets by satisfying these key drivers.
Herzberg's theory assumes that everyone is the same and is similarly motivated. Even Maslow recognised the simplistic nature of these categorisations. Maslow was later to expand on them, saying that his thinking was based on key assumptions, including that humans prefer work over idleness and meaningful work over useless work.5
The question for today, and looking forward, is whether Maslow's approach is still valid for Gen Y (Gen X refers to those born between 1960 and 1980; Gen Y between 1981 and 2000). And how will his concepts apply to the post-2000 demographic that we know as Gen Z?
What will we name the group that comes after Gen Z? The jury seems to be out on that one, but the label Gen Alpha is getting some traction, if only because marketers like to have a system of categorisation and segmentation. Industry is increasingly moving to a so-called segment of one (i.e. dealing with consumers as individuals rather than as clusters or groups with similar behaviours). This is based on the ability of companies to understand the unique characteristics of individuals through access to big data. Will the need to categorise people into groups for the purpose of marketing, like many forms of work, simply start to die out as a result?
Equally important, as we consider the impact of technology on the nature of work and professions, and the approach to work more commonly being taken by a younger generation, is it perhaps time to rethink Maslow's hierarchy (see Figure 1)?
FIGURE 1 Maslow for a new age.
Gen Z, sometimes known as the post-millennials or the iGeneration, is usually characterised by its access to a connected world. It is the first truly digital-native generation, communicating frequently (but not always) with words and still coming to terms with its cultural influences.
Its members will be an enormous factor in the economy of the future. By 2020 these young adults are likely to wield US$3 trillion of purchasing power in the United States alone. How they spend their time is different from what previous generations have experienced. In the 13-to-24-year-old bracket, 96% watch an average of 11 hours of online video per week, and 42% say that social media affects their self-esteem.6
Even the way that they work is changing. The flat white economy, to recap, is a term applied to digital entrepreneurs chasing their fortunes with a winner-take-all mentality. Regular pay is minimal, but for those who hit the jackpot the benefits and rewards can be enormous. Only a few players are lucky enough to reach so-called unicorn status, named after the mythical horned horse. For this chance, they are prepared to trade economic safety and security for uncertainty, their extravagances often being confined to the latest high-tech kit: phone, laptop, and access to the latest apps.
The number of so-called flat whiters is growing globally, in locations as diverse as London, Paris, Moscow, Israel, Bangalore, and Beijing, to mention but a few. Like moths, they are attracted by the bright lights of a vibrant social scene, coupled with low-cost accommodation. In many cases, government help also provides a catalyst.7
It's impossible to question their commitment to seeking a fortune, even if their commitment to continuous employment is a little more dubious. Few seem prepared to work on the same project or with the same employer for more than a couple of years. In fact, continuous employment in the same place can even be seen as a bad thing.
They are constantly invited to break the mold and to be smarter about their relationship with the workplace. Writers such as Steven Levitt and Stephen Dubner ask their readers to ‘think smarter about everything’. Their Freakonomics series of books, such as the one titled Think Like a Freak, challenges conventional workplace wisdom.
Levitt and Dubner also have asked what makes people truly happy, and have set out four key tenets:
Incentives are the cornerstone of modern life.
Knowing what to measure makes a complicated world less so.
Conventional wisdom is often wrong.
Correlation does not equate to causality.
They make the point that individuals often let their biases, such as political or economic bias, colour their world. The books seem to reflect the zeitgeist of the age, implying that the old way of thinking has become less relevant. Perhaps our historical approach to work is also changing, and the notion of the work ethic is diminishing equally in some way.
The members of Gen X and subsequent demographic groups are increasingly less likely to believe in God, at least in a traditional sense (see Table 2). If there is a link between work ethic and religion, indications are that if these trends continue, then religion will increasingly fall off the radar for members of these generations, perhaps to be replaced by some other form of spirituality.
TABLE 2 Religious belief by generational cohort.
Source: Pew Research Center (n.d.), Religious Landscape Study. http://www.pewforum.org/religious-landscape-study/generational-cohort (accessed 14 August 2017).
Younger millennial: 50%
Older millennial: 54%
Generation X: 64%
Baby boomer: 69%
The idea of spirituality in the workplace isn't new. The notion, as distinct from religion, first emerged in the 1990s and is characterised by an approach sometimes described as 'holistic thinking', whereby the worker has a heightened level of altruistic or unselfish awareness towards others.8
Buddhism, too, has an outlook on the concept of work. Originating sometime between the sixth and fourth centuries BCE, Buddhism has been described as 'a path of practice and spiritual development leading to insight into the true nature of reality'.9 Tsunesaburo Makiguchi (1871–1944), a Japanese educational theorist, religious reformer, and founder of the largest lay Buddhist organisation in Japan, suggests in his theory of value that there are three kinds of value in the workplace: beauty, benefit, and good.
In the realm of employment, the value of beauty means to find a job you like; the value of benefit is to get a job that earns you a salary so that you are able to support your daily life; the value of good means to find a job that helps others and contributes to society.10
Where does this leave us? As we start to look forward to a world of data-fuelled work, we are forced to leave behind some of the older notions of employment. Perhaps adopting a new spiritual approach might help to provide some form of template for the future world of work.
It'll be easier for those who will actually be working in the future, however that work is fashioned, to look forward more positively. After all, won't they be the ones to create the rules of work and invent new jobs? Perhaps all they will have to worry about is when, and not how, to pay the bills.
On the other hand, older generations will need to come to terms with the prospect of mass unemployment, as most routine tasks become automated and even many complex decisions become computer assisted. How will the older members of Generation X and those born later, who may become the victims of change, actually cope?
There is still uncertainty about the impact of automation. Kallum Pickering, senior economist at Berenberg, suggests that ‘ever since the First Industrial Revolution we have been replacing humans with machine labor to raise the efficiency of production . . .’ and that the key point is that ‘most of the workers did not end up jobless and poorer. Instead such workers found other work and ways to generate supply’.11
As routine jobs become automated and disappear, it has been suggested that a whole new body of jobs will emerge, as human capital is freed up to do new things. ‘As one job is destroyed’, says Pickering, ‘another can be created’.12 That approach, together with lighter regulation and more informal work arrangements provided by so-called zero-hour contracts, may soak up some capacity in the labour market but is unlikely to be a panacea. Retraining will become increasingly critical, as will be the management of career expectations.
For those in mid-career, the fear of unemployment will not only continue as traditional routine roles and career paths disappear, but is also likely to create greater pressures and dominate their thinking. Finding work will become more and more difficult, especially for those with skills that can be replaced by automation. This phenomenon will not just be confined to low-skilled work. Even the work of a high-level financial advisor can be automated.13
Self-help books continually suggest that finding work needs to be dealt with as if the task is a job itself, albeit an unpaid and sometimes thankless one. It's no surprise therefore that many relate unemployment to depression. At the least, victims may suffer from unsettled sleep patterns coupled with the desire for comfort food – so-called binge eating. At the worst, problems of alcoholism, domestic violence, and an increased risk of suicide can result.14
Long-term unemployment makes the problem more acute. Only 1 in 10 of the long-term unemployed find work. Assuming the right skills and an active marketplace, about 3 in 10 people are able to find work in the first few weeks after losing their jobs. But after about a year of being out of work, the chances of landing a job fall to just 1 in 10 per month. Those who come from industries that are downsizing, or whose skills are outdated, can find it much more difficult.15
Aside from the issue of capability and economic opportunity, there is the issue of ageism. Unemployment levels of those over 55 are masked by rising retirement ages. Whilst some might argue that this aging demographic needs to stand aside for younger, perhaps more dynamic, employees, there are many in this group who believe that they still have much to give. In 2015 the UK government launched an older workers champion scheme aimed at over-55 job seekers, and suggested three key benefits of employing those in this demographic:16
Economic improvement. If the 1.2 million workless people over age 50 in the United Kingdom who wanted a job got back to work, this could add around £50 billion to the UK economy.
Mentoring. In workplaces that employ people aged 55–64 there is a positive effect on the performance of their younger counterparts.
Stability. Those aged 50–64 have an average job tenure of 13 years, compared with 7 years for those aged 25–49.
Loss of employment for the older generation is nothing new. The reengineering approach of the 1980s and 1990s often focused on those who were older and more expensive. This mentality has permeated modern-day recruitment, which (whilst constrained by legal remedies for age and gender discrimination) often shows itself through the early deselection of those with a lengthy CV or evidence of qualifications gained in the 1970s and 1980s.
Few will have the luxury of simply doing nothing and resting on their laurels (or pension provisions). Whilst still influenced by the work ethic, but also needing to supplement limited incomes and avoid boredom, many at the end of their professional careers are increasingly adopting a two-pronged approach:
They are looking towards a portfolio approach to work and are prepared to undertake a variety of different, but perhaps complementary, roles.
The more entrepreneurial are prepared to align themselves with smaller companies and start-ups, perhaps for an equity share rather than salaried employment.
Some who are at the end of their traditional careers are deciding to invest in their own big ideas, pouring their own money into development or marketing. Often this is to no avail, as they hit a roadblock with respect to funding. Venture capitalists on the whole appear reluctant to support aging entrepreneurs.
According to the Harvard Business Review, which looked at the Billion Dollar Club (companies valued at US$1 billion by venture capitalists):
The average age of those founding a start-up was just over 31.
Founders under the age of 35 represent a significant proportion of founders in the Billion Dollar Club.
CEOs and presidents are 42 years old on average, with a median age of 42.
Table 3