A PROVEN APPROACH FOR CREATING AND IMPLEMENTING EFFECTIVE GOVERNANCE FOR DATA AND ANALYTICS
Financial Institution Advantage and the Optimization of Information Processing offers a key resource for understanding and implementing effective data governance practices and data modeling within financial organizations. Sean Keenan, a noted expert on the topic, outlines the strategic core competencies, includes best practices, and suggests a set of mechanisms for self-evaluation. He shows what it takes for an institution to evaluate its information processing capability and how to take practical steps toward improving it. Keenan outlines the strategies and tools financial institutions need to take charge and make the much-needed decisions that ensure their information processing assets are effectively designed, deployed, and utilized to meet strict regulatory guidelines. This important resource is filled with practical observations about how information assets can be actively and effectively managed to create competitive advantage and improved financial results. It also includes a survey of case studies that highlight both the positive and the less positive results that have stemmed from institutions either recognizing or failing to recognize the strategic importance of information processing capabilities.
Page count: 344
Publication year: 2015
Cover
Wiley & Sas Business Series
Title Page
Copyright
Dedication
Introduction
Acknowledgments
Chapter 1: Financial Institutions as Information Processors
Financial Institutions' Raison d'Être
Cultural Issues
IT Literacy and the Spreadsheet Deluge
Other Challenges to Establishing an IT-Savvy Culture
Notes
Chapter 2: Strategic Hardware and Software Management
Overview
An Integrated Data Architecture
Information Processing Efficiency as an Institutional Objective
A Digression on Unstructured Data
Notes
Chapter 3: Data, Models, and Information
Model Risk Mania
Definitions and Epistemology
Data Quality
Models and Their Role
Regulatory Regimes and Guidance
Notes
Chapter 4: Model Risk Measurement
Three Phases of Model Management
Model Governance
Defining Model Risk
Objectifying the Downside
Model Risk Attribution: An Information Entropy Approach
Notes
Chapter 5: The Return on Analytic Assets
Measuring the Productivity of Models
Complementarity of Data Inflow with Information Processing
A Digression on Price Taking
Notes
Chapter 6: Data Risk Measurement
Strategic Data Acquisition
The Information Conversion Rate
Other Approaches for Data Risk Assessment
Notes
Chapter 7: A Higher Level of Integration
Alternate Views of Integration
Identifying Key Information Cycles
An Integrated Physical View
Multidimensional Information Asset Management
Chapter 8: A Strategy for Optimizing the Information Processing Complex
Evaluation
A Path toward Improvement
Notes
Chapter 9: Case Studies
The Pricing of Automobile Insurance
Moody's KMV
The London Whale
The Mortgage-Backed Securities Disaster
The Value of Annuities
Notes
Chapter 10: Conclusions
References
About the Author
Index
End User License Agreement
Cover
Table of Contents
Begin Reading
Figure 1.1
Figure 2.1
Figure 2.2
Figure 2.3
Figure 2.4
Figure 3.1
Figure 3.2
Figure 4.1
Figure 4.2
Figure 4.3
Figure 4.4
Figure 4.5
Figure 6.1
Figure 7.1
Figure 7.2
Figure 7.3
Figure 7.4
Figure 9.1
Table 1.1
Table 2.1
Table 2.2
Table 4.1
Table 9.1
Table 9.2
The Wiley & SAS Business Series presents books that help senior-level managers with their critical management decisions.
Titles in the Wiley & SAS Business Series include:
Analytics in a Big Data World: The Essential Guide to Data Science and Its Applications
by Bart Baesens
Bank Fraud: Using Technology to Combat Losses
by Revathi Subramanian
Big Data Analytics: Turning Big Data into Big Money
by Frank Ohlhorst
Big Data, Big Innovation: Enabling Competitive Differentiation through Business Analytics
by Evan Stubbs
Business Analytics for Customer Intelligence
by Gert Laursen
Business Intelligence Applied: Implementing an Effective Information and Communications Technology Infrastructure
by Michael Gendron
Business Intelligence and the Cloud: Strategic Implementation Guide
by Michael S. Gendron
Business Transformation: A Roadmap for Maximizing Organizational Insights
by Aiman Zeid
Connecting Organizational Silos: Taking Knowledge Flow Management to the Next Level with Social Media
by Frank Leistner
Data-Driven Healthcare: How Analytics and BI are Transforming the Industry
by Laura Madsen
Delivering Business Analytics: Practical Guidelines for Best Practice
by Evan Stubbs
Demand-Driven Forecasting: A Structured Approach to Forecasting, Second Edition
by Charles Chase
Demand-Driven Inventory Optimization and Replenishment: Creating a More Efficient Supply Chain
by Robert A. Davis
Developing Human Capital: Using Analytics to Plan and Optimize Your Learning and Development Investments
by Gene Pease, Barbara Beresford, and Lew Walker
The Executive's Guide to Enterprise Social Media Strategy: How Social Networks Are Radically Transforming Your Business
by David Thomas and Mike Barlow
Economic and Business Forecasting: Analyzing and Interpreting Econometric Results
by John Silvia, Azhar Iqbal, Kaylyn Swankoski, Sarah Watt, and Sam Bullard
Foreign Currency Financial Reporting from Euros to Yen to Yuan: A Guide to Fundamental Concepts and Practical Applications
by Robert Rowan
Harness Oil and Gas Big Data with Analytics: Optimize Exploration and Production with Data Driven Models
by Keith Holdaway
Health Analytics: Gaining the Insights to Transform Health Care
by Jason Burke
Heuristics in Analytics: A Practical Perspective of What Influences Our Analytical World
by Carlos Andre Reis Pinheiro and Fiona McNeill
Human Capital Analytics: How to Harness the Potential of Your Organization's Greatest Asset
by Gene Pease, Boyce Byerly, and Jac Fitz-enz
Implement, Improve and Expand Your Statewide Longitudinal Data System: Creating a Culture of Data in Education
by Jamie McQuiggan and Armistead Sapp
Killer Analytics: Top 20 Metrics Missing from Your Balance Sheet
by Mark Brown
Predictive Analytics for Human Resources
by Jac Fitz-enz and John Mattox II
Predictive Business Analytics: Forward-Looking Capabilities to Improve Business Performance
by Lawrence Maisel and Gary Cokins
Retail Analytics: The Secret Weapon
by Emmett Cox
Social Network Analysis in Telecommunications
by Carlos Andre Reis Pinheiro
Statistical Thinking: Improving Business Performance, Second Edition
by Roger W. Hoerl and Ronald D. Snee
Taming the Big Data Tidal Wave: Finding Opportunities in Huge Data Streams with Advanced Analytics
by Bill Franks
Too Big to Ignore: The Business Case for Big Data
by Phil Simon
The Value of Business Analytics: Identifying the Path to Profitability
by Evan Stubbs
The Visual Organization: Data Visualization, Big Data, and the Quest for Better Decisions
by Phil Simon
Understanding the Predictive Analytics Lifecycle
by Al Cordoba
Using Big Data Analytics: Turning Big Data into Big Money
by Jared Dean
Win with Advanced Business Analytics: Creating Business Value from Your Data
by Jean Paul Isson and Jesse Harriott
For more information on any of the above titles, please visit www.wiley.com.
Sean C. Keenan
Cover image: tech background: © iStock.com/cherezoff, social image: © iStock.com/Henrik5000
Cover design: Wiley
Copyright © 2015 by John Wiley & Sons, Inc. All rights reserved.
Published by John Wiley & Sons, Inc., Hoboken, New Jersey.
Published simultaneously in Canada.
No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 646-8600, or on the Web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permissions.
Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.
Wiley publishes in a variety of print and electronic formats and by print-on-demand. Some material included with standard print versions of this book may not be included in e-books or in print-on-demand. If this book refers to media such as a CD or DVD that is not included in the version you purchased, you may download this material at http://booksupport.wiley.com. For more information about Wiley products, visit www.wiley.com.
Library of Congress Cataloging-in-Publication Data:
Keenan, Sean C., 1961-
Financial institution advantage & the optimization of information processing / Sean C. Keenan.
pages cm. – (Wiley & SAS business series)
Includes bibliographical references and index.
ISBN 978-1-119-04417-8 (cloth); ISBN 978-1-119-05304-0 (ebk); ISBN 978-1-119-05322-4 (ebk)
1. Financial services industry–Data processing. 2. Financial services industry–Information technology. 3. Financial institutions–Management. I. Title. II. Title: Financial institution advantage and the optimization of information processing.
HG173.K397 2015
332.10285–dc23
2014041589
For Sage & Coleman.
At its most basic level, a financial institution is composed of four things: a brand, a collection of personnel, some physical assets, and analytic (information) assets. The last category includes things like data, data processing capabilities, statistical models of various kinds, and other analytic and reporting capabilities. This categorical breakdown is simplistic, and not exactly clean. For example, there is an overlap between physical assets and data processing capabilities: Are the computers themselves physical assets or information assets? Overlap also exists between personnel and analytic methods: Does a buy or sell decision stem from an analytic method or from a person who makes buy and sell decisions? In spite of this lack of clarity, using this categorization—even in its most simplistic form—can help to frame the crucial underlying competitive issues facing financial institutions today. These issues can be summarized as follows:
If you have a strong brand, great, try to preserve it. If not, try to build one. But how?
If you have great personnel, great, try to retain them. If not, try to attract them. But how?
Physical assets are highly fungible, depreciate rapidly, and matter little, except insofar as they contribute to brand strength and the ability to attract and retain talent.
Information assets, actively and effectively managed, create competitive advantages and improved financial results. This helps to build brand strength and attract top talent.
Under this simple view, a financial institution that wants to be more competitive and more successful needs to focus assiduously on more effective management of information assets, including data acquisition and information processing. The goal of this book is not to describe the ideal state for any particular aspect of any business process within an actual financial institution. Rather, its goal is to suggest a prioritization of certain capabilities as critical strategic core competencies, provide some thoughts about better (if not best) practices, and suggest a set of mechanisms for self-evaluation. In other words, how does an institution evaluate its information processing capability and take practical steps toward improving it?
Nearly every month the media report major blunders by financial institutions in trading, financial reporting, and the handling of customer information, along with regulatory censures caused by failures in data management or information processing. While these high-profile events may signal something about the capabilities of specific firms, or about the average level of capability within the industry as a whole (raising concerns about the potential frequency of future costly gaffes), the underlying issue is not the cost of isolated blunders. It is the efficiency and effectiveness of the tens of thousands of tasks that financial institutions need to perform every day in order to earn their right to exist. The deeper question that investors, managers, and other market participants ought to ask is how well these institutions can develop, market, and manage financial products and services relative to their peers, given that these activities depend critically on information processing capabilities.
Importantly, financial institutions need to be concerned not only about direct competition from more capable peers, but also about encroachment from more capable firms in tangential or even unrelated industries. One obvious threat comes from firms whose core competency is squarely in Big Data management and information processing generally, including Amazon, Yahoo, and Google; but even firms with other closely related strengths, such as logistics, can threaten financial institutions that fall behind. For a powerful example, see “Wal-Mart Dives Deeper in Banking,” Wall Street Journal, April 18, 2014. To cite another example, Facebook now boasts more than 1.3 billion customers (it reported 20 million in 2007 and 200 million in 2009), and the company is said to have more information about its customer base than any firm in history. How difficult would it be for Facebook, assuming it were committed to that strategy, to launch Facebank? And how might that development further change the competitive landscape for financial services? The answer to the first question may already be known. According to American Banker, a survey of 3,846 bank customers in North America, conducted by Accenture in March 2014, revealed that:
Almost half (46 percent) of consumers aged 18 to 34 said that if PayPal offered banking services, they would want to use them. About 40 percent said the same about Google and 37 percent favored Amazon… AlixPartners asked 1,249 smartphone and tablet-using customers at the end of 2013 which providers they would most want to use for a digital wallet (defined as a tool that stores payment card numbers and loyalty, gift card, reward, coupon, and discount information). Close to half (46 percent) would want one from PayPal, 19 percent from Google. Half (50 percent) said they would want their primary bank to provide the service.1
At the same time, online microcredit and peer-to-peer lending platforms, which are also capable of eating into bank market share, have been growing and multiplying rapidly. Reportedly, 33 such platforms were active in 2010, up from one in 2005 (the first true peer-to-peer online lending platform was Zopa). As Bachman et al. (2011) summarize:
In this kind of lending model the mediation of financial institutions is not required…P2P lending is a way to receive a loan without a financial institution involved in the decision process and might also be a possibility to receive better conditions than in the traditional banking system.2
In 2012 the peer-to-peer online lending industry was generating over $50 million in new loans per month, and in mid-2012 total loan volume passed the $1 billion mark.3 At what level of volume and transaction size, or at what expansion of transactor scope, might these platforms be in a position to seriously encroach upon traditional financial institutions? And more to the point, what are these traditional institutions failing to do today that is helping to foster the growth of these financial services alternatives?
It is a somewhat puzzling irony that in the financial services sector, corporate leaders who are otherwise bold and self-confident, and whose success is founded on their ability to make daring, large, long-range decisions for the firm, seem all too often unable or unwilling to make similarly bold and similarly important decisions about how their firm's information processing assets are designed, deployed, and utilized. Antiquated and patchwork data systems, along with obsolete and feature-starved process applications, can seriously undermine the competitive position of a financial institution. In many cases in which bold, long-range planning decisions are desperately needed, institutions fail to prioritize major improvements to their information processing capabilities, and this failure can be the primary constraint on progress toward a more holistic and capable information processing infrastructure. As we will argue, in an industry so dependent on superior information processing, institutions seem to be weak at assessing where the current investment trade-offs lie, where they are headed, and how fast they are changing. One symptom of this state of affairs is the enormous difficulty and expense that firms have experienced in trying to meet post-crisis regulatory requirements such as the Comprehensive Capital Analysis and Review (CCAR). Replacing antiquated legacy capabilities and taking a more deliberate and holistic approach to information processing means not only restructuring or replacing physical data processing and analytic resources but also creating an organizational structure to match the modernized business model. The overall strategic direction must be identified, the underlying physical infrastructure must be aligned with that vision, and a plan to match personnel with that model must be developed and communicated throughout the organization.
This book seeks to provide context, as well as analytic and anecdotal support, for the simple characterization described above—that to be more successful and more competitive, financial institutions need to focus on information processing as the core competency. It seeks to provide some organization and definitions of terms and ideas embedded in the concept of management of information assets, primarily surrounding data and information processing and statistical modeling. The goal of this exercise is to make the relationship between firm organization around these functions and overall firm strategy and performance more stark. Finally, the book provides practical observations about how information assets can be actively and effectively managed to create competitive advantage and improved financial results. Toward the end of the book we survey some case studies that highlight some of the positive and less positive results that have stemmed from institutions either recognizing or failing to recognize the strategic importance of information processing capabilities.
1. Penny Crosman, “How Banks Can Win Back ‘Mind Share’ from PayPal, Google, Amazon,” American Banker, May 30, 2014, 10.
2. Alexander Bachman, Alexander Becker, Daniel Buerkner, Michael Hilker, Frank Kock, Mark Lehmann, and Phillip Tiburtius, “Online Peer-to-Peer Lending—A Literature Review,” Journal of Internet Banking and Commerce 16, no. 2 (August 2011).
3. Peter Renton, “Peer-to-Peer Lending Crosses $1 Billion in Loans Issued,” TechCrunch (website), May 29, 2012, http://techcrunch.com/2012/05/29/peer-to-peer-lending-crosses-1-billion-in-loans-issued.
Those who deserve special thanks include Brian Peters, Karen Schneck, Andrew Clyne, Mark Almeida, Hsiu-Mei Chang, and Gordon Cooper for various contributions, edits, and moral support; Jorge Sobehart for teaching me about information entropy and many other things; and all my wonderful and brilliant colleagues at AIG. Thanks also to Sarah Kate Venison, who provided encouragement all along the way. The multitude of remaining errors and defects are, of course, my own.
Economic literature includes a rich debate on why firms exist as they do—the main question being why firm boundaries are defined in the ways that we observe. Certain types of activities that could remain in-house are routinely outsourced, while many activities with the potential to be outsourced remain internal to the firm. Mergers, acquisitions, and divestitures do exhibit certain patterns with respect to how firms believe their own boundaries ought to be defined, but these patterns are by no means exhaustive nor are their outcomes obviously probative. Some corporate restructurings are metamorphic and highlight the question of what makes a financial institution a financial institution. For example, in 1987 Greyhound Corp., a bus line company since 1929, spun off its bus line operating units so that it could “focus on its core business of financial services.” To even think about which firms should be defined as belonging to the financial services sector we need to have some practical mechanism or criteria for inclusion. Theoretically we could simply enumerate a comprehensive list of financial services and products, and include firms that engage in this set of activities. With a boundary so constructed, we would have an identified set of institutions to analyze. But does that boundary really exist or is it helpful even as an abstraction? Retail sales finance is one of the largest and most obvious types of boundary blurring, often occurring at the direct expense of banks and retail credit suppliers. Captive finance subsidiaries for manufacturing firms are also common and the obvious complementarity between manufacturing goods and financing their sale seems to suggest that the latter function can be effectively internalized. 
But while the economic incentive to encroach on the boundaries of financial services seems to be predominantly one way—that is, we have not heard of things like mortgage institutions directly engaging in home construction—no hard and fast rule seems to apply.
There are well-known cases of captive finance companies whose financial services activities grew beyond financing the parent's manufactured products—in one case so much so that the entity became a systemically significant financial institution in its own right, with only remnant relationships between its financing activities and the financing of the parent's products. Are there economic principles that would allow us to explain why, and the extent to which, auto sales and lease financing (for example) are or are not more thoroughly internalized within auto manufacturers? While to economists the answer is surely yes (what area of human endeavor do economists feel cannot be explained by economics?), it seems clear that management teams at financial institutions themselves do not recognize or embrace such principles. For if they believed they understood the principles that define why a financial institution exists, they would surely leverage those same principles to build firms that function better overall.
Rather than try to tackle this broader problem head on, in this book we simply focus on the kinds of firms that dominate the financial services industry landscape: banks and insurance companies. We leave it to the reader to consider whether or not the observations made also apply to any specific firm or subset of firms with financial sector exposure or activities. A number of factors characterize the financial services industry in a way that might help us better understand why financial institutions exist in the way they do, and how they can improve their economic strength and competitive positions.
Over the bulk of the financial industry's long history, practical barriers to entry in banking and insurance were quite high. In the modern era this was due primarily to regulatory and licensing requirements, but also to consumer preferences for brand stability and stature. Over the past 100 years or so, great banking and insurance firms were founded on brand strength, and their ability to attract depositors and policyholders was the primary determinant of their growth. Those barriers began to erode during the twentieth century, however, as cultural changes and an increasing dependence on technology changed both the supply and demand sides of financial services markets. Changing regulatory requirements produced alternating periods that stimulated and dampened bank and insurance company formation, as well as merger activity—a history that is beyond the scope of this book to document or survey. What is important is that evidence can be presented to support the claim of low barriers to entry.
Interestingly, the aggregate data does not show an upward trend in the number of operating financial institutions. For banks, the total number of operating institutions in the United States hovered around 14,000 for the nearly 20 years between the early 1960s and the early 1980s. Then, after the savings and loan crisis began to unfold, the total number of banks began to drop—a trend that continues to this day, with the number of banks dropping by more than 50 percent from its 1980s total to fewer than 6,000 in 2013 (see Figure 1.1). However, looking only at the total number of institutions does not tell the whole story. In particular, the stability of the total number of institutions during that 20-year period between the 1960s and the 1980s reflected an offset between periods of great consolidation through mergers and acquisitions that reduced the total and periods of rapid entry of new institutions—particularly savings and loan associations, prior to the S&L crisis. Overall, entry into the banking sector has remained brisk and steady, despite the stable, then declining, count totals. Hubert Janicki and Edward Prescott observed that, “Despite the large number of banks that have exited the industry over the last 45 years, there has been a consistent flow of new bank entries,” and calculated the average annual entry rate at about 1.5 percent of operating banks. The authors further observe that, “It is striking that despite the huge number of bank exits starting in the 1980s, entry remained strong throughout the entire period. Interestingly, it is virtually uncorrelated with exit. For example, the correlation between exit and entry for the 1985–2005 period is only –0.07.”1
Figure 1.1 Total Commercial Banks in the United States
Source: Federal Reserve Economic Data (FRED); Federal Reserve Bank of St. Louis.
Janicki and Prescott also observe how market share can shift dramatically. They note that of the top ten banks in 1960 (by asset size), only three are still in the top ten.
Part of this is due to M&A (mergers and acquisitions) activity. But part of it reflects the fact that the product and service sets, based on intermediation and disintermediation and risk pooling, are technical in nature, and as trends in the underlying technologies change, firms have a great opportunity to innovate effectively and gain market share, or fail to innovate effectively and lose market share. What Figure 1.1 does show clearly is that while barriers to entry may be low, barriers to exit are even lower. Failure to stay abreast of technological innovations, as well as the adoption of so-called innovations that misrepresent true risk-adjusted returns, has been causing the number of operating banks to shrink by about 280 per year since 1984. Some of the innovations that led to distorted risk assessments include mortgage-backed securities and complex, illiquid types of derivatives (there are others). But while distorted risk assessments have historically been blamed on personal mismanagement and a culture of greed, these explanations offer little in the way of economic underpinnings and cannot explain the disappearance of nearly 8,600 banks over a 30-year period. The main culprit is that decision makers within these firms have been provided with poor information, insufficient information, and, in many cases, misinformation, and the main cause for this is that these firms were manifestly poor at information management and creation. While the S&L crisis triggered the largest number of bank closures, the bursting of the tech and housing bubbles, and the ensuing liquidity crisis of 2008, also forced many institutions to close. And in far more cases, institutions that did not fail saw their profitability greatly reduced by inefficiencies, losses, and fines—most of which could have been avoided with the appropriate amount of investment in system architecture and process redesign. 
Recent examples of significant regulatory fines related to information processing failures include:
$25 billion: Wells Fargo, JPMorgan Chase, Citigroup, Bank of America, Ally Financial (2012)
$13 billion: JPMorgan Chase (2013)
$9.3 billion: Bank of America, Wells Fargo, JPMorgan Chase, and 10 others (2013)
$8.5 billion: Bank of America (June 2011)
$2.6 billion: Credit Suisse (May 2014)
$1.9 billion: HSBC (2012)
$1.5 billion: UBS (2012)
Taken together, these historical facts show how even very large financial institutions can suffer or even cease to exist if they fail to embrace technological innovation, or embrace it without a commensurate investment in the information management capability required to effectively evaluate risk. Thus, the stylized facts that should concern current financial institutions are:
Firms entering the market, particularly those entering with some technological advantage, are a threat.
Excessive risk taking based on impaired risk assessments (often the result of technological innovation without the supporting information flow) is a threat.
The likelihood that any firm succumbing to these threats will be expelled from the market is high.
Poor information management itself has causes. In some cases, the underlying causes may have included a regulatory (and rating agency) arbitrage in which financial institutions were incented to do the minimum while benefiting from things like deposit insurance (an explicit stamp of approval from regulatory authorities) and high public ratings from rating agencies, or even the implicit stamp of approval that comes purely from compliance and the absence of regulatory censure. But more importantly and more generally, low industry standards for excellence in information processing have meant the absence of competitive pressures to innovate and excel. This environment, which has persisted for decades, is now coming to an end.
While identifying more effective management of information assets as a key strategic objective for the firm is a good first step, implementing an effective strategic management process is not without challenges within a modern financial institution. Among them are serious cultural and organizational challenges that can work against the development and deployment of an integrated approach to information management. One such challenge is so pervasive and so constraining that it deserves special consideration. Within the broader fabric of corporate culture, there lies a deep cultural rift—a rift that may be more or less pronounced depending on the business mix and particular firm characteristics, but that is almost always material. It is the rift between IT (alternatively, management information systems, or MIS) and non-IT. This rift has developed over decades, with rapid technological change and exponentially increasing business dependencies on technology as the driving forces. Importantly, the initials IT stand for information technology—something that should be a core competency for a financial institution. But far from being core from an integrated strategic management perspective, business managers and their IT counterparts are often separated culturally to such an extent that they are speaking different languages, both figuratively and literally. Business executives frequently view their IT organizations with distrust. Common complaints are that their process requirements are opaque, that they do not understand the organization's business objectives, or, worst of all, that they are not motivated by incentives that are aligned with the business strategy.2 On the other side, IT personnel often hold a dim view of the non-IT businessperson's understanding of technology generally, and IT technology in particular.
The IT presumption that the business side doesn't understand its own problem, doesn't understand what the solution should be, or simply can't express itself intelligibly, can easily lead to ill-formed plans and projects whose poor outcomes further the distrust, in addition to sapping the resources of the firm. Importantly, the rift reflects the fact that information processing is not viewed as a true core competency within most financial institutions, and that consequently IT is seen as a supporting, or enabling, function—critical yes, but no more so than operating an effective health benefits program (or company cafeteria, for that matter).
The executives who hold senior leadership positions such as chief financial officer or chief credit officer are typically viewed not only as great managers but also as repositories of subject matter expertise and corporate history. The people who hold such positions are expected to understand the entire fabric of their respective organizations thoroughly and often are expected to have personal experience at multiple levels of job seniority. Chief credit officers will invariably have had deep experience in underwriting and workouts over a range of products and markets. Chief financial officers will usually have had deep hands-on experience in preparing and analyzing financial statements, and frequently in auditing financial accounts from different parts of the company. Unfortunately, this deep experience in their respective disciplines is a double-edged sword. As the needs within risk and finance become increasingly dependent on analytics and information processing, these leaders may not have the experience or vision to help shape the data and analytic infrastructure of the firm to enable competitive capabilities to be developed in these key areas.
Contrast this with the chief information officer, chief technology officer, or whatever the C-level executive responsible for IT is called. These leaders are responsible for establishing a forward-looking competitive infrastructure design and overall vision for the firm, and because their peer-level leaders may not have comparable technical depth, that responsibility may be very highly concentrated. As individuals, they have frequently distinguished themselves in general manager roles or within a specific discipline other than IT. But even for those with relatively deep or long-tenured association with IT, how many in the financial services industry actually rose up from within the IT culture? How many have ever personally designed a software application and seen it through each phase of the development process? How many have personally developed a major data processing system? How many have written a single line of production code? Certainly, outside the financial services industry—and not just in the technology space—the answer to all these questions would be: the majority. However, the honest answer within the financial services industry has to be: very few. This reflects the general lack of interest senior managers have in actively managing their companies as information processing companies, and it helps explain why financial institutions are so challenged by the basic needs and competitive demands they currently face in this area.
For corporate leaders not directly responsible for IT, the acceleration of technological change within their respective disciplines has been more recent but the vintage effect of the experience base is no less pronounced. For example, 10 years ago almost all chief compliance officers were attorneys and most reported to the general counsel. As compliance risks and regulatory attention evolved toward more systemic information-based areas such as anti-money laundering (AML), customer identification programs (CIP), reporting covered under the Bank Secrecy Act of 1970 (BSA), and other types of fraud detection, more technical, risk-related training has become increasingly important. Within the AML space (now a front-burner issue, especially for larger institutions), detection solutions are increasingly based on sophisticated statistical modeling and voluminous data processing. Top-vendor AML systems deploy sophisticated models that require not only expert management and independent validation, but also rich and timely data flows that can test the capability of the institution's overall data infrastructure. Even CIP, formerly a rule-based exercise with a tendency toward weak performance measurement, continues to evolve in this direction. The Patriot Act clarification on CIP includes this statement:
The Agencies wish to emphasize that a bank's CIP must include risk-based procedures for verifying the identity of each customer to the extent reasonable and practicable. It is critical that each bank develop procedures to account for all relevant risks including those presented by the types of accounts maintained by the bank, the various methods of opening accounts provided, the type of identifying information available, and the bank's size, location, and type of business or customer base. Thus, specific minimum requirements in the rule, such as the four basic types of information to be obtained from each customer, should be supplemented by risk-based verification procedures, where appropriate, to ensure that the bank has a reasonable belief that it knows each customer's identity.3
To summarize, regulators now expect financial institutions to bear the full weight of modern data management and creative, advanced analytics in addressing issues for which compliance had traditionally been a matter of minimally following highly prescriptive rule sets. Given the emphasis on risk-based techniques requiring advanced, industrial-strength data processing support, chief risk officers and chief compliance officers will be challenged to lead these efforts without a technical risk-analytics and IT-oriented experience base.
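To make the shift from rule-based to risk-based monitoring concrete, consider a grossly simplified sketch of the kind of statistical screening that underlies modern AML transaction monitoring. The function names and the three-standard-deviation threshold below are illustrative assumptions, not drawn from any vendor system; production AML models are far richer, but the principle—scoring each transaction against the account's own statistical history rather than a fixed rule—is the same:

```python
from statistics import mean, stdev

def anomaly_scores(amounts):
    """Z-score each transaction amount against the account's own history."""
    mu = mean(amounts)
    sigma = stdev(amounts)  # sample standard deviation
    return [(a - mu) / sigma for a in amounts]

def flag_suspicious(amounts, threshold=3.0):
    """Return the indices of transactions whose z-score exceeds the threshold."""
    return [i for i, z in enumerate(anomaly_scores(amounts)) if abs(z) > threshold]

# A hypothetical account: twenty routine payments, then one very large transfer.
history = [100.0] * 20 + [50000.0]
print(flag_suspicious(history))  # flags only the outlier transaction
```

Note what even this toy requires: a complete, timely feed of every transaction for every account. The statistical logic is trivial; the data infrastructure that makes it trustworthy is not, which is precisely the point of the preceding discussion.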
Unfortunately for many firms, the problem of an inadequate experience base is self-reinforcing. How many stories have we heard about giant IT projects that were catastrophic failures? Without adequately experienced leaders in place (who could potentially prevent some of these disasters), it can be extremely difficult to get accurate assessments of why the projects failed or what could have been done better. The experience deficit has also created an information asymmetry in which business decision makers, often not completely clear about what their current and future needs are and scarred by past IT project failures, are squared off against software vendors who are often very well informed about the firm's knowledge, current capabilities, and history, and can tailor their sales pitches accordingly. Ironically, many large-scale IT failures occurred because the projects weren't nearly large enough—that is, as big as they may have been, they weren't part of a holistic redesign of the overall information processing infrastructure of the firm. At the same time, many IT-related outsourcing relationships have helped financial institutions improve performance and efficiency, creating a tremendously appealing perception that more outsourcing is better, and that financial institutions need to get out of the information processing business. But as we will discuss in more detail below, institutions need to consider carefully what aspects of their information-process complex are truly core to their identities and competitive positions in the marketplace, and invest in and further develop these internal capabilities instead of outsourcing them.
Outsourcing issues aside, all IT infrastructure projects expose the firm to some risk. In the absence of a clearly communicated overall vision, the risks associated with piecemeal infrastructure projects are elevated for a number of reasons. In the first place, even well-meaning and experienced project managers are at an informational disadvantage. They are solving a problem—or a narrow set of problems—without knowing whether the design choice will be complementary to other software and system projects also underway. The only way to ensure strong complementarity of such projects is to have a clearly articulated vision for the overall system and to evaluate each project for consistency with that vision. For many large institutions, particularly those that have grown by acquisition, the underlying system is effectively a hodgepodge, and there may be no clearly articulated vision. Under these conditions, the chance that any one project will make the problem worse is high. This can lead decision makers to embrace min-max strategies4 with respect to high-visibility infrastructure projects—often strategies that can be supported with information from industry experts, including consultants and the software vendors themselves, who certainly do not have a long-term vision for the firm's competitive position as a goal. In many cases, both the requirements for a given infrastructure build and the design choices made in order to meet those requirements are partly or wholly outsourced to vendors, consultants, or both. From asset/liability management systems, to Basel II/III systems, to AML systems, to model governance systems, to general purpose database and data processing systems, key expertise and decision making are routinely outsourced. Recognizing this, the sales presentations from the major software firms increasingly involve selling the vendor-as-expert, not just the product.
Consulting firms, too, have increasingly oriented their marketing strategies toward this approach under the (frequently correct) assumption that the audience is operating on the short end of an information asymmetry, understanding primarily that it has a problem and needs a solution. Vendors' increasing focus on integrated solutions reflects their perception that institutions are now aware that they have bigger problems and are increasingly willing to outsource the vision for how the firm manages its analytic assets in the broadest sense.
