Quantitative Equity Investing

Frank J. Fabozzi
Description

A comprehensive look at the tools and techniques used in quantitative equity management.

Some books attempt to extend portfolio theory, but the real issue today relates to the practical implementation of the theory introduced by Harry Markowitz and others who followed. The purpose of this book is to close the implementation gap by presenting state-of-the-art quantitative techniques and strategies for managing equity portfolios. Throughout these pages, Frank Fabozzi, Sergio Focardi, and Petter Kolm address the essential elements of this discipline, including financial model building, financial engineering, static and dynamic factor models, asset allocation, portfolio models, transaction costs, trading strategies, and much more. They also provide ample illustrations and thorough discussions of implementation issues facing those in the investment management business, and include the necessary background material in probability, statistics, and econometrics to make the book self-contained.

• Written by a solid author team with extensive financial experience in this area
• Presents state-of-the-art quantitative strategies for managing equity portfolios
• Focuses on the implementation of quantitative equity asset management
• Outlines effective analysis, optimization methods, and risk models

In today's financial environment, you have to have the skills to analyze, optimize, and manage the risk of your quantitative equity investments. This guide offers you the best information available to achieve this goal.




Table of Contents
The Frank J. Fabozzi Series
Title Page
Copyright Page
Dedication
Preface
TEACHING USING THIS BOOK
ACKNOWLEDGMENTS
About the Authors
CHAPTER 1 - Introduction
IN PRAISE OF MATHEMATICAL FINANCE
STUDIES OF THE USE OF QUANTITATIVE EQUITY MANAGEMENT
CHAPTER 2 - Financial Econometrics I: Linear Regressions
HISTORICAL NOTES
COVARIANCE AND CORRELATION
Estimation of the Covariance and Correlation Coefficient
Estimation Issues
Random Matrix Theory
REGRESSIONS, LINEAR REGRESSIONS, AND PROJECTIONS
Estimation of the Regression Coefficients
MULTIVARIATE REGRESSION
Seemingly Unrelated Regressions
QUANTILE REGRESSIONS
REGRESSION DIAGNOSTIC
ROBUST ESTIMATION OF REGRESSIONS
CLASSIFICATION AND REGRESSION TREES
SUMMARY
CHAPTER 3 - Financial Econometrics II: Time Series
STOCHASTIC PROCESSES
TIME SERIES
Representation of Time Series
Invertibility and Autoregressive Representations
Representation in the Frequency Domain
Errors and Residuals
STABLE VECTOR AUTOREGRESSIVE PROCESSES
INTEGRATED AND COINTEGRATED VARIABLES
ESTIMATION OF STABLE VECTOR AUTOREGRESSIVE (VAR) MODELS
Vectoring Operators and Tensor Products
Multivariate Least Squares Estimation
The Asymptotic Distribution of LS Estimators
ESTIMATING THE NUMBER OF LAGS
AUTOCORRELATION AND DISTRIBUTIONAL PROPERTIES OF RESIDUALS
STATIONARY AUTOREGRESSIVE DISTRIBUTED LAG MODELS
ESTIMATION OF NONSTATIONARY VAR MODELS
Estimation of a Cointegrated VAR with Unrestricted LS Methods
ML Estimators
Estimating the Number of Cointegrating Relationships
ML Estimators in the Presence of Linear Trends
ESTIMATION WITH CANONICAL CORRELATIONS
ESTIMATION WITH PRINCIPAL COMPONENT ANALYSIS
ESTIMATION WITH THE EIGENVALUES OF THE COMPANION MATRIX
NONLINEAR MODELS IN FINANCE
Clustering Models
Regime Shifting Models
Models of Irregularly Spaced Data
Nonlinear DGP Models
CAUSALITY
SUMMARY
CHAPTER 4 - Common Pitfalls in Financial Modeling
THEORY AND ENGINEERING
ENGINEERING AND THEORETICAL SCIENCE
ENGINEERING AND PRODUCT DESIGN IN FINANCE
LEARNING, THEORETICAL, AND HYBRID APPROACHES TO PORTFOLIO MANAGEMENT
SAMPLE BIASES
THE BIAS IN AVERAGES
PITFALLS IN CHOOSING FROM LARGE DATA SETS
TIME AGGREGATION OF MODELS AND PITFALLS IN THE SELECTION OF DATA FREQUENCY
MODEL RISK AND ITS MITIGATION
Sources of Model Risk
The Information Theory Approach to Model Risk
Bayesian Modeling
Model Averaging and the Shrinkage Approach to Model Risk
Random Coefficients Models
SUMMARY
CHAPTER 5 - Factor Models and Their Estimation
THE NOTION OF FACTORS
STATIC FACTOR MODELS
Linear Factor Models
Empirical Indeterminacy of the Model and Factor Rotation
The Covariance Matrix of Observations
Using Factor Models
FACTOR ANALYSIS AND PRINCIPAL COMPONENTS ANALYSIS
Factor Analysis via Maximum Likelihood
The Expectation Maximization Algorithm
The E-Step
The M-Step
Factor Analysis via Principal Components
How to Determine the Number of Factors
WHY FACTOR MODELS OF RETURNS
The Size of Samples and Uniqueness of Factors
APPROXIMATE FACTOR MODELS OF RETURNS
DYNAMIC FACTOR MODELS
Dynamic Factor Models of Integrated Processes
Illustration of Principal Components Analysis
An Illustration of Factor Analysis
SUMMARY
CHAPTER 6 - Factor-Based Trading Strategies I: Factor Construction and Analysis
FACTOR-BASED TRADING
DEVELOPING FACTOR-BASED TRADING STRATEGIES
RISK TO TRADING STRATEGIES
DESIRABLE PROPERTIES OF FACTORS
SOURCES FOR FACTORS
BUILDING FACTORS FROM COMPANY CHARACTERISTICS
WORKING WITH DATA
Data Integrity
Potential Biases from Data
ANALYSIS OF FACTOR DATA
Example 1: EBITDA/EV
Example 2: Revisions
Example 3: Share Repurchase
SUMMARY
CHAPTER 7 - Factor-Based Trading Strategies II: Cross-Sectional Models and ...
CROSS-SECTIONAL METHODS FOR EVALUATION OF FACTOR PREMIUMS
FACTOR MODELS
PERFORMANCE EVALUATION OF FACTORS
MODEL CONSTRUCTION METHODOLOGIES FOR A FACTOR-BASED TRADING STRATEGY
BACKTESTING
BACKTESTING OUR FACTOR TRADING STRATEGY
SUMMARY
CHAPTER 8 - Portfolio Optimization: Basic Theory and Practice
MEAN-VARIANCE ANALYSIS: OVERVIEW
CLASSICAL FRAMEWORK FOR MEAN-VARIANCE OPTIMIZATION
MEAN-VARIANCE OPTIMIZATION WITH A RISK-FREE ASSET
Deriving the Capital Market Line
PORTFOLIO CONSTRAINTS COMMONLY USED IN PRACTICE
ESTIMATING THE INPUTS USED IN MEAN-VARIANCE OPTIMIZATION: EXPECTED RETURN AND RISK
PORTFOLIO OPTIMIZATION WITH OTHER RISK MEASURES
SUMMARY
CHAPTER 9 - Portfolio Optimization: Bayesian Techniques and the Black-Litterman Model
PRACTICAL PROBLEMS ENCOUNTERED IN MEAN-VARIANCE OPTIMIZATION
SHRINKAGE ESTIMATION
THE BLACK-LITTERMAN MODEL
SUMMARY
CHAPTER 10 - Robust Portfolio Optimization
ROBUST MEAN-VARIANCE FORMULATIONS
USING ROBUST MEAN-VARIANCE PORTFOLIO OPTIMIZATION IN PRACTICE
SOME PRACTICAL REMARKS ON ROBUST PORTFOLIO OPTIMIZATION MODELS
SUMMARY
CHAPTER 11 - Transaction Costs and Trade Execution
A TAXONOMY OF TRANSACTION COSTS
LIQUIDITY AND TRANSACTION COSTS
MARKET IMPACT MEASUREMENTS AND EMPIRICAL FINDINGS
FORECASTING AND MODELING MARKET IMPACT
INCORPORATING TRANSACTION COSTS IN ASSET-ALLOCATION MODELS
INTEGRATED PORTFOLIO MANAGEMENT: BEYOND EXPECTED RETURN AND PORTFOLIO RISK
SUMMARY
CHAPTER 12 - Investment Management and Algorithmic Trading
MARKET IMPACT AND THE ORDER BOOK
OPTIMAL EXECUTION
IMPACT MODELS
POPULAR ALGORITHMIC TRADING STRATEGIES
WHAT IS NEXT?
SOME COMMENTS ABOUT THE HIGH-FREQUENCY ARMS RACE
SUMMARY
APPENDIX A
APPENDIX B
APPENDIX C
Index
The Frank J. Fabozzi Series
Fixed Income Securities, Second Edition by Frank J. Fabozzi
Focus on Value: A Corporate and Investor Guide to Wealth Creation by James L. Grant and James A. Abate
Handbook of Global Fixed Income Calculations by Dragomir Krgin
Managing a Corporate Bond Portfolio by Leland E. Crabbe and Frank J. Fabozzi
Real Options and Option-Embedded Securities by William T. Moore
Capital Budgeting: Theory and Practice by Pamela P. Peterson and Frank J. Fabozzi
The Exchange-Traded Funds Manual by Gary L. Gastineau
Professional Perspectives on Fixed Income Portfolio Management, Volume 3 edited by Frank J. Fabozzi
Investing in Emerging Fixed Income Markets edited by Frank J. Fabozzi and Efstathia Pilarinu
Handbook of Alternative Assets by Mark J. P. Anson
The Global Money Markets by Frank J. Fabozzi, Steven V. Mann, and Moorad Choudhry
The Handbook of Financial Instruments edited by Frank J. Fabozzi
Collateralized Debt Obligations: Structures and Analysis by Laurie S. Goodman and Frank J. Fabozzi
Interest Rate, Term Structure, and Valuation Modeling edited by Frank J. Fabozzi
Investment Performance Measurement by Bruce J. Feibel
The Handbook of Equity Style Management edited by T. Daniel Coggin and Frank J. Fabozzi
The Theory and Practice of Investment Management edited by Frank J. Fabozzi and Harry M. Markowitz
Foundations of Economic Value Added, Second Edition by James L. Grant
Financial Management and Analysis, Second Edition by Frank J. Fabozzi and Pamela P. Peterson
Measuring and Controlling Interest Rate and Credit Risk, Second Edition by Frank J. Fabozzi, Steven V. Mann, and Moorad Choudhry
Professional Perspectives on Fixed Income Portfolio Management, Volume 4 edited by Frank J. Fabozzi
The Handbook of European Fixed Income Securities edited by Frank J. Fabozzi and Moorad Choudhry
The Handbook of European Structured Financial Products edited by Frank J. Fabozzi and Moorad Choudhry
The Mathematics of Financial Modeling and Investment Management by Sergio M. Focardi and Frank J. Fabozzi
Short Selling: Strategies, Risks, and Rewards edited by Frank J. Fabozzi
The Real Estate Investment Handbook by G. Timothy Haight and Daniel Singer
Market Neutral Strategies edited by Bruce I. Jacobs and Kenneth N. Levy
Securities Finance: Securities Lending and Repurchase Agreements edited by Frank J. Fabozzi and Steven V. Mann
Fat-Tailed and Skewed Asset Return Distributions by Svetlozar T. Rachev, Christian Menn, and Frank J. Fabozzi
Financial Modeling of the Equity Market: From CAPM to Cointegration by Frank J. Fabozzi, Sergio M. Focardi, and Petter N. Kolm
Advanced Bond Portfolio Management: Best Practices in Modeling and Strategies edited by Frank J. Fabozzi, Lionel Martellini, and Philippe Priaulet
Analysis of Financial Statements, Second Edition by Pamela P. Peterson and Frank J. Fabozzi
Collateralized Debt Obligations: Structures and Analysis, Second Edition by Douglas J. Lucas, Laurie S. Goodman, and Frank J. Fabozzi
Handbook of Alternative Assets, Second Edition by Mark J. P. Anson
Introduction to Structured Finance by Frank J. Fabozzi, Henry A. Davis, and Moorad Choudhry
Financial Econometrics by Svetlozar T. Rachev, Stefan Mittnik, Frank J. Fabozzi, Sergio M. Focardi, and Teo Jasic
Developments in Collateralized Debt Obligations: New Products and Insights by Douglas J. Lucas, Laurie S. Goodman, Frank J. Fabozzi, and Rebecca J. Manning
Robust Portfolio Optimization and Management by Frank J. Fabozzi, Petter N. Kolm, Dessislava A. Pachamanova, and Sergio M. Focardi
Advanced Stochastic Models, Risk Assessment, and Portfolio Optimization by Svetlozar T. Rachev, Stoyan V. Stoyanov, and Frank J. Fabozzi
How to Select Investment Managers and Evaluate Performance by G. Timothy Haight, Stephen O. Morrell, and Glenn E. Ross
Bayesian Methods in Finance by Svetlozar T. Rachev, John S. J. Hsu, Biliana S. Bagasheva, and Frank J. Fabozzi
Structured Products and Related Credit Derivatives by Brian P. Lancaster, Glenn M. Schultz, and Frank J. Fabozzi
Copyright © 2010 by John Wiley & Sons, Inc. All rights reserved.
Published by John Wiley & Sons, Inc., Hoboken, New Jersey. Published simultaneously in Canada.
No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 646-8600, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permissions.
Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993, or fax (317) 572-4002.
Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic books. For more information about Wiley products, visit our web site at www.wiley.com.
Library of Congress Cataloging-in-Publication Data:
Fabozzi, Frank J.
Quantitative equity investing : techniques and strategies / Frank J. Fabozzi, Sergio M. Focardi, Petter N. Kolm ; with the assistance of Joseph A. Cerniglia and Dessislava Pachamanova.
p. cm.—(The Frank J. Fabozzi series)
Includes index.
eISBN: 978-0-470-61752-6
1. Portfolio management. 2. Investments. I. Focardi, Sergio. II. Kolm, Petter N. III. Title. HG4529.5.F.63’2042—dc22
2009050962
FJF: To my wife Donna, and my children Francesco, Patricia, and Karly
SMF: To my mother and in memory of my father
PNK: To my wife and my daughter, Carmen and Kimberly, and in memory of my father-in-law, John
Preface
Quantitative equity portfolio management is a fundamental building block of investment management. The basic principles of investment management were proposed back in the 1950s in the pathbreaking work of Harry Markowitz, work for which he was awarded the Nobel Memorial Prize in Economic Sciences in 1990. Markowitz's ideas proved to be very fertile. Entire new research areas originated from them and, with the diffusion of low-cost powerful computers, found important practical applications in several fields of finance.
Among the developments that followed Markowitz’s original approach we can mention:
• The development of CAPM and of general equilibrium asset pricing models.
• The development of multifactor models.
• The extension of the investment framework to a dynamic multiperiod environment.
• The development of statistical tools to extend his framework to fat-tailed distributions.
• The development of Bayesian techniques to integrate human judgment with results from models.
• The progressive adoption of optimization and robust optimization techniques.
Due to these and other theoretical advances it has progressively become possible to manage investments with computer programs that look for the best risk-return trade-off available in the market.
People have always tried to beat the market in the hunt for a free lunch. This began with simple observations and rules of thumb for picking winners; later, the advent of computers brought much more complicated systems and mathematical models within common reach. Today, so-called buy-side quants deploy a wide range of techniques, from econometrics, optimization, and computer science to data mining, machine learning, and artificial intelligence, to trade the equity markets. Their strategies range from intermediate- and long-term strategies, six months to several years out, to so-called high- or ultra-high-frequency strategies operating at the sub-millisecond level. Modern quantitative techniques have replaced good old-fashioned experience and market insight with the scientific rigor of mathematical and financial theories.
This book is about quantitative equity portfolio management performed with modern techniques. One of our goals for this book is to present advances in the theory and practice of quantitative equity portfolio management that represent what we might call the “state of the art of advanced equity portfolio management.” We cover the most common techniques, tools, and strategies used in quantitative equity portfolio management in the industry today. For many of the advanced topics, we provide the reader with references to the most recent applicable research in the field.
This book is intended for students, academics, and financial practitioners alike who want an up-to-date treatment of quantitative techniques in equity portfolio management, and who desire to deepen their knowledge of some of the most cutting-edge techniques in this rapidly developing area. The book is written in an almost self-contained fashion, so that little background knowledge in finance is needed. Nonetheless, basic working knowledge of undergraduate linear algebra and probability theory are useful, especially for the more mathematical topics in this book.
In Chapter 1 we discuss the role and use of mathematical techniques in finance. In addition to offering theoretical arguments in support of finance as a mathematical science, we discuss the results of three surveys on the diffusion of quantitative methods in the management of equity portfolios. In Chapters 2 and 3, we provide extensive background material on one of the principal tools used in quantitative equity management, financial econometrics. Coverage in Chapter 2 includes modern regression theory, applications of Random Matrix Theory, and robust methods. In Chapter 3, we extend our coverage of financial economics to dynamic models of times series, vector autoregressive models, and cointegration analysis. Financial engineering, the many pitfalls of estimation, and methods to control model risk are the subjects of Chapter 4. In Chapter 5, we introduce the modern theory of factor models, including approximate factor models and dynamic factor models.
Trading strategies based on factors and factor models are the focus of Chapters 6 and 7. In these chapters we offer a modern view on how to construct factor models based on fundamental factors and how to design and test trading strategies based on these. We offer a wealth of practical examples on the application of factor models in these chapters.
The coverage in Chapters 8, 9, and 10 is on the use of optimization models in quantitative equity management. The basics of portfolio optimization are reviewed in Chapter 8, followed by a discussion of the Bayesian approach to investment management, as implemented in the Black-Litterman framework, in Chapter 9. In Chapter 10 we discuss robust optimization techniques, which have greatly enhanced the ability to implement portfolio optimization models in practice.
The last two chapters of the book cover the important topic of trading costs and trading techniques. In Chapter 11, our focus is on the issues related to trading cost and implementation of trading strategies from a practical point of view. The modern techniques of algorithmic trading are the subject of the final chapter in the book, Chapter 12.
There are three appendixes. Appendix A provides a description of the data and factor definitions used in the illustrations and examples in the book. A summary of the factors, their economic rationale, and references that have supported the use of each factor is provided in Appendix B. In Appendix C we provide a review of eigenvalues and eigenvectors.

TEACHING USING THIS BOOK

Many of the chapters in this book have been used in courses and workshops on quantitative investment management, econometrics, trading strategies, and algorithmic trading. The topics of the book are appropriate for advanced undergraduate electives on investment management and for graduate students in finance, economics, or the mathematical and physical sciences.
For a typical course it is natural to start with Chapters 1-3, 5, and 8 where the quantitative investment management industry, standard econometric techniques, and modern portfolio and asset pricing theory are reviewed. Important practical considerations such as model risk and its mitigation are presented in Chapter 4. Chapters 6 and 7 focus on the development of factor-based trading strategies and provide many practical examples. Chapters 9-12 cover the important topics of Bayesian techniques, robust optimization, and transaction cost modeling—by now standard tools used in quantitative portfolio construction in the financial industry. We recommend that a more advanced course covers these topics in some detail.
Student projects can be based on specialized topics such as the development of trading strategies (in Chapters 6 and 7), optimal execution, and algorithmic trading (in Chapters 11 and 12). The many references in these chapters, and in the rest of the book, provide a good starting point for research.

ACKNOWLEDGMENTS

We would like to acknowledge the assistance of several individuals who contributed to this book. Chapters 6 and 7 on trading strategies were coauthored with Joseph A. Cerniglia of Aberdeen Asset Management Inc. Chapter 10 on robust portfolio optimization was coauthored with Dessislava Pachamanova of Babson College. Chapter 12 draws from a chapter by one of the authors and Lee Maclin, adjunct professor at the Courant Institute of Mathematical Sciences, New York University, that will appear in the Encyclopedia of Quantitative Finance, edited by Rama Cont and to be published by John Wiley & Sons.
We also thank Axioma, Inc. for allowing us to use several figures from its white paper series co-authored by Sebastian Ceria and Robert Stubbs.
Megan Orem typeset the book and provided editorial assistance. We appreciate her patience and understanding in working through numerous revisions.
Frank J. Fabozzi
Sergio M. Focardi
Petter N. Kolm
About the Authors
Frank J. Fabozzi is Professor in the Practice of Finance in the School of Management at Yale University and an Affiliated Professor at the University of Karlsruhe's Institute of Statistics, Econometrics and Mathematical Finance. Prior to joining the Yale faculty, he was a Visiting Professor of Finance in the Sloan School at MIT. Frank is a Fellow of the International Center for Finance at Yale University and on the Advisory Council for the Department of Operations Research and Financial Engineering at Princeton University. He is the editor of the Journal of Portfolio Management and a trustee for the BlackRock family of closed-end funds. In 2002, Frank was inducted into the Fixed Income Analysts Society's Hall of Fame, and he is the 2007 recipient of the C. Stewart Sheppard Award given by the CFA Institute. His recently coauthored books published by Wiley include Institutional Investment Management (2009), Finance: Capital Markets, Financial Management and Investment Management (2009), Bayesian Methods in Finance (2008), Advanced Stochastic Models, Risk Assessment, and Portfolio Optimization: The Ideal Risk, Uncertainty, and Performance Measures (2008), Financial Modeling of the Equity Market: From CAPM to Cointegration (2006), Robust Portfolio Optimization and Management (2007), and Financial Econometrics: From Basics to Advanced Modeling Techniques (2007). Frank earned a doctorate in economics from the City University of New York in 1972 and holds the Chartered Financial Analyst and Certified Public Accountant designations.
Sergio Focardi is Professor of Finance at the EDHEC Business School in Nice and the founding partner of the Paris-based consulting firm The Intertek Group. He is a member of the editorial board of the Journal of Portfolio Management. Sergio has authored numerous articles and books on financial modeling and risk management, including the following Wiley books: Financial Econometrics (2007), Financial Modeling of the Equity Market (2006), The Mathematics of Financial Modeling and Investment Management (2004), Risk Management: Framework, Methods and Practice (1998), and Modeling the Markets: New Theories and Techniques (1997). He also authored two monographs published by the CFA Institute: Challenges in Quantitative Equity Management (2008) and Trends in Quantitative Finance (2006). Sergio has been appointed as a speaker in the CFA Institute Speaker Retainer Program. His research interests include the econometrics of large equity portfolios and the modeling of regime changes. Sergio holds a degree in Electronic Engineering from the University of Genoa and a PhD in Mathematical Finance and Financial Econometrics from the University of Karlsruhe.
Petter N. Kolm is the Deputy Director of the Mathematics in Finance Master's Program and Clinical Associate Professor at the Courant Institute of Mathematical Sciences, New York University, and a Founding Partner of the New York-based financial consulting firm, the Heimdall Group, LLC. Previously, Petter worked in the Quantitative Strategies Group at Goldman Sachs Asset Management, where his responsibilities included researching and developing new quantitative investment strategies for the group's hedge fund. Petter coauthored the books Financial Modeling of the Equity Market: From CAPM to Cointegration (Wiley, 2006), Trends in Quantitative Finance (CFA Institute, 2006), and Robust Portfolio Optimization and Management (Wiley, 2007). His interests include high-frequency finance, algorithmic trading, quantitative trading strategies, financial econometrics, risk management, and optimal portfolio strategies. Petter holds a doctorate in mathematics from Yale University, an M.Phil. in applied mathematics from the Royal Institute of Technology in Stockholm, and an M.S. in mathematics from ETH Zürich. Petter is a member of the editorial board of the Journal of Portfolio Management.
CHAPTER 1
Introduction
An economy can be regarded as a machine that takes labor and natural resources as inputs and outputs products and services. Studying this machine from a physical point of view would be very difficult because we would have to study the characteristics of, and the interrelationships among, all modern engineering and production processes. Economics takes a bird's-eye view of these processes and attempts to study the dynamics of the economic value associated with the structure of the economy and its inputs and outputs. Economics is by nature a quantitative science, though it is difficult to find simple rules that link economic quantities.
In most economies value is presently obtained through a market process where supply meets demand. Here is where finance and financial markets come into play. They provide the tools to optimize the allocation of resources through time and space and to manage risk. Finance is by nature quantitative like economics but it is subject to a large level of risk. It is the measurement of risk and the implementation of decision-making processes based on risk that makes finance a quantitative science and not simply accounting.
Equity investing is one of the most fundamental processes of finance. It allows the savings of households to be allocated to investments in the productive activities of an economy. This investment process is a fundamental economic enabler: without equity investment it would be very difficult for an economy to function properly and grow. With the diffusion of affordable fast computers and with progress made in understanding financial processes, financial modeling has become a determinant of investment decision-making processes. Despite the growing diffusion of financial modeling, objections to its use are often raised.
In the second half of the 1990s, there was so much skepticism about quantitative equity investing that David Leinweber, a pioneer in applying advanced techniques borrowed from the world of physics to fund management, and author of Nerds on Wall Street,1 wrote an article entitled: “Is quantitative investment dead?”2 In the article, Leinweber defended quantitative fund management and maintained that in an era of ever faster computers and ever larger databases, quantitative investment was here to stay. The skepticism toward quantitative fund management, provoked by the failure of some high-profile quantitative funds at that time, was related to the fact that investment professionals felt that capturing market inefficiencies could best be done by exercising human judgment.
Despite the mainstream academic opinion that markets are efficient and unpredictable, the asset manager's job is to capture market inefficiencies and translate them into enhanced returns for clients. At the academic level, the notion of efficient markets has been progressively relaxed. Empirical evidence led to the acceptance of the notion that financial markets are somewhat predictable and that systematic market inefficiencies can be detected. There has been a growing body of evidence that there are market anomalies that can be systematically exploited to earn excess profits after considering risk and transaction costs.3 In the face of this evidence, Andrew Lo proposed replacing the efficient market hypothesis with the adaptive market hypothesis, in which market inefficiencies appear as the market adapts to changes in a competitive environment.
In this scenario, a quantitative equity investment management process is characterized by the use of computerized rules as the primary source of decisions. In a quantitative process, human intervention is limited to a control function that intervenes only exceptionally to modify decisions made by computers. We can say that a quantitative process is a process that quantifies things. The notion of quantifying things is central to any modern science, including the dismal science of economics. Note that everything related to accounting—balance sheet/income statement data, and even accounting at the national level—is by nature quantitative. So, in a narrow sense, finance has always been quantitative. The novelty is that we are now quantifying things that are not directly observed, such as risk, or things that are not quantitative per se, such as market sentiment, and that we seek simple rules to link these quantities.
In this book we explain techniques for quantitative equity investing. Our purpose in this chapter is threefold. First, we discuss the relationship between mathematics and equity investing and look at the objections raised. We attempt to show that most objections are misplaced. Second, we discuss the results of three studies, based on surveys and interviews of major market participants, whose objective was to assess the state of quantitative equity portfolio management and its implications for equity portfolio managers. The results of these three studies are helpful in understanding the current state of quantitative equity investing, its trends, challenges, and implementation issues. Third, we discuss the challenges ahead for quantitative equity investing.

IN PRAISE OF MATHEMATICAL FINANCE

Is the use of mathematics to describe and predict financial and economic phenomena appropriate? The question was first raised at the end of the nineteenth century when Vilfredo Pareto and Leon Walras made an initial attempt to formalize economics. Since then, financial economic theorists have been divided into two camps: those who believe that economics is a science and can thus be described by mathematics and those who believe that economic phenomena are intrinsically different from physical phenomena which can be described by mathematics.
In a tribute to Paul Samuelson, Robert Merton wrote:
Although most would agree that finance, micro investment theory and much of the economics of uncertainty are within the sphere of modern financial economics, the boundaries of this sphere, like those of other specialties, are both permeable and flexible. It is enough to say here that the core of the subject is the study of the individual behavior of households in the intertemporal allocation of their resources in an environment of uncertainty and of the role of economic organizations in facilitating these allocations. It is the complexity of the interaction of time and uncertainty that provides intrinsic excitement to study of the subject, and, indeed, the mathematics of financial economics contains some of the most interesting applications of probability and optimization theory. Yet, for all its seemingly obtrusive mathematical complexity, the research has had a direct and significant influence on practice.4
The three principal objections to treating finance economic theory as a mathematical science we will discuss are that (1) financial markets are driven by unpredictable unique events and, consequently, attempts to use mathematics to describe and predict financial phenomena are futile, (2) financial phenomena are driven by forces and events that cannot be quantified, though we can use intuition and judgment to form a meaningful financial discourse, and (3) although we can indeed quantify financial phenomena, we cannot predict or even describe financial phenomena with realistic mathematical expressions and/or computational procedures because the laws themselves change continuously.
A key criticism of the application of mathematics to financial economics concerns the role of uncertainty. As there are unpredictable events with a potentially major impact on the economy, it is claimed that financial economics cannot be formalized as a mathematical methodology with predictive power. In a nutshell, the answer is that black swans exist not only in financial markets but also in the physical sciences. But no one questions the use of mathematics in the physical sciences simply because there are major events that we cannot predict. The same should hold true for finance. Mathematics can be used to understand financial markets and help to avoid catastrophic events.5 However, it is not necessarily true that science and mathematics will enable unlimited profitable speculation. Science will allow one to discriminate between rational, predictable systems and highly risky, unpredictable systems.
There are reasons to believe that financial economic laws must include some fundamental uncertainty. The argument is, on a more general level, the same used to show that there cannot be arbitrage opportunities in financial markets. Consider that economic agents are intelligent agents who can use scientific knowledge to make forecasts.
Were financial economic laws deterministic, agents could make (and act on) deterministic forecasts. But this would imply a perfect consensus between agents to ensure that there is no contradiction between forecasts and the actions determined by the same forecasts. For example, all investment opportunities should have exactly identical payoffs. Only a perfectly and completely planned economy can be deterministic; any other economy must include an element of uncertainty.
In finance, the mathematical handling of uncertainty is based on probabilities learned from data. However, we have only one sample of small size and cannot run controlled experiments. Having only one sample, the only rigorous way to apply statistical models is to invoke ergodicity. An ergodic process is a stationary process where the limit of time averages is equal to time-invariant ensemble averages. Note that in financial modeling it is not necessary that economic quantities themselves form ergodic processes, only that the residuals after modeling form an ergodic process. In practice, we would like models to extract all meaningful information and leave a sequence of white noise residuals.
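To make the notion of ergodicity concrete, the following minimal sketch (ours, not from the book; all parameters are illustrative) simulates a stationary AR(1) process and compares the time average computed along one long path with the ensemble average computed across many independent paths. For an ergodic process the two estimates agree.

```python
# Illustration (ours): for an ergodic stationary AR(1) process, the time
# average along one long path converges to the ensemble average across
# many independent paths.
import numpy as np

rng = np.random.default_rng(0)
phi, mu, sigma = 0.6, 0.1, 1.0   # AR(1): x_t = mu + phi*(x_{t-1} - mu) + eps_t

def simulate_ar1(n_steps, n_paths):
    x = np.full(n_paths, mu)                  # start at the unconditional mean
    paths = np.empty((n_steps, n_paths))
    for t in range(n_steps):
        x = mu + phi * (x - mu) + sigma * rng.standard_normal(n_paths)
        paths[t] = x
    return paths

one_path = simulate_ar1(n_steps=100_000, n_paths=1)
many_paths = simulate_ar1(n_steps=100, n_paths=100_000)

time_average = one_path.mean()            # average over time, single path
ensemble_average = many_paths[-1].mean()  # average over paths, single date

print(f"time average:     {time_average:.4f}")
print(f"ensemble average: {ensemble_average:.4f}")
print(f"true mean:        {mu:.4f}")
```

For a nonstationary process such as a random walk, the two averages would not agree, which is precisely why ergodicity must be invoked before probabilities can be learned from a single historical sample.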
If we could produce models that generate white noise residuals over extended periods of time, we would interpret uncertainty as probability and probability as relative frequency. However, we cannot produce such models because we do not have a firm theory known a priori. Our models are a combination of theoretical considerations, estimation, and learning; they are adaptive structures that need to be continuously updated and modified.
Uncertainty in forecasts is due not only to the probabilistic uncertainty inherent in stochastic models but also to the possibility that the models themselves are misspecified. Model uncertainty cannot be measured with the usual concept of probability because this uncertainty itself is due to unpredictable changes. Ultimately, the case for mathematical financial economics hinges on our ability to create models that maintain their descriptive and predictive power even if there are sudden unpredictable changes in financial markets. It is not the large unpredictable events that are the challenge to mathematical financial economics, but our ability to create models able to recognize these events.
This situation is not confined to financial economics. It is now recognized that there are physical systems that are totally unpredictable. These systems can be human artifacts or natural systems. With the development of nonlinear dynamics, it has been demonstrated that we can build artifacts whose behavior is unpredictable. There are examples of unpredictable artifacts of practical importance. Turbulence, for example, is a chaotic phenomenon. The behavior of an airplane can become unpredictable under turbulence. There are many natural phenomena from genetic mutations to tsunami and earthquakes whose development is highly nonlinear and cannot be individually predicted. But we do not reject mathematics in the physical sciences because there are events that cannot be predicted. On the contrary, we use mathematics to understand where we can find regions of dangerous unpredictability. We do not knowingly fly an airplane in extreme turbulence and we refrain from building dangerous structures that exhibit catastrophic behavior. Principles of safe design are part of sound engineering.
Financial markets are no exception. Financial markets are designed artifacts: we can make them more or less unpredictable. We can use mathematics to understand the conditions that make financial markets subject to nonlinear behavior with possibly catastrophic consequences. We can improve our knowledge of what variables we need to control in order to avoid entering chaotic regions.
It is therefore not reasonable to object that mathematics cannot be used in finance because there are unpredictable events with major consequences. It is true that there are unpredictable financial markets where we cannot use mathematics except to recognize that these markets are unpredictable. But we can use mathematics to make financial markets safer and more stable.6
Let us now turn to the objection that we cannot use mathematics in finance because the financial discourse is inherently qualitative and cannot be formalized in mathematical expressions. For example, it is objected that qualitative elements such as the quality of management or the culture of a firm are important considerations that cannot be formalized in mathematical expressions.
A partial acceptance of this point of view has led to the development of techniques to combine human judgment with models. These techniques range from simply counting analysts’ opinions to sophisticated Bayesian methods that incorporate qualitative judgment into mathematical models. These hybrid methodologies link models based on data with human overlays.
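As a minimal illustration of the simplest such combination (our sketch, with made-up numbers, not a method prescribed by the book), a model-based return forecast and a judgmental analyst view can each be expressed as a normal distribution and blended by precision weighting, so that the more confident source receives the larger weight.

```python
# Sketch (ours, illustrative numbers): a simple Bayesian blend of a
# model-based return forecast with a qualitative analyst view, each
# expressed as a normal distribution and combined by precision weighting.
model_mean, model_var = 0.04, 0.02 ** 2   # model forecast: 4%, std 2%
view_mean, view_var = 0.08, 0.04 ** 2     # analyst view: 8%, std 4%

# Weight on the model is its share of total precision (inverse variance)
w_model = (1 / model_var) / (1 / model_var + 1 / view_var)
blended_mean = w_model * model_mean + (1 - w_model) * view_mean
blended_var = 1 / (1 / model_var + 1 / view_var)

print(f"weight on model: {w_model:.2f}")
print(f"blended forecast: {blended_mean:.4f} (std {blended_var ** 0.5:.4f})")
```

The same precision-weighting logic, generalized to many assets and views, is the core of the Black-Litterman framework discussed in Chapter 9.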
Is there any irreducibly judgmental process in finance? Consider that in finance, all data important for decision-making are quantitative or can be expressed in terms of logical relationships. Prices, profits, and losses are quantitative, as are corporate balance-sheet data. Links between companies and markets can be described through logical structures. Starting from these data we can construct theoretical terms such as volatility. Are there hidden elements that cannot be quantified or described logically?
Ultimately, in finance, the belief in hidden elements that cannot be either quantified or logically described is related to the fact that economic agents are human agents with a decision-making process. The operational point of view of Samuelson has been replaced by the neoclassical economics view that, apparently, places the emphasis on agents' decision-making. It is curious that the agent of neoclassical economics is not a realistic human agent but a mathematical optimizer described by a utility function.
Do we need anything that cannot be quantified or expressed in logical terms? At this stage of science, we can say the answer is a qualified no, if we consider markets in the aggregate. Human behavior is predictable in the aggregate and with statistical methods. Interaction between individuals, at least at the level of economic exchange, can be described with logical tools. We have developed many mathematical tools that allow us to describe critical points of aggregation that might lead to those situations of unpredictability described by complex systems theory.
We can conclude that the objection of hidden qualitative variables should be rejected. If we work at the aggregate level and admit uncertainty, there is no reason why we have to admit inherently qualitative judgment. In practice, we integrate qualitative judgment with models because (presently) it would be impractical or too costly to model all variables. If we consider modeling individual decision-making at the present stage of science, we have no definitive answer. Whenever financial markets depend on single decisions of single individuals we are in the presence of uncertainty that cannot be quantified. However, we have situations of this type in the physical sciences and we do not consider them an obstacle to the development of a mathematical science.
Let us now address a third objection to the use of mathematics in finance. It is sometimes argued that we cannot arrive at mathematical laws in finance because the laws themselves keep changing. This objection is somewhat true, and addressing it has led to the development of methods specific to financial economics. First, observe that many physical systems are characterized by changing laws. For example, if we monitor the behavior of complex artifacts such as nuclear reactors, we find that their behavior changes with aging. We can consider these changes as structural breaks. Obviously one could object that if we had more information we could establish a precise time-invariant law. Still, if the artifact is complex, and especially if we cannot access all its parts, we might experience true structural breaks. For example, if we are monitoring the behavior of a nuclear reactor we might not be able to inspect it properly. Many natural systems such as volcanoes cannot be properly inspected and structurally described. We can only monitor their behavior, trying to find predictive laws. We might find that our laws change abruptly or continuously. We assume that we could identify more complex laws if we had all the requisite information, though, in practice, we do not have this information.
These remarks show that the objection of changing laws is less strong than we might intuitively believe. The real problem is not that the laws of finance change continuously; it is that they are too complex. We do not have enough theoretical knowledge to determine the laws of finance and, if we try to estimate statistical models, we do not have enough data to estimate complex models. Stated differently, the question is not whether we can use mathematics in financial economic theory. The real question is: How much information can we obtain in studying financial markets? Laws and models in finance are highly uncertain. One partial solution is to use adaptive models. Adaptive models are formed by simple models plus rules to change the parameters of the simple models. A typical example is nonlinear state-space models, which are formed by a simple regression plus another process that continuously adapts the model parameters. Other examples are hidden Markov models, which might represent prices as sequences of random walks with different parameters.
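The following minimal sketch (ours; the data-generating process and the forgetting factor are illustrative assumptions) shows the simplest instance of this "simple model plus adaptation rule" structure: recursive least squares with a forgetting factor, which continuously re-estimates a regression coefficient and so can track a parameter that drifts over time.

```python
# Minimal sketch (ours): a "simple model plus adaptation rule".
# Recursive least squares with a forgetting factor tracks a regression
# coefficient beta_t that drifts over time in simulated data.
import numpy as np

rng = np.random.default_rng(1)
T = 2000
lam = 0.99                       # forgetting factor: lower = faster adaptation

# Simulated data: y_t = beta_t * x_t + noise, with slowly drifting beta_t
x = rng.standard_normal(T)
beta_true = 0.5 + 0.5 * np.sin(np.linspace(0, 4 * np.pi, T))
y = beta_true * x + 0.5 * rng.standard_normal(T)

beta_hat = 0.0                   # current coefficient estimate
P = 1000.0                       # estimation uncertainty (inverse information)
estimates = np.empty(T)
for t in range(T):
    err = y[t] - beta_hat * x[t]              # one-step-ahead forecast error
    gain = P * x[t] / (lam + P * x[t] ** 2)   # adaptation gain
    beta_hat += gain * err                    # update coefficient toward data
    P = (P - gain * x[t] * P) / lam           # discount old information
    estimates[t] = beta_hat

print(f"final estimate {estimates[-1]:.3f} vs true {beta_true[-1]:.3f}")
```

With lam = 1 the scheme reduces to ordinary least squares on all past data; discounting old observations is what lets the estimate follow a changing law at the cost of higher variance.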
We can therefore conclude that the objection that there is no fixed law in financial economics cannot be solved a priori. Empirically we find that simple models cannot describe financial markets over long periods of time: if we turn to adaptive modeling, we are left with a residual high level of uncertainty.
Our overall conclusion is twofold. First, we can and indeed should regard mathematical finance as a discipline with methods and mathematics specific to the type of empirical data available in the discipline. Given the state of continuous change in our economies, we cannot force mathematical finance into the same paradigm of classical mathematical physics based on differential equations. Mathematical finance needs adaptive, nonlinear models that are able to adapt in a timely fashion to a changing empirical environment.
This is not to say that mathematical finance is equivalent to data-mining. On the contrary, we have to use all available knowledge and theoretical reasoning on financial economics. However, models cannot be crystallized in time-invariant models. In the future, it might be possible to achieve the goal of stable time-invariant models but, for the moment, we have to admit that mathematical finance needs adaptation and must make use of computer simulations. Even with the resources of modern adaptive computational methods, there will continue to be a large amount of uncertainty in mathematical finance, not only as probability distributions embedded in models but also as residual model uncertainty. When changes occur, there will be disruption of model performance and the need to adapt models to new situations. But this does not justify rejecting mathematical finance. Mathematical finance can indeed tell us what situations are more dangerous and might lead to disruptions. Through simulations and models of complex structure, we can achieve an understanding of those situations that are most critical.
Economies and financial markets are engineered artifacts. We can use our science to engineer economic and financial systems that are safer or we can decide, in the end, to prefer risk-taking and its highly skewed rewards. Of course we might object that uncertainty about the path our societies will take is part of the global problem of uncertainty. This objection is the objection of complex system theorists to reductionism. We can study a system with our fundamental laws once we know the initial and boundary conditions but we cannot explain how initial and boundary conditions were formed. These speculations are theoretically important but we should avoid a sense of passive fatality. In practice, it is important that we are aware that we have the tools to design safer financial systems and do not regard the path towards unpredictability as inevitable.

STUDIES OF THE USE OF QUANTITATIVE EQUITY MANAGEMENT

There are three recent studies on the use of quantitative equity management conducted by Intertek Partners. The studies are based on surveys and interviews of market participants. We will refer to these studies as the 2003 Intertek European study,7 2006 Intertek study,8 and 2007 Intertek study.9

2003 Intertek European Study

The 2003 Intertek European study deals with the use of financial modeling at European asset management firms. It is based on studies conducted by The Intertek Group to evaluate model performance following the fall of the markets from their peak in March 2000, and explores changes that have occurred since then. In total, 61 managers at European asset management firms in the Benelux countries, France, Germany, Italy, Scandinavia, Switzerland, and the U.K. were interviewed. (The study does not cover alternative investment firms such as hedge funds.) At least half of the firms interviewed are among the major players in their respective markets, with assets under management ranging from €50 to €300 billion.
The major findings are summarized next.10

Greater Role for Models

In the two years following the March 2000 market highs, quantitative methods in the investment decision-making process began to play a greater role. Almost 75% of the firms interviewed reported this to be the case, while roughly 15% reported that the role of models had remained stable. The remaining 10% noted that their processes were already essentially quantitative. The role of models had also grown in another sense; a higher percentage of assets were being managed by funds run quantitatively. One firm reported that over the past two years assets in funds managed quantitatively grew by 50%.
Large European firms had been steadily catching up with their U.S. counterparts in terms of the breadth and depth of use of models. As the price of computers and computer software dropped, even small firms reported that they were beginning to adopt quantitative models. There were still differences between American and European firms, though. American firms tended to use relatively simple technology but on a large scale; Europeans tended to adopt sophisticated statistical methods but on a smaller scale.
Demand pull and management push were among the reasons cited for the growing role of models. On the demand side, asset managers were under pressure to produce returns while controlling risk; they were beginning to explore the potential of quantitative methods. On the push side, several sources remarked that, after tracking performance for several years, their management had made a positive evaluation of a model-driven approach against a judgment-driven decision-making process. In some cases, this led to a corporate switch to a quantitative decision-making process; in other instances, it led to shifting more assets into quantitatively managed funds.
Modeling was reported to have been extended over an ever greater universe of assets under management. Besides bringing greater structure and discipline to the process, participants in the study remarked that models helped contain costs. Unable to increase revenues in the period immediately following the March 2000 market decline, many firms were cutting costs. Modeling budgets, however, were reported as being largely spared. About 68% of the participants said that their investment in modeling had grown over the prior two years, while 50% expected their investments in modeling to continue to grow over the next year.
Client demand for risk control was another factor that drove the increased use of modeling. Pressure from institutional investors and consultants in particular continued to work in favor of modeling.
More generally, risk management was widely believed to be the key driving force behind the use of models.
Some firms mentioned they had recast the role of models in portfolio management. Rather than using models to screen and rank assets—which has been a typical application in Europe—they applied them after the asset manager had acted in order to measure the pertinence of fundamental analysis, characterize the portfolio style, possibly transform products through derivatives, optimize the portfolio, and track risk and performance.

Performance of Models Improves

Over one-half of the study's participants responded that models performed better in 2002 than two years earlier. Some 20% evaluated 2002 model performance as stable relative to two years earlier, while another 20% considered that performance had worsened. Participants often noted that it was not models in general but specific models that had performed better or more poorly.
There are several explanations for the improved performance of models. Every model is, ultimately, a statistical device trained and estimated on past data. When markets began to fall from their peak in March 2000, models had not been trained on data that would have allowed them to capture the downturn—hence, the temporary poor performance of some models. Even risk estimates, more stable than expected return estimates, were problematic. In many cases, it was difficult to distinguish between volatility and model risk. Models have since been trained on new sets of data and are reportedly performing better.
From a strictly scientific and economic theory point of view, the question of overall model performance is not easy to address. The basic question is how well a theory describes reality, with the additional complication that in economics uncertainty is part of the theory. As we observed in the previous section, we cannot object to financial modeling, but neither can we expect a priori that model performance will be good. Modeling should reflect the objective amount of uncertainty present in a financial process. The statement that “models perform better” implies that the level of uncertainty has changed. To make this discussion meaningful, we clearly have to restrict the universe of models under consideration somehow. In general, the uncertainty associated with forecasting within a given class of models is equated to market volatility. And as market volatility is not an observable quantity but a hidden one, it is model-dependent.11 In other words, the amount of uncertainty in financial markets depends on the accuracy of models. For instance, an ARCH-GARCH model will give an estimate of volatility different from that of a model based on constant volatility. On top of volatility, however, there is another source of uncertainty, which is the risk that the model is misspecified. The latter uncertainty is generally referred to as model risk.
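To illustrate the point that the volatility estimate depends on the model, the following sketch (ours; all parameters are illustrative) applies two simple volatility models to the same simulated return series: a constant-volatility model, which produces a single number, and an EWMA (RiskMetrics-style) model, which produces a time-varying estimate.

```python
# Sketch (ours): the volatility estimate depends on the model chosen.
# Constant-volatility and EWMA estimators give different answers on the
# same return series.
import numpy as np

rng = np.random.default_rng(2)

# Simulate returns from a GARCH(1,1) so that volatility truly varies
omega, alpha, beta = 0.05, 0.10, 0.85
T = 1000
h = omega / (1 - alpha - beta)       # unconditional variance
returns = np.empty(T)
for t in range(T):
    returns[t] = np.sqrt(h) * rng.standard_normal()
    h = omega + alpha * returns[t] ** 2 + beta * h

const_vol = returns.std()            # constant-volatility model: one number

lam = 0.94                           # EWMA decay (RiskMetrics convention)
ewma_var = returns[0] ** 2
for r in returns[1:]:
    ewma_var = lam * ewma_var + (1 - lam) * r ** 2

print(f"constant-volatility estimate: {const_vol:.3f}")
print(f"EWMA estimate (last date):    {np.sqrt(ewma_var):.3f}")
```

Since both numbers are legitimate estimates under their respective model assumptions, the "true" level of market uncertainty is not observed directly; it is inferred through whichever model is adopted.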
The problem experienced when markets began to fall was that models could not forecast volatility simply because they were grossly misspecified. A common belief is that markets are now highly volatile, which is another way of saying that models do not do a good job of predicting returns. Yet models are now more coherent; fluctuations of returns are synchronized with expectations regarding volatility. Model risk has been reduced substantially.
Overall, the global perception of European market participants who participated in the study was that models are now more dependable. This meant that model risk had been reduced; although their ability to predict returns had not substantially improved, models were better at predicting risk. Practitioners’ evaluation of model performance can be summarized as follows: (1) models will bring more and more insight in risk management, (2) in stock selection, we will see some improvement due essentially to better data, not better models, and (3) in asset allocation, the use of models will remain difficult as markets remain difficult to predict.
Despite the improved performance of models, the perception European market participants shared was one of uncertainty as regards the macroeconomic trends of the markets. Volatility, structural change, and unforecastable events continue to challenge models. In addition to facing uncertainty related to a stream of unpleasant surprises as regards corporate accounting at large public firms, participants voiced the concern that there is considerable fundamental uncertainty on the direction of financial flows.
A widely shared evaluation was that, independent of the models themselves, the understanding of models and their limits had improved. Most traders and portfolio managers had at least some training in statistics and finance theory; computer literacy had greatly increased. As a consequence, the majority of market participants understand at least elementary statistical analyses of markets.

Use of Multiple Models on the Rise

According to the 2003 study’s findings, three major trends had emerged in Europe over the prior few years: (1) a greater use of multiple models, (2) the modeling of additional new factors, and (3) an increased use of value-based models.
Let’s first comment on the use of multiple models from the point of view of modern financial econometrics, and in particular from the point of view of the mitigation of model risk. The present landscape of financial modeling applied to investment management is vast and well articulated.12
Financial models are typically econometric models; they do not follow laws of nature but are approximate models with limited validity. Every model has an associated model risk, which can be roughly defined as the probability that the model does not forecast correctly. Note that it does not make sense to consider model risk in the abstract, against every possible assumption; model risk can be meaningfully defined only by restricting the set of alternative assumptions. For instance, we might compute measures of the errors made by an option pricing model if the underlying follows a distribution different from the one on which the model is based. Clearly, it must be specified what families of alternative distributions we are considering.
Essentially every model is based on some assumption about the functional form of dependencies between variables and on the distribution of noise. Given the assumptions, models are estimated and decisions made. The idea of estimating model risk is to estimate the distribution of errors that will be made if the model assumptions are violated. For instance: Are there correlations or autocorrelations when it is assumed there are none? Are innovations fat-tailed when it is assumed that noise is white and normal? One way to mitigate model risk is to combine several candidate models rather than commit to a single one. From an econometric point of view, combining different models in this way means constructing a mixture of distributions. The result of this process is one single model that weights the individual models.
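The following sketch shows the mixture construction with two assumed candidate distributions for daily returns and assumed model weights; it illustrates the mechanics only and is not a model from the study.

```python
import numpy as np
from scipy import stats

# Two candidate models for daily returns (illustrative parameters):
# a thin-tailed normal and a fat-tailed Student-t.
normal_model = stats.norm(loc=0.0, scale=0.010)
t_model = stats.t(df=4, loc=0.0, scale=0.008)

# Assumed model weights, e.g., reflecting past forecasting accuracy.
w_normal, w_t = 0.6, 0.4

def mixture_pdf(x):
    """Density of the combined model: a weighted mixture of the two."""
    return w_normal * normal_model.pdf(x) + w_t * t_model.pdf(x)

def mixture_cdf(x):
    """Tail probabilities of the combined model."""
    return w_normal * normal_model.cdf(x) + w_t * t_model.cdf(x)

# The mixture inherits fat tails from its Student-t component:
for x in (-0.05, -0.03):
    print(f"P(return < {x:+.2f}): normal alone {normal_model.cdf(x):.2e}, "
          f"mixture {mixture_cdf(x):.2e}")
```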
Some managers interviewed for the 2003 study reported they were using judgment on top of statistical analysis. In practice, this means that models are reviewed when they begin to produce results that are below expectations: quantitative teams constantly evaluate the performance of different families of models and adopt those that perform better. Criteria for switching from one family of models to another are called for, though; this, in turn, requires large data samples.
Despite these difficulties, the application of multiple models has gained wide acceptance in finance. In asset management, the main driver is the uncertainty related to estimating returns.

Focus on Factors, Correlation, Sentiment, and Momentum

Participants in the 2003 study also reported efforts to determine new factors that might help predict expected returns. Momentum and sentiment were the two most cited phenomena modeled in equities. Market sentiment, in particular, was receiving more attention.
The use of factor models is in itself a well-established practice in financial modeling. Many different families of models are available, from the widely used classic static return factor analysis models to dynamic factor models, both of which are described in Chapter 5. What remains a challenge is the determination of the factors. Considerable resources have been devoted to studying market correlations, and advanced techniques for the robust estimation of correlations are being applied at large firms as well as at boutiques.
According to study respondents, in the three years prior to 2001, quantitative teams at many asset management firms had been working on determining which factors are the best indicators of price movements. Sentiment was often cited as a major innovation in terms of modeling strategies. Asset management firms typically modeled stock-specific sentiment, while sentiment as measured by business or consumer confidence was often the responsibility of the macroeconomic teams at the parent bank, at least in continental Europe. Market sentiment is generally defined by the distribution of analyst revisions in earnings estimates. Other indicators of market confidence are flows, volume, turnover, and trading by corporate officers.
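As a purely illustrative sketch, one simple way to score stock-specific sentiment from the distribution of analyst revisions is a breadth measure; the scoring rule below is our assumption, not one reported by participants.

```python
def revision_sentiment(upgrades: int, downgrades: int, unchanged: int) -> float:
    """Toy stock-level sentiment score from analyst earnings-estimate
    revisions: net upward revisions as a share of all tracked estimates.
    Ranges from -1.0 (all downgrades) to +1.0 (all upgrades)."""
    total = upgrades + downgrades + unchanged
    if total == 0:
        return 0.0
    return (upgrades - downgrades) / total

# Example: 6 upward revisions, 2 downward, 4 unchanged over a month.
print(f"sentiment score: {revision_sentiment(6, 2, 4):+.2f}")   # +0.33
```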
Factors that represent market momentum were also increasingly adopted, according to the study. Momentum means that the entire market is moving in one direction with relatively little uncertainty. There are different ways to represent momentum phenomena. One might identify a specific factor that defines momentum, that is, a variable that gauges the state of the market in terms of momentum. This momentum variable can then switch the form of the model: there are models for trending markets and models for uncertain markets.
Momentum can also be represented as an intrinsic feature of a model. A random walk model does not have any momentum, but an autoregressive model might: with a positive autoregressive coefficient, past returns carry over partially into future returns.
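A minimal simulation makes the contrast concrete; the persistence parameter below is an assumed, illustrative value.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

# Random walk in prices: returns are pure noise, no momentum.
rw_returns = rng.normal(0.0, 0.01, n)

# AR(1) returns with a positive coefficient: today's return carries
# information about tomorrow's, an intrinsic momentum feature.
phi = 0.3                          # assumed persistence parameter
eps = rng.normal(0.0, 0.01, n)
ar_returns = np.zeros(n)
for t in range(1, n):
    ar_returns[t] = phi * ar_returns[t - 1] + eps[t]

def lag1_autocorr(x):
    """Sample correlation between consecutive observations."""
    return np.corrcoef(x[:-1], x[1:])[0, 1]

print(f"lag-1 autocorrelation, random walk returns: {lag1_autocorr(rw_returns):+.3f}")
print(f"lag-1 autocorrelation, AR(1) returns:       {lag1_autocorr(ar_returns):+.3f}")
```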
Some participants also reported using market-timing models and style rotation for the active management of funds. Producing accurate timing signals is complex, given that financial markets are difficult to predict. One source of predictability is the presence of mean reversion and cointegration phenomena.
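As an illustration of how mean reversion can be turned into a timing signal, here is a toy z-score rule applied to a spread assumed to be stationary (for example, the residual of a cointegrating relation between two stocks); the window, entry threshold, and simulated spread are all assumptions, not a strategy reported in the study.

```python
import numpy as np

def zscore_signal(spread, window=60, entry=2.0):
    """Toy mean-reversion timing signal. Returns +1 (long the spread),
    -1 (short the spread), or 0 for each period, based on how far the
    current spread sits from its recent rolling mean."""
    signal = np.zeros(len(spread))
    for t in range(window, len(spread)):
        hist = spread[t - window:t]
        z = (spread[t] - hist.mean()) / hist.std()
        if z > entry:
            signal[t] = -1.0   # spread unusually high: bet on reversion down
        elif z < -entry:
            signal[t] = +1.0   # spread unusually low: bet on reversion up
    return signal

# Example on a simulated mean-reverting (AR(1)) spread:
rng = np.random.default_rng(3)
spread = np.zeros(1000)
for t in range(1, 1000):
    spread[t] = 0.9 * spread[t - 1] + rng.normal(0.0, 0.1)

sig = zscore_signal(spread)
print(f"periods with an active position: {int(np.abs(sig).sum())}")
```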

Back to Value-Based Models

At the time of the 2003 study, there was a widespread perception that value-based models were performing better in post-2000 markets. It was believed that markets were doing a better job of valuing companies as a function of the value of the firm rather than of price trends, notwithstanding our remarks on the growing use of factors such as market sentiment. From a methodological point of view, approaches based on cash flow analysis had increased in popularity in Europe. A robust positive operating cash flow is considered a better indication of the health of a firm than earnings estimates, which can be more easily massaged.
Fundamental analysis was becoming highly quantitative and automated. Several firms mentioned they were developing proprietary methodologies for the automatic analysis of balance sheets. For these firms, with the information available on the World Wide Web, fundamental analysis could be performed without actually visiting companies. Some participants remarked that caution might be called for in attributing the good performance of value-tilted models to markets themselves. One of the assumptions of value-based models is that there is no mechanism conveying a large flow of funds through preferred channels, but such a mechanism was at work in the telecommunications, media, and technology (TMT) bubble, when value-based models performed so poorly. In the last bull run prior to the study, the major preoccupation was not to miss out on rising markets; investors who continued to focus on value suffered poor performance. European market participants reported that they were now watching both trend and value.

Risk Management

Much of the attention paid to quantitative methods in asset management prior to the study had been focused on risk management. According to 83% of the participants, the role of risk management had evolved significantly over the prior two years to extend across portfolios and across processes.
One topic that has received a lot of attention, both in academia and at financial institutions, is the application of extreme value theory (EVT) to financial risk management.13 The RiskLab in Zurich, headed by Paul Embrechts, advanced the use of EVT and copula functions in risk management. At the corporate level, universal banks such as HSBC CCF have produced theoretical and empirical work on the applicability of EVT to risk management.14 European firms were also paying considerable attention to risk measures.
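To give a flavor of the peaks-over-threshold approach at the heart of EVT, the following sketch fits a generalized Pareto distribution (GPD) to simulated losses beyond a high threshold and reads off a tail quantile; the data, threshold, and confidence level are assumed for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Simulated fat-tailed daily losses (positive = loss), standing in
# for a real return history; Student-t supplies the heavy tail.
losses = stats.t(df=3).rvs(5000, random_state=rng) * 0.01

# Peaks-over-threshold: fit a GPD to losses beyond the 95th
# percentile (the threshold choice is an assumed convention).
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
shape, _, scale = stats.genpareto.fit(exceedances, floc=0.0)

# 99% VaR from the fitted tail, using the standard POT quantile formula:
# P(L > x) = P(L > u) * (1 - GPD_cdf(x - u)).
p, p_u = 0.99, (losses > u).mean()
var_99 = u + stats.genpareto.ppf(1 - (1 - p) / p_u, shape, loc=0.0, scale=scale)
print(f"threshold u = {u:.4f}, fitted GPD shape = {shape:.3f}")
print(f"EVT 99% VaR estimate:    {var_99:.4f}")
print(f"empirical 99% quantile:  {np.quantile(losses, 0.99):.4f}")
```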
For participants in the Intertek study, risk management was the area where quantitative methods had made their biggest contribution. Since the pioneering work of Harry Markowitz in the 1950s, the objective of investment management has been defined as determining the optimal risk-return trade-off given an investor’s profile. Prior to the diffusion of modeling techniques, though, evaluation of the risk-return trade-off was left to the judgment of individual asset managers. Modeling brought the question of ex ante risk-return optimization to the forefront. An asset management firm that uses quantitative methods and optimization techniques manages risk at the source. In this case, the only risk that needs to be monitored and managed is model risk.15
Purely quantitative managers with a fully automated management process were still rare according to the study. Most managers, although quantitatively oriented, used a hybrid approach in which models give evaluations that managers translate into decisions. In such situations, risk is not completely controlled at the origin.
Most firms interviewed for the study had created a separate risk management unit as a supervisory entity that controlled the risk of different portfolios and, in some cases (although still only rarely), aggregated risk at the firm-wide level. In most cases, the tools of choice for controlling risk were multifactor models. Models of this type had become standard for making risk evaluations for institutional investors. For internal use, however, many firms reported that they made risk evaluations based on proprietary models, EVT, and scenario analysis.
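A minimal sketch of the risk arithmetic behind such a linear multifactor model follows; the exposures, factor covariances, and portfolio weights are made-up, illustrative numbers.

```python
import numpy as np

# Linear factor model: r = B f + eps, so Cov(r) = B F B' + D.
B = np.array([[1.1,  0.3],     # factor exposures (2 stocks x 2 factors,
              [0.8, -0.2]])    # e.g., market and value)
F = np.array([[0.04, 0.01],    # factor covariance matrix (annualized)
              [0.01, 0.02]])
D = np.diag([0.03, 0.05])      # diagonal specific (idiosyncratic) variances

w = np.array([0.6, 0.4])       # portfolio weights

# Portfolio variance, split into common-factor and specific parts.
factor_var = w @ B @ F @ B.T @ w
specific_var = w @ D @ w
total_var = factor_var + specific_var

print(f"portfolio volatility:              {np.sqrt(total_var):.3f}")
print(f"share of risk from common factors: {factor_var / total_var:.1%}")
print(f"share of specific risk:            {specific_var / total_var:.1%}")
```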

Integrating Qualitative and Quantitative Information

More than 60% of the firms interviewed for the 2003 Intertek study reported they had formalized procedures for integrating quantitative and qualitative input, although half of these mentioned that the process had not gone very far; 30% of the participants reported no formalization at all. Some firms mentioned they had developed a theoretical framework to integrate results from quantitative models and fundamental views. Assigning weights to the various inputs was handled differently from firm to firm; some firms reported establishing a weight limit in the range of 50%-80% for quantitative input.
A few quantitatively oriented firms reported that they had completely formalized the integration of qualitative and quantitative information; in these cases, everything relevant was built into the system. Firms that managed both quantitative and traditional funds typically reported that formalization was implemented in the former but not in the latter.
Virtually all firms reported at least partial automation in the handling of qualitative information. For the most part, a first level of automation (automatic screening and delivery, classification, and search) was provided by suppliers of sell-side research, consensus data, and news, who were automating the delivery of news, research reports, and other information.
About 30% of the respondents noted they had added functionality over and above that provided by third-party information suppliers, typically starting with areas that are easy to quantify, such as earnings announcements or analysts’ recommendations. Some had coupled this with quantitative signals that alert recipients to changes, or with programs that automatically perform an initial analysis.
Only the braver firms were tackling difficult tasks such as automated news summarization and analysis. For the most part, news analysis was still considered the domain of judgment. A few firms interviewed for this study reported that they had attempted to tackle the problem of automatic news analysis but abandoned their efforts, citing the difficulty of forecasting price movements from new information.

2006 Intertek Study

The next study that we will discuss is based on survey responses and conversations with industry representatives in 2006. Although this predates the subprime mortgage crisis and the resulting impact on the performance of quantitative asset managers, the insights provided by this study are still useful. In all, managers at 38 asset management firms managing a total of $4.3 trillion in equities participated in the study. Participants included individuals responsible for quantitative equity management and quantitative equity research at large- and medium-sized firms in North America and Europe.16 Sixty-three percent of the participating firms were among the largest asset managers in their respective countries; they clearly represented the way a large part of the industry was going with respect to the use of quantitative methods in equity portfolio management.17