Modelling has permeated virtually all areas of industrial, environmental, economic, bio-medical or civil engineering: yet the use of models for decision-making raises a number of issues to which this book is dedicated. How uncertain is my model? Is it truly valuable to support decision-making? What kind of decision can be truly supported, and how can I handle residual uncertainty? How refined should the mathematical description be, given the true data limitations? Could the uncertainty be reduced through more data, increased modelling investment or computational budget? Should it be reduced now or later? How robust is the analysis or the computational methods involved? Should/could those methods be more robust? Does it make sense to handle uncertainty, risk, lack of knowledge, variability or errors altogether? How reasonable is the choice of probabilistic modelling for rare events? How rare are the events to be considered? How far does it make sense to handle extreme events and elaborate confidence figures? Can I take advantage of expert/phenomenological knowledge to tighten the probabilistic figures? Are there related domains that could provide models or inspiration for my problem?
Written by a leader at the crossroads of industry, academia and engineering, and based on decades of multi-disciplinary field experience, Modelling Under Risk and Uncertainty gives a self-consistent introduction to the methods involved in any type of modelling development that acknowledges the inevitable uncertainty and associated risks. It goes beyond the "black-box" view that some analysts, modellers, risk experts or statisticians take of the underlying phenomenology of the environmental or industrial processes, without sufficiently valuing their physical properties and inner modelling potential, nor challenging the practical plausibility of mathematical hypotheses; conversely, it also aims to attract environmental or engineering modellers to better handle model confidence issues through finer statistical and risk analysis material taking advantage of advanced scientific computing, in order to face new regulations departing from deterministic design and to support robust decision-making.
Modelling Under Risk and Uncertainty:
* Addresses a concern of growing interest for large industries, environmentalists or analysts: robust modelling for decision-making in complex systems.
* Gives new insights into the peculiar mathematical and computational challenges generated by recent industrial safety or environmental control analyses for rare events.
* Implements decision-theory choices differentiating or aggregating the dimensions of risk/aleatory and epistemic uncertainty through a consistent multi-disciplinary set of statistical estimation, physical modelling, robust computation and risk analysis.
* Provides an original review of the advanced inverse probabilistic approaches for model identification, calibration or data assimilation, key to digesting fast-growing multi-physical data acquisition.
* Is illustrated with one favourite pedagogical example crossing natural risk, engineering and economics, developed throughout the book to facilitate reading and understanding.
* Supports Master/PhD-level courses as well as advanced tutorials for professional training.
Analysts and researchers in numerical modelling, applied statistics, scientific computing, reliability, advanced engineering, natural risk or environmental science will benefit from this book.
Contents
Cover
Wiley Series in Probability and Statistics
Title Page
Copyright
Dedication
Preface
Acknowledgements
Introduction and Reading Guide
1 The Scope of Risk and Uncertainty Considered
2 A Journey Through an Uncertain Reality
3 The Generic Methodological Approach of the Book
4 Book Positioning and Related Literature
5 A Reading Guide Through the Chapters
References
Notation
Acronyms and Abbreviations
Chapter 1: Applications and Practices of Modelling, Risk and Uncertainty
1.1 Protection Against Natural Risk
1.2 Engineering Design, Safety and Structural Reliability Analysis (SRA)
1.3 Industrial Safety, System Reliability and Probabilistic Risk Assessment (PRA)
1.4 Modelling Under Uncertainty in Metrology, Environmental/Sanitary Assessment and Numerical Analysis
1.5 Forecast and Time-Based Modelling in Weather, Operations Research, Economics or Finance
1.6 Conclusion: The Scope for Generic Modelling Under Risk and Uncertainty
References
Chapter 2: A Generic Modelling Framework
2.1 The System Under Uncertainty
2.2 Decisional Quantities and Goals of Modelling Under Risk and Uncertainty
2.3 Modelling Under Uncertainty: Building Separate System and Uncertainty Models
2.4 Modelling Under Uncertainty – The General Case
2.5 Combining Probabilistic and Deterministic Settings
2.6 Computing an Appropriate Risk Measure or Quantity of Interest and Associated Sensitivity Indices
2.7 Summary: Main Steps of the Studies and Later Issues
References
Chapter 3: A Generic Tutorial Example: Natural Risk in an Industrial Installation
3.1 Phenomenology and Motivation of the Example
3.2 A Short Introduction to Gradual Illustrative Modelling Steps
3.3 Summary of the Example
References
Chapter 4: Understanding Natures of Uncertainty, Risk Margins and Time Bases for Probabilistic Decision-Making
4.1 Natures of Uncertainty: Theoretical Debates and Practical Implementation
4.2 Understanding the Impact on Margins of Deterministic vs. Probabilistic Formulations
4.3 Handling Time-Cumulated Risk Measures Through Frequencies and Probabilities
4.4 Choosing an Adequate Risk Measure – Decision-Theory Aspects
References
Chapter 5: Direct Statistical Estimation Techniques
5.1 The General Issue
5.2 Introducing Estimation Techniques on Independent Samples
5.3 Modelling Dependence
5.4 Controlling Epistemic Uncertainty Through Classical or Bayesian Estimators
5.5 Understanding Rare Probabilities and Extreme Value Statistical Modelling
References
Chapter 6: Combined Model Estimation Through Inverse Techniques
6.1 Introducing Inverse Techniques
6.2 One-Dimensional Introduction of the Gradual Inverse Algorithms
6.3 The General Structure of Inverse Algorithms: Residuals, Identifiability, Estimators, Sensitivity and Epistemic Uncertainty
6.4 Specificities for Parameter Identification, Calibration or Data Assimilation Algorithms
6.5 Intrinsic Variability Identification
6.6 Conclusion: The Modelling Process and Open Statistical and Computing Challenges
References
Chapter 7: Computational Methods for Risk and Uncertainty Propagation
7.1 Classifying the Risk Measure Computational Issues
7.2 The Generic Monte-Carlo Simulation Method and Associated Error Control
7.3 Classical Alternatives to Direct Monte-Carlo sampling
7.4 Monotony, Regularity and Robust Risk Measure Computation
7.5 Sensitivity Analysis and Importance Ranking
7.6 Numerical Challenges, Distributed Computing and use of Direct or Adjoint Differentiation of Codes
References
Chapter 8: Optimising under Uncertainty: Economics and Computational Challenges
8.1 Getting the Costs Inside Risk Modelling – from Engineering Economics to Financial Modelling
8.2 The Role of Time – Cash Flows and Associated Risk Measures
8.3 Computational Challenges Associated to Optimisation
8.4 The Promise of High Performance Computing
Exercises
References
Chapter 9: Conclusion: Perspectives of Modelling in the Context of Risk and Uncertainty and Further Research
9.1 Open Scientific Challenges
9.2 Challenges Involved by the Dissemination of Advanced Modelling in the Context of Risk and Uncertainty
References
Chapter 10: Annexes
10.1 Annex 1 – Refresher on Probabilities and Statistical Modelling of Uncertainty
10.2 Annex 2 – Comments About the Probabilistic Foundations of the Uncertainty Models
10.3 Annex 3 – Introductory Reflections on the Sources of Macroscopic Uncertainty
10.4 Annex 4 – Details about the Pedagogical Example
10.5 Annex 5 – Detailed Mathematical Demonstrations
References
Epilogue
Index
Wiley Series in Probability and Statistics
Established by WALTER A. SHEWHART and SAMUEL S. WILKS
Editors
David J. Balding, Noel A.C. Cressie, Garrett M. Fitzmaurice,
Harvey Goldstein, Iain M. Johnstone, Geert Molenberghs,
David W. Scott, Adrian F.M. Smith, Ruey S. Tsay,
Sanford Weisberg
Editors Emeriti
Vic Barnett, Ralph A. Bradley, J. Stuart Hunter, J.B. Kadane, David G. Kendall, Jozef L. Teugels
A complete list of the titles in this series appears at the end of this volume.
This edition first published 2012
© 2012 John Wiley & Sons, Ltd
Registered office
John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, United Kingdom
For details of our global editorial offices, for customer services and for information about how to apply for permission to reuse the copyright material in this book please see our website at www.wiley.com.
The right of the author to be identified as the author of this work has been asserted in accordance with the Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by the UK Copyright, Designs and Patents Act 1988, without the prior permission of the publisher.
Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic books.
Designations used by companies to distinguish their products are often claimed as trademarks. All brand names and product names used in this book are trade names, service marks, trademarks or registered trademarks of their respective owners. The publisher is not associated with any product or vendor mentioned in this book. This publication is designed to provide accurate and authoritative information in regard to the subject matter covered. It is sold on the understanding that the publisher is not engaged in rendering professional services. If professional advice or other expert assistance is required, the services of a competent professional should be sought.
Library of Congress Cataloging-in-Publication Data
Rocquigny, Etienne de.
Modelling under risk and uncertainty: an introduction to statistical, phenomenological, and computational methods / Etienne de Rocquigny.
p. cm. – (Wiley series in probability and statistics)
Includes bibliographical references and index.
ISBN 978-0-470-69514-2 (hardback)
1. Industrial management–Mathematical models. 2. Uncertainty–Mathematical models. 3. Risk management–Mathematical models. I. Title.
HD30.25.R63 2012
338.501'5195–dc23
2011046726
Luke 13: 4-5
Or those eighteen on whom the tower at Siloam fell, do you suppose they had failed in their duty more than all the rest of the people who live in Jerusalem?
Preface
This book is about modelling in general, as a support for decision-making in man-made systems and/or in relation to life and the natural environment. It is also about uncertainty: in the view of the author, modelling is so connected to uncertainty that the expressions ‘uncertainty modelling’ or ‘model uncertainty’ are fraught with pleonasm. It is just as much about risk modelling, meaning the crafting of models to help risk assessment and support decision-making to prevent risk. Incidentally, controlling the risk generated by the use of models – a critical issue for many analysts of the world financial crises since 2008 – is a key by-product of the analysis.
Coming from a traditional – that is, deterministic – scientific and engineering background, the author experienced regular discomfort in acting as a young consulting engineer in the mid-1990s: (a) struggling to calibrate models on past figures before discovering horrendous errors in the most serious datasets and previous model-based expertise, or large deviations between client-provided data and his own field surveys; (b) being challenged on the confidence that could be placed in his models by acute clients or regulators; (c) being pressured by the same clients to reduce the study budget and hence to downsize data collection and model sophistication. A memorable case was that of a small village in the French Alps where the author had to draw an official line between the floodable and non-floodable areas under intense pressure for land development. A witty-enough client, the mayor, questioned the author as to whether the 1/100-yr return legal contours would change if account were taken of the missing memory about past floods (which he knew of from the tales of an old relative), likely deviations of the river bottom over decades (which he knew of from being born there), inaccuracies in the datum of land in the village (which he knew of as being also a farm owner), or an increased study budget allowing for a more careful survey and modelling exercise (which he eventually made available to the author). The author's nights were unsettled after he also understood the extent of the responsibility held by consulting engineers in signing such uncertain contours.
Since that time, the author has moved quite a long way into the science and techniques of modelling in the context of risk and uncertainty, assuming responsibilities in industrial R&D, environmental management, nuclear risk assessment and the development of large academic research programmes and education on the topic. A pedagogical example – on the flood protection of an industrial facility – that has long been used as a tutorial and benchmark for that work, and which illustrates the entire development of this book, is directly inspired by the live case. The writing of the book was almost complete when the tragic events of March 2011 occurred at Fukushima/Sendai, recalling, were it needed, the dramatic impact of natural and technological risk and the core need for careful design, safety and maintenance procedures. The author is hopeful that the book may serve modellers to better support those decision makers facing uncertainty.
Acknowledgements
The author wishes to thank everyone who helped make this book possible. The book owes a great debt to the author's former employers Sogreah and Electricité de France, and was richly inspired by his close collaborators and partners, such as Serge Hugonnard and Yannick Lefebvre in particular, as well as Emmanuel Remy, Christian Derquenne, Laurent Carraro and Gilles Celeux.
Nicolas Devictor, Stefano Tarantola, Philip Limbourg and the other co-authors of the Wiley book ‘Uncertainty in Industrial Practice’ and the European cross-industry network, as well as its first-class reviewers Terje Aven and Jon Helton, helped in crafting and challenging the generic modelling approach in early versions of the book. Others such as Nicolas Fischer, Erick Herbin, Alberto Pasanisi, Fabien Mangeant, Bertrand Iooss and Regis Farret accompanied the author in launching pioneering executive education programmes and Master's/PhD level courses at the Ecole Centrale Paris as well as providing key opportunities to complete the pedagogical approach and tutorial examples. Numerous Master's, PhD students and industrial attendees involved in those initiatives also questioned and helped refine the material through years of teaching practice.
Thanks are also due to a large number of reviewers, including Jérôme Collet, Bertrand Iooss, Emilia Ferrario and Enrico Zio, Yann Richet, Pierre Bertrand, Bernard Gourion, Guillaume Malingue and Eric Walter.
In the long term, moral support and encouragement from Aude Massiet du Biest proved essential.
Boscrocourt (France), October 3rd, 2011
Introduction and Reading Guide
Modelling – a many-centuries-old activity – has permeated virtually all areas of industrial, environmental, economic, bio-medical or civil engineering. The driving forces behind such a development are scientific knowledge, technological development and cheaper information technologies that deliver massive computer power and data availability. Essential to any modelling approach, the issues of proper qualification, calibration and control of the associated error, as well as of uncertainty or deviations from reality, have arisen simultaneously as the subject of growing interest and as areas of new expertise in themselves.
Indeed, the search for more advanced management of uncertainty in modelling, design and risk assessment has developed greatly in large-scale industrial and environmental systems with the increase in safety and environmental control. Even more so, an appetite for more accountable decision-making and for greater confidence in predictions has found a general audience in the fields of weather forecasting, health and public policy. In parallel, robust design approaches and the valuation of flexibility through option-based decision-making are more and more disseminated in various industrial sectors in the quest for an increase in customer satisfaction, design and manufacturing cost reduction and market flexibility.
Faced with such demands, a systematic approach to the uncertainty linked to the modelling of complex systems proves central in establishing a robust and convincing approach to the optimisation of safety and performance margins. The applications to engineering design, environmental management and risk management are numerous, also embracing concerns for the wider issues of resource allocation involved in any modelling-based activity, be it in R&D, consulting, policy-making or applied research. Strategic choices are required in order to focus data collection, computing and analytical resources or research budgets on the areas where the lack of knowledge or uncertainty proves to have the greatest importance from the perspective of the ultimate decision-making goal, rather than on the favourite or best-known field of expertise.
This book concentrates on quantitative models and quantifiable uncertainty or risk indicators. This may sound somewhat restrictive to those readers having in mind real-world situations. On the one hand, human and organisational factors play an essential role, as evidenced by the famous risk examples of the Three-Mile-Island accident (1979) in the nuclear field or the Space Shuttle Challenger disaster (1986) in aerospace. On the other hand, uncertainty is to a large extent poorly quantifiable, or not quantifiable at all: think about quantifying the lack of knowledge about the plausibility of the September 11th terrorist attacks (2001) or another similar event. Ultimately, decision-making involves much more than an interaction with analytics and quantitative figures. Those aspects are evidence of the limitations of the applicability of the book's methods, although some limited modelling contributions may still apply here and there, such as human reliability probabilistic models, or the extended deterministic-probabilistic settings that account for poorly quantifiable occurrences. The challenging frontiers of statistical-based modelling also involve some fundamental mathematical properties, introduced intuitively as follows: (a) phenomenological stationarity: the elements of the system – if not the system itself – need to behave similarly in a statistical and/or phenomenological sense for a model-based inference to be of help; unseen combinations or extrapolations of known elementary phenomena could be considered, but not completely-disruptive phenomenology; (b) limited range of dependence: strong dependence between far-flung parts of a system or distant events in time also causes trouble for statistical and computational methods, if not for the credibility of the models themselves. Experts in the rising sciences of complex systems, including the celebrated author of ‘The Black Swan’ (Taleb, 2007), would qualify ‘true uncertainty’ as being characterised precisely by strong deviations from those properties. Although this book introduces a number of mathematical extensions of those cases, they remain strong scientific bottlenecks demanding care, modesty and more research.
In spite of those limitations, modelling brings with it essential tools for the existing and future regulation of industrial activity or environmental control, as well as for decision-making in corporate investment, public infrastructure or health. Consider the dramatic events that occurred in the wake of the Sendai earthquake (March 2011) and the consequences of the Fukushima nuclear disruptions. Although the natural hazards outside the nuclear plant have so far dominated in terms of the number of fatalities which occurred along the densely-populated coast, the events at the nuclear plant had major consequences for nuclear safety and acceptability. Pre-existing statistical and geological evidence on the local tsunami hazard (not to mention more sophisticated modelling of the quake and the hydrodynamics of the tsunami) could have been valued directly in the re-assessment of plant safety, and in particular to remedy shortcomings in the height of the dike and the location of emergency power redundancies (see Chapter 3). To a lesser extent, probably because of the complexity of modelling world economic behaviour, the massive crises following the 2007–2008 market failures find part of their explanation in the over-estimation of risk mutualisation within extremely complex portfolios, which hid a very high level of dependence upon risky assets in the American real-estate sub-primes – a basic feature easily understood in standard risk modelling (see Chapter 4).
1 The Scope of Risk and Uncertainty Considered
Risk and uncertainty are closely linked concepts, and distinguishing them is not free from controversy. Before facing up to the hard task of defining these two concepts precisely, some preliminary comments are worth making at this stage. Uncertainty is the subject of long-standing epistemological interest, as it stands in a fundamental connection both to any type of modelling activity and to the scientific consideration of risk. The former relates to the fact that any modelling endeavour brings with it a more or less explicit consideration of the uncertain deviation of the model from empiric evidence. The latter focuses on the detrimental (or beneficial) consequences of uncertain events in terms of some given stakes, assets or vulnerability of interest for the decision-maker. Risk and uncertainty analysis prove to be so connected in application that the precise delimitation between the practical meanings of the two terms will not appear to be central with regard to modelling: it depends essentially on the definition of the underlying system or on the terminological habits of the field under consideration. The system considered generally includes the undesired consequences or performance indicators of a given industrial facility or environmental asset in a fully-fledged risk assessment, while it may be limited to a technical or modelling sub-part seen from a narrower perspective when talking about uncertainty or sensitivity analysis. Chapter 1 reviews an entire spectrum of applications, from natural risk, industrial design, safety or process optimisation to environmental impact control and metrological or numerical code validation. The underlying concepts will prove greatly similar in spite of the apparent terminological variety.
Indeed, from the perspective of the book, the practicality of studying the uncertainty or risk associated with a given system is generated by the following common features: (i) the fact that the state of the system considered, conditional upon taking some given actions, is imperfectly known at a given time; and (ii) the fact that some of the characteristics of the state of the system, incorporated in a given type of ‘performance’ or ‘consequence’, are at stake for the decision-maker. Because of (i), the best that may be looked for are possible or likely ranges for the variables of interest quantifying those characteristics. More specifically, any inference will be made under a probabilistically defined quantity of interest or risk measure,1 such as event probabilities, coefficients of variation of the best-estimate, confidence intervals around the prediction, values-at-risk and so on. The rationale of risk or uncertainty modelling is to estimate those quantities through the aggregation of all statistical or non-statistical information available on any type of variable linked to the system, with the piece of information brought by statements depicting the system's structure and phenomenology (physical laws, accident sequence analysis, etc.). Ultimately, the goal is to help in making decisions about appropriate actions in the context of a relative lack of knowledge about the state of the system and given stakes of interest: a model and the associated risk measure are only suitable for that given purpose, and there is no such criterion as an absolute quality of modelling or a recommended degree of sophistication.
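As a rough illustration of such risk measures – a minimal sketch with invented numbers and a toy system model, not taken from the book – the following Python fragment estimates an exceedance probability and a 95% quantile (a value-at-risk-like figure) for a hypothetical variable of interest by Monte-Carlo sampling:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical system model: output of interest z = G(x1, x2),
# with uncertain inputs described by simple probabilistic models.
def system_model(x1, x2):
    return x1 + 0.5 * x2**2

n = 100_000
x1 = rng.normal(10.0, 2.0, n)       # e.g. an uncertain load
x2 = rng.uniform(0.0, 3.0, n)       # e.g. an uncertain operating condition
z = system_model(x1, x2)

threshold = 16.0                    # hypothetical safety threshold
p_exceed = np.mean(z > threshold)   # risk measure 1: exceedance probability
q95 = np.quantile(z, 0.95)          # risk measure 2: 95% quantile of Z

print(f"P(Z > {threshold}) ~ {p_exceed:.3f}")
print(f"95% quantile of Z ~ {q95:.2f}")
```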
Combining both statistical and phenomenological models aims to produce the best-informed inference, hopefully less uncertain than a straightforward prediction obtained through pure empiric data (observed frequencies) or expert opinion only. Note that from the book's perspective, risk modelling is much more than the derivation of self-grounded advanced mathematical tools, as epitomised by the critiques of the role played by quantitative financial models in the 2008 market crises. It should be deeply grounded in statistical, phenomenological and domain-specific knowledge so that a proper validation (or invalidation) can be undertaken with a robust, uncertainty-explicit approach. Representing honestly the quality of information – a dual concept of uncertainty – is about the most important recipe for building a sound model, and the book will insist on the key concepts available – such as model identifiability, uncertainty arising in the analysis process itself, quality of calibration and so on – that tend to be neglected by a tropism of analysts for quantitative and formal sophistication. In fact, an innovative practical measure – the rarity index – proportioning the true needs of decision-making to the real amount of information available (given the crucial degree of independence of both data sources and/or sub-system failures) will be introduced in order to challenge the credibility of model-building and of the risk assessment figures (end of Chapter 5, Chapter 7).
A large variety of causes or considerations gives rise to practical uncertainty in the state of a system as defined above in (i). This includes: uncertain inputs and operating conditions in the industrial processes; model inaccuracy (simplified equations, numerical errors, etc.); metrological errors; operational unknowns; unpredictable (or random) meteorological influences; natural variability of the environment; conflicting expert views and so on. A rich literature has described the variety of natures of uncertainty and discussed the key issue of whether they should or could receive the same type of quantification efforts, in particular a probabilistic representation. Such a debate may be traced back to the early theory of probability rooted in the seventeenth century while modern thinking may be inherited from economics and decision theory, closely linked to the renewed interpretation of statistics. Knight (1921) introduced the famous distinction between ‘risk’ (i.e. unpredictability with known probabilities) and ‘uncertainty’ (i.e. unpredictability with unknown, imprecise probabilities or even not subject to probabilisation). It is less often remembered how these early works already admitted the subtleties and limitations incorporated in such a simplified interpretation regarding real physical systems that greatly depart from the type of closed-form quantitative lotteries studied in the economic or decision-theory literature.
A closer look reveals evidence that various forms of uncertainty, imprecision, variability, randomness and model errors are mixed inside phenomenological data and modelling. Tackling real-scale physical systems occupied the risk assessment community to a large extent during the 1980s and 1990s, closely linked to the development of US nuclear safety reviews or environmental impact assessments (Granger Morgan and Henrion, 1990; Helton, 1993). The classification of the large variety of types of uncertainty encountered in large industrial systems (Oberkampf et al., 2002) has notably been debated with regard to their grouping into two salient categories, and to the rationale for separating them or not: namely, the epistemic or reducible type, which refers to uncertainty that decreases with the injection of more data, modelling or the number of runs; and the aleatory or irreducible type (or variability), for which the true characteristics vary across the time and space of an underlying population, so that it may not be reduced by an increase of data or knowledge. This epistemic/aleatory distinction, which may be viewed as a refinement of the earlier uncertainty/risk distinction, will be discussed in the subsequent chapters as it gives rise to a variety of distinct probabilistic risk measures. While many recent works experimented with and discussed the use of extra-probabilistic settings such as evidence theory for the representation of epistemic uncertainty (cf. the review in Helton and Oberkampf, 2004), this book takes the more traditional view of double-probabilistic epistemic/aleatory modelling, both because of its relatively more mature status in regulatory and industrial processes and because of the wealth of fundamental statistical tools available for the error control of estimation and propagation procedures, as well as the more recent inverse probabilistic techniques.
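The double-probabilistic view can be conveyed by a minimal two-level Monte-Carlo sketch, assuming invented distributions throughout: an outer (epistemic) loop samples imperfectly-known parameters, an inner (aleatory) loop samples the variability itself, yielding a distribution of the failure probability from which a confidence bound may be read:

```python
import numpy as np

rng = np.random.default_rng(0)

# Aleatory level: yearly maximal load ~ Normal(mu, sigma) (variability).
# Epistemic level: mu and sigma are imperfectly known (limited data),
# represented here by probability distributions over the parameters.
n_epistemic, n_aleatory = 1_000, 10_000
threshold = 20.0

mu = rng.normal(12.0, 1.0, n_epistemic)      # epistemic uncertainty on mu
sigma = rng.uniform(2.0, 4.0, n_epistemic)   # epistemic uncertainty on sigma

p_fail = np.empty(n_epistemic)
for i in range(n_epistemic):
    loads = rng.normal(mu[i], sigma[i], n_aleatory)   # aleatory sampling
    p_fail[i] = np.mean(loads > threshold)

# Double-probabilistic risk measures: an expected failure probability and
# a 95% epistemic confidence bound on the (aleatory) failure probability.
print(f"mean P(failure)      ~ {p_fail.mean():.4f}")
print(f"95% upper conf bound ~ {np.quantile(p_fail, 0.95):.4f}")
```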
2 A Journey Through an Uncertain Reality
Simple examples will help one to grasp the scope of this book and the reader is also invited to refer to de Rocquigny, Devictor, and Tarantola (2008) which gathers a large number of recent real-scale industrial case studies. Look firstly at the domain of metrology. Knowing the value of a quantitative variable representing the state or performance of the system requires an observational device which inevitably introduces a level of measurement uncertainty or error. Figure 1 illustrates the measurement results from a given device as compared to the reference values given by a metrological standard (i.e. a conventionally ‘finer’ measurement). Uncontrollable dispersion remains even after the best calibration efforts, the first materialisation of uncertainty. Such uncertainty will need to be estimated and modelled in order to undertake, for instance, a risk assessment of an emission control system.
Figure 1 Metrological uncertainty in the CO2 emissions of an industrial process.
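A minimal sketch of this estimation step, with hypothetical paired readings (device vs. metrological standard), separates the removable systematic bias from the residual dispersion that materialises measurement uncertainty:

```python
import numpy as np

# Paired readings: device vs. metrological standard (invented values).
device   = np.array([101.2,  98.7, 103.5,  99.9, 102.8, 100.4,  97.9, 101.6])
standard = np.array([100.0,  99.0, 102.0, 100.5, 101.5, 100.0,  98.5, 101.0])

errors = device - standard
bias = errors.mean()                # systematic component, removable by calibration
dispersion = errors.std(ddof=1)     # residual measurement uncertainty

print(f"estimated bias      ~ {bias:.2f}")
print(f"residual dispersion ~ {dispersion:.2f}")
```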
Within the field of natural phenomena, temporal or spatial variability is ubiquitous and has a number of impacts: natural aggressions or environmental constraints on settlements or the operation of industrial facilities. Figure 2 (up) illustrates the year-to-year natural variability of maximal flows from a French river. Because of the well-known sensitivity to initial conditions, predicting river flows through a deterministic phenomenological approach such as meteorological and hydrodynamic equations is limited to the short term of a few days ahead for accurate trajectories. Techniques of forecasting, which will be briefly introduced in Chapter 6, have complemented such a deterministic inference by the calibration of explicit forecasting uncertainty, allowing for probabilistic forecasts over days or weeks ahead, or even for seasonal trends a few months ahead. Nevertheless, the prediction of extreme events possibly happening once over decades is more the focus of the risk models which are central to this book, in close relationship to statistical extreme value theory. As the estimation of the extreme values of such flows is necessary when designing flood protection systems, it will typically be subject to a statistical model based on hydrological records: empiric records are of limited size by nature, hence generating a second type of estimation uncertainty linked to the statistical fluctuation of sampling (see Annex Section 10.1 for a refresher on modelling through a random variable).
Figure 2 (up) annual maximal flood flows from a French river – (down) observed stage-discharge relationship and competing hydraulic model parameterisations (Goutal et al., 2008).
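By way of illustration (synthetic data, not the actual hydrological records behind Figure 2), the following sketch fits a Gumbel extreme-value model to annual maxima, computes a 1/100-yr return level and brackets the statistical estimation uncertainty with a simple bootstrap:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical record of annual maximal flows (m3/s), 40 years long.
annual_maxima = stats.gumbel_r.rvs(loc=1000, scale=400, size=40, random_state=rng)

# Fit a Gumbel (extreme-value) model and compute the 1/100-yr return level.
loc, scale = stats.gumbel_r.fit(annual_maxima)
q100 = stats.gumbel_r.ppf(1 - 1 / 100, loc, scale)

# Estimation uncertainty due to the limited sample size: a simple bootstrap.
boot = [stats.gumbel_r.ppf(1 - 1 / 100, *stats.gumbel_r.fit(
            rng.choice(annual_maxima, size=annual_maxima.size)))
        for _ in range(500)]
lo, hi = np.quantile(boot, [0.05, 0.95])

print(f"100-yr return flow ~ {q100:.0f} m3/s (90% CI: {lo:.0f}-{hi:.0f})")
```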
Yet the variable of interest will generally be distinct from the variable for which records are available: a phenomenological model will be used so that the local water levels relevant to the protection system may be inferred. Calibration of such a model against available field data (see Figure 2, down) raises the issue of controlling a form of ‘uncertain best fit’. Even for accurately-known flows, the prediction of water levels eventually exhibits a residual dispersion that mixes measurement error with model and parametric uncertainty. This constitutes a third type of uncertainty that must be accounted for, quite distinct from the natural variability affecting the observed flows and more complex than direct estimation uncertainty. A number of techniques, some of them still under development, can be used to address these issues. Uncertainty may stem from measurement error, improper model assumptions, imprecise spatial or temporal discretisation of the equations or unknown input parameters; in all cases, it needs to be combined to some extent with the distribution of natural variability in order to secure an overall robust design and a solid risk assessment procedure.
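A minimal calibration sketch along these lines, assuming a hypothetical power-law rating curve and synthetic observations (not the book's hydraulic model), exhibits the residual dispersion that remains after the ‘best fit’:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# Hypothetical stage-discharge observations (flow q in m3/s, level h in m),
# standing in for field data such as that of Figure 2 (down).
q_obs = np.array([200, 400, 600, 900, 1200, 1600, 2100, 2700], dtype=float)
h_true = 0.05 * q_obs**0.6                            # "true" curve (unknown)
h_obs = h_true + rng.normal(0.0, 0.15, q_obs.size)    # measurement + model error

# Calibrate a power-law rating curve h = a * q**b against the observations.
def rating_curve(q, a, b):
    return a * q**b

(a_hat, b_hat), _ = curve_fit(rating_curve, q_obs, h_obs, p0=[0.1, 0.5])

# Residual dispersion: the 'uncertain best fit' remaining after calibration,
# mixing measurement error with model and parametric uncertainty.
residuals = h_obs - rating_curve(q_obs, a_hat, b_hat)
print(f"a ~ {a_hat:.3f}, b ~ {b_hat:.3f}, "
      f"residual std ~ {residuals.std(ddof=1):.3f} m")
```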
Uncertainty and unpredictability are also prevalent in man-made systems, motivating reliability and failure studies: for instance, failure dates are unknown and dispersed within a population of industrial components (Figure 3, up). Modelling such failure randomness is the key to assessing the (probabilistic) lifetime and reliability of components, and hence to guaranteeing the safety of entire systems aggregating redundant or complementary pieces. They are traditionally modelled through simple statistical models such as exponential or Weibull lifetime distributions. Looking at this in more detail, the phenomenology of such failures may be valuable with regard to achieving better control of their influencing factors or predicting the structural reliability of systems operating beyond the available data. This is done, for instance, in fracture mechanics, where failure modes related to flaw initiation, flaw propagation or structural ruin are controlled both by stress and toughness. While temperature may be physically modelled as a key determinant of toughness, residual uncertainty remains in the calibration of such a model with experimental data (Figure 3, down). Predicting the overall lifetime or risk level requires turning such mixed statistical and phenomenological models for each failure mode into a combined risk measure, for instance the probability of failure over a given operation period.
Figure 3 (up) log of commissioning and failure dates of similar independent components of power plants (personal communication to the author) – (down) dispersion of toughness of irradiated steel welds with temperature (adapted from Wallin, 1991).
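As a sketch of such lifetime modelling (synthetic failure ages, and ignoring the censoring that real datasets would require), one may fit a Weibull model and derive a probability of failure over a given operation period:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical failure ages (years) of similar independent components,
# in the spirit of Figure 3 (up).
ages = stats.weibull_min.rvs(c=1.8, scale=25.0, size=60, random_state=rng)

# Fit a Weibull lifetime model (location fixed at zero).
shape, _, scale = stats.weibull_min.fit(ages, floc=0)

# Risk measure: probability of failure over a given operation period.
t_op = 10.0
p_fail = stats.weibull_min.cdf(t_op, shape, loc=0, scale=scale)
print(f"shape ~ {shape:.2f}, scale ~ {scale:.1f} yrs, "
      f"P(failure before {t_op:.0f} yrs) ~ {p_fail:.3f}")
```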
The design, manufacturing and operation of new industrial systems, technological products and services are strong driving forces for the development of decision-support modelling. They also raise peculiar challenges regarding the confidence that may be placed in those tools and the associated issues of qualification, certification or optimisation: systems that do not yet exist involve by nature a type of unpredictability that resists direct statistical or evidence-based modelling. Various forms of engineering errors, manufacturing variability or deviations of operational use with respect to the initial plan are intertwined. Add the more fundamental lack of knowledge about the phenomenology of truly-innovative systems, particularly when involving human interaction, and the control of any model that can be thought of becomes more complex still. Yet, a number of techniques of sensitivity analysis, of calibration through the knowledge of sub-systems, or of subjective probabilistic modelling will be introduced in that respect.
Finally, complex systems at risk combine those various factors of uncertainty, variability and unpredictability. Accident analysis would typically require the consideration of a variety of external or internal uncertain disrupting events (hazards) that may involve the above-mentioned natural phenomena (e.g. flood, storm, tsunami, earthquakes, heat or cold waves ...) and/or failures of man-made components (independently of, or consequently to, those external hazards), leading to the failure of a given system. Such an undesired event can then entail a series of detrimental consequences through subsequent explosion, spill of liquids or gases and so on, ending up in material and human costs or environmental impacts (such as plant unavailability costs, large release of pollutants, number of fatalities, etc.) depending on the level of exposure and vulnerability. Again, various types of uncertainty limit the control of those downstream consequences, be it due to the uncertain efficiency of man-made protection equipment or procedures, to the variability of the meteorological or environmental conditions and/or of the presence of people (hence of exposure) at the unknown time of the accident, or to the lack of knowledge of the eventual vulnerability to such pollutants or aggressions. Similar features of uncertainty characterise the environmental or health impact studies of man-made systems or infrastructure, not only conditional on a failure event but also under normal operation. The bow-tie scheme (see Figure 4) illustrates a common overview of the modelling approach to such analyses through the complex chaining of a fault tree, from initiators to the central undesired event, and of an event tree, predicting the consequences from the central undesired event.
Figure 4 Failure tree/event tree model of hazard and consequences (adapted from source: ARAMIS European project (Salvi and Debray, 2006)).
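A deliberately minimal bow-tie computation, with invented probabilities and assuming independence of all events, illustrates the chaining of a fault tree into an event tree:

```python
# Minimal bow-tie sketch: a fault tree chains initiators up to the central
# undesired event; an event tree then branches towards consequences.
# All probabilities are invented and all events assumed independent.

p_flood = 1e-2          # initiator: external hazard (per year)
p_pump_fails = 5e-2     # basic event: protection pump unavailable
p_backup_fails = 1e-1   # basic event: backup power unavailable

# Fault tree: undesired event = flood AND (pump fails OR backup fails).
p_or = 1 - (1 - p_pump_fails) * (1 - p_backup_fails)   # OR gate
p_top = p_flood * p_or                                 # AND gate

# Event tree: consequences depend on whether mitigation succeeds.
p_mitigation_ok = 0.9
consequences = {
    "limited damage": p_top * p_mitigation_ok,
    "severe damage":  p_top * (1 - p_mitigation_ok),
}
print(f"undesired event frequency ~ {p_top:.2e} /yr")
for scenario, p in consequences.items():
    print(f"  {scenario}: {p:.2e} /yr")
```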
Hazard and/or consequence prediction could link logical models, viz. fault or event trees, with phenomenological models – typically physical, structural, chemical, environmental or economic models describing quantitatively the elementary uncertain events involved. Eventually, the coupling of the numerous models of all the sub-systems and events important for the understanding of a complex system ends up in challenging computational programs. This is all the more so since joining their deterministic and probabilistic parts, representing all kinds of phenomenology and uncertainty, will require the joint use of numerical and probabilistic simulation techniques, as this book will explain in detail. Their implementation would generally require further simplification and approximation, hence giving rise to another source of propagation uncertainty, stemming from the limitations of the computational budget rather than those of data or knowledge about the system. Although this is often neglected by the analyst, any serious uncertainty modelling exercise ought never to introduce more uncertainty into the figures than is already in the system itself; this represents an objective limit for large-scale industrial studies, and hopefully some of the techniques discussed in this book can help in controlling it.
3 The Generic Methodological Approach of the Book
Notwithstanding such diversity of origins of uncertainty and risk phenomena within natural or man-made systems, this book will show that associated studies do involve similar key steps and a range of mathematical synergies. Indeed, quite distinct options remain regarding the peculiar choice of mathematical representation of uncertainty: this depends on the nature of the system and the precise specification of the variables of interest for the decision-maker. But the following generic framework will be a recurring concept throughout the book: the stakes in decision-making involve the control of the key outputs of interest of a system and the optimisation of associated actions (or designs, operation options ...) faced with uncertainty in the states of the system (Figure 5).
Figure 5 The pre-existing or system model and its inputs/outputs.
Risk/uncertainty-conscious modelling mobilises a range of statistical, physical and numerical techniques designed to encode data and expertise in a combined phenomenological and probabilistic model (system and uncertainty model). This is meant to best represent the behaviour of the system outputs of interest under uncertainty, given the variables of interest for the decision-maker and the scope of possible actions; hence, it aims to support valuable or accountable decisions – formally, selecting d – with respect to a certain criterion cZ(d), which will be called the risk measure or quantity of interest (Figure 6).
Figure 6 The generic modelling framework.
Ultimately, model-based decision-making under uncertainty and risk assessment are all about weighing relatively uncertain options, and a model should help in calculating the plausible (relative) likelihoods of the outcomes. The risk measure or quantity of interest, a central concept of the book, is a quantitative figure summarising – mostly in a relative manner – the extent of uncertainty in the stake-linked variable of interest; associated quantities are the sensitivity indices or importance factors, apportioning the relative importance of the different inputs in explaining the level of risk or the uncertain spread of the output. They play a central role in fulfilling the four salient goals suggested to categorise the rationale of modelling a system under uncertainty: Understanding the behaviour of a system; Accrediting or validating a model; Selecting or optimising a set of actions; and Complying with a given decision-making criterion or regulatory target. The key analytical steps involved in the performance of such studies and detailed in the chapters of the book are summarised in Figure 7.
Figure 7 Key generic steps of modelling in the context of risk and uncertainty – corresponding chapters of the book.
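Sensitivity indices can be hinted at with a toy linear model with independent inputs (invented coefficients), where squared correlation coefficients between each input and the output apportion the output variance exactly, coinciding with first-order Sobol' indices in this special case:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy system model with three independent uncertain inputs.
n = 50_000
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.normal(0.0, 2.0, n)
x3 = rng.normal(0.0, 0.5, n)
z = 2.0 * x1 + 1.0 * x2 + 0.5 * x3   # linear for illustration

# For a linear model with independent inputs, the squared correlation
# coefficients apportion the output variance exactly (= first-order
# Sobol' indices in that special case).
for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    importance = np.corrcoef(x, z)[0, 1] ** 2
    print(f"importance of {name}: {importance:.2f}")
```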
4 Book Positioning and Related Literature
Probabilistic modelling for design, decision-support, risk assessment or uncertainty analysis has a long history. Pioneering projects in the 1980s relied on rather simplified mathematical representations of the systems, such as closed-form physics or simplified system reliability, involving a basic probabilistic treatment such as purely expert-based distributions or deterministic consequence modelling. Meanwhile, quality control enhancement and innovation programmes in design and process engineering started with highly simplified statistical protocols or pure expertise-based tools. Thanks to the rapid development of computing resources, probabilistic approaches have gradually included more detailed physical-numerical models. Such complex modelling implies a finer calibration process through heterogeneous data sets or expertise. The large CPU-time requirement is all the more demanding since the emerging regulatory specifications are intended to adequately predict rare probabilities or tail uncertainties. Conversely, the critical issues of parsimony, and of controlling the risk that over-parameterised analytics exaggerate the level of confidence that may be placed in predictions through uncontrollably-complex models, become all the greater. This gives rise to new challenges lying at the frontier between statistical modelling, physics, scientific computing and risk analysis.
Indeed, there seems to be insufficient co-operation between domain experts, physicists, statisticians, numerical experts and decision-support and risk analysts in applications. One of the concerns of this book is to go beyond the ‘black-box’ view taken of the underlying phenomenology of the environmental or industrial processes in most risk studies or asset management systems. Such an approach is thought to miss the modelling potential associated with an explicit description of their physical properties. Conversely, this book aims to lead environmental or engineering modellers facing new regulations from deterministic decision-making towards more elaborate statistical and risk analysis material. This means, for instance, taking advantage of scientific computing to enable sophisticated estimation, calibration or simulation techniques, extreme event and robust estimators, dependence modelling and Bayesian algorithms. One of the central arguments of the book is that the in-depth mathematical formulation of both physical properties and statistical modelling features inside a combined risk model brings scientific and regulatory value. Monotony, for instance, will be commented upon throughout the book as being both a very intuitive and rather powerful property for efficient and robust modelling, notwithstanding some limitations that can be circumvented through careful hypothesis-setting. Together with bounded behaviour or prior physical knowledge, such phenomenological knowledge proves to be helpful, for instance in tackling the challenges of estimator convergence and residual uncertainty.
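The value of monotony can be hinted at with a toy sketch (the model and numbers below are hypothetical, not the book's): if the output is known to increase with each input, certified input intervals translate into certified output bounds without any sampling:

```python
# If the output is known to be monotone (here: increasing) in each input,
# interval bounds on the inputs yield certified bounds on the output,
# evaluated at the two extreme corners only, without any sampling.
def water_level(flow, roughness):
    # hypothetical monotone model: level increases with flow and roughness
    return 0.05 * flow**0.6 * (1.0 + roughness)

flow_range = (500.0, 3000.0)     # certified input intervals
roughness_range = (0.2, 0.6)

z_min = water_level(flow_range[0], roughness_range[0])
z_max = water_level(flow_range[1], roughness_range[1])
print(f"certified output range: [{z_min:.2f}, {z_max:.2f}] m")
```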
A substantial literature has developed on modelling, uncertainty and risk analysis, robust design and decision-making under uncertainty. The particular focus of this book, besides concentrating on areas other than the extensively-addressed financial risks, is to provide new material at the frontier between statistics, computational and physical (or more generally phenomenological, meaning also biological, economic, etc.) modelling: this refers more specifically to the situation where information takes the form of statistical data as well as physical (or engineering) models. In such a context, the book develops a consistent mathematical approach that links the type of formulation common to risk and reliability analysts (e.g. Bedford and Cooke, 2001; Aven, 2003; Singpurwalla, 2006; Zio, 2009) with those used by structural reliability and mechanical engineering (Madsen et al., 1986; Rackwitz, 2001), uncertainty and sensitivity analysts (e.g. Granger Morgan and Henrion, 1990; Cooke, 1991; Helton, 1993; Helton et al., 1996; Saltelli et al., 2004; de Rocquigny, Devictor, and Tarantola, 2008), environmental and natural risk modellers (e.g. Krzystofowicz, 1983; Beck, 1987; Bernier, 1987; Beven and Binley, 1992; Hamby, 1994; Duckstein and Parent, 1994; Apel et al., 2004), and statistical and computational research (e.g. de Groot, 1970; Kleijnen and Sargent, 2000; Coles, 2001; Kennedy and O'Hagan, 2002).
While being broadly introduced hereafter, the challenges of complex system reliability such as functional system complexity and dynamic system behaviour – that dominate the elementary physical behaviours in systems made of numerous inter-connected components and limit the significance of statistical data – will be documented in more detail in the specialised reliability literature (see the review in Zio, 2009). Correspondingly, this book is more centred on risk modelling than forecast modelling in the sense of long-term deviations due to rare events rather than short-term time correlation-based predictions that stand at the basis of weather forecasting, early-warning systems or economic forecasting (but see Chapter 6 for introductory comments). Additionally, for advanced sensitivity analysis involving statistical learning, design of computer experiments or Gaussian process modelling, the reader is referred to more specialised books such as Kleijnen, (2007) and Saltelli et al. (2008). The scientific material in this book refers essentially to applied mathematics, drawing on concepts from statistics, probability theory, as well as numerical analysis. Most of the proofs underlying the book's derivations are comprehensive, starting right from elementary results (see the Annexes for both a starter on statistical modelling and in-depth demonstrations or theoretical details); yet a thorough refresher might be obtained from the wealth of graduate-level mathematical textbooks or those more specific to probabilistic engineering (e.g. Ang and Tang, 1984).
Table 1 A roadmap through the book chapters.
To start with:
Chapter 1 – review of current practices (natural risk, industry, environment, finance and economics)
Chapter 3 – the generic pedagogical Flood example
Annex Section 10.1 – a refresher on statistical modelling
Chapter 2, Sections 2.1–2.3 and 2.6–2.7 – overview of the general methodology
Central chapters:
Chapters 2 and 4 – foundations of modelling and the rationale of probabilistic risk measures
Chapters 5 (Sections 5.1–5.3) and 6 (Sections 6.1–6.2) – estimation methods
Chapters 7 (Sections 7.2 and 7.5) and 8 – computational methods and decision-making
Chapter 9 – conclusion and further research
Advanced reading:
Chapters 5 (Sections 5.4–5.5), 6 (Sections 6.3–6.5) and 7 (Sections 7.1 and 7.3–7.4) – advanced statistical and computational methods
Annex Sections 10.2–10.5 – formal comments, detailed implementation and proofs
Exercises at the end of Chapters 2–8
5 A Reading Guide Through the Chapters
The concepts and formulations in this book have an essential grounding in practice throughout the variety of domains which are reviewed in Chapter 1: natural risk, industrial risk and process optimisation, metrology, environmental and sanitary protection, and engineering economics. Common key steps and considerable mathematical synergies will be generated within a generic modelling framework described in Chapter 2. Chapter 4 discusses the practical implementation of the various risk/uncertainty measures or criteria that all fit into the generic framework but correspond to different epistemological options coming from decision theory. This relates in particular to the classical concern of distinguishing according to the aleatory or epistemic nature of uncertainty, specifying more clearly the variability to be covered by the risk measure, mixing deterministic and probabilistic settings, or specifying temporal conventions with the building of composite risk measures (confidence intervals on top of exceedance probabilities, peak events in time, etc.).
Hence, a number of statistical and computing challenges stand as generic to a wide range of situations. Estimation issues with samples and expertise are discussed in two chapters. Firstly, Chapter 5 deals with the case where direct information is available on the uncertain variables or risk components. Simultaneous estimation of both aleatory and epistemic components involves classical or Bayesian statistics (also benefiting from the expression of physical properties), which allow for much richer descriptions than elementary Gaussian or uniform models, up to rare events of unlimited variability through a careful use of extreme value theory. Chapter 6 then discusses estimation theory in the alternative case where only indirect information is available, in the sense that physical or functional models have to be inverted in order to retrieve the useful variables. In other words, the inference of the uncertainty pdfs involves hidden input variables of large physical models for which only the outputs are observable. Inverse probabilistic problems range from data assimilation and regression settings for calibration to full probabilistic inversion identifying the intrinsic variability of inputs: a spectrum of quite differently-motivated techniques between which clear distinctions are made and for which advanced research is discussed. Chapter 6 also introduces the distinctive features of dynamic models in forecasting and the associated data assimilation inverse techniques.
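The flavour of full probabilistic inversion can be conveyed by a deliberately linear toy case (all numbers invented, far simpler than the algorithms of Chapter 6): the intrinsic variability of a hidden input is recovered from noisy output observations by moment matching:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy linear inverse problem: the physical model y = a * x is observed
# with measurement noise, while the input x carries intrinsic variability
# that we wish to identify from output observations only.
a = 2.0                      # known model coefficient
sigma_noise = 0.5            # known measurement noise std
mu_x, sigma_x = 3.0, 1.0     # "true" input variability (to be recovered)

x = rng.normal(mu_x, sigma_x, 2_000)
y_obs = a * x + rng.normal(0.0, sigma_noise, x.size)

# Moment matching: E[y] = a*mu_x and Var[y] = a^2*sigma_x^2 + sigma_noise^2.
mu_x_hat = y_obs.mean() / a
var_x_hat = (y_obs.var(ddof=1) - sigma_noise**2) / a**2
print(f"identified mu_x ~ {mu_x_hat:.2f}, sigma_x ~ {np.sqrt(var_x_hat):.2f}")
```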
Chapter 7 covers risk computation, that is the propagation of uncertainty through to the risk measures once the statistical model of the sources of uncertainty has been estimated. Statistical-numerical and structural reliability procedures are investigated from the point of view of convergence within the physical models. Indeed, viewing the algorithms as mappings of physical spaces, with properties such as partial monotony or known bounds, allows for greater relevance in choosing the best compromises for applications involving large-CPU models. Regarding sensitivity and importance analysis, a short review is given of the abundant recent literature on global sensitivity analysis (such as Sobol' indices, meta-modelling, etc.).
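As an elementary instance of such error control (using a cheap toy stand-in for a costly model, not one of the book's algorithms), the central limit theorem provides a confidence interval on a direct Monte-Carlo probability estimate whose half-width shrinks as 1/sqrt(n):

```python
import numpy as np

rng = np.random.default_rng(6)

# Error control of a direct Monte-Carlo estimate of a small probability:
# the central limit theorem gives a ~95% confidence interval whose
# half-width shrinks as 1/sqrt(n), the 'propagation uncertainty'.
def estimate(n, threshold=3.0):
    z = rng.normal(0.0, 1.0, n)     # stand-in for a costly model output
    p = np.mean(z > threshold)
    half_width = 1.96 * np.sqrt(p * (1 - p) / n)
    return p, half_width

for n in (10_000, 100_000, 1_000_000):
    p, hw = estimate(n)
    print(f"n={n:>9,}: p ~ {p:.2e} +/- {hw:.2e}")
```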
Chapter 8 formulates explicitly the use of the cost function in order to introduce a starter on methods for decision-making under uncertainty, involving the classical expected utility approaches, robust design, value of information and related alternatives in economic theory, as well as the peculiar issues arising when considering decision-making over time (discounting, real option analysis). Finally, an introduction is given to adding a layer of stochastic optimisation on top of the uncertainty models, an essential area for industrial decision-making and also a large source of CPU complexity when implemented in real physical models. Although the book is not centred on finance or insurance applications, links to the basic financial and actuarial modelling concepts are also provided.
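A minimal sketch of such cost-based comparison under uncertainty (all figures invented) computes the expected discounted cost of two hypothetical protection options, illustrating the role of time through discounted cash flows:

```python
import numpy as np

rng = np.random.default_rng(7)

# Comparing two hypothetical protection designs by expected discounted cost:
# investment now vs. random yearly damage, discounted over the horizon.
n, horizon, rate = 100_000, 30, 0.05
discount = (1 + rate) ** -np.arange(1, horizon + 1)

def expected_total_cost(investment, p_flood_yr, damage_if_flood):
    floods = rng.random((n, horizon)) < p_flood_yr   # yearly flood events
    damages = floods * damage_if_flood               # yearly damage cash flows
    npv_damage = damages @ discount                  # discounted damages
    return investment + npv_damage.mean()

cost_dike = expected_total_cost(investment=8.0, p_flood_yr=0.002,
                                damage_if_flood=100.0)
cost_none = expected_total_cost(investment=0.0, p_flood_yr=0.02,
                                damage_if_flood=100.0)
print(f"expected cost with dike    ~ {cost_dike:.1f} M eur")
print(f"expected cost without dike ~ {cost_none:.1f} M eur")
```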
Chapter 3 stands apart from the rest of the book: it introduces a physically-based pedagogical example that illustrates most of the issues and methods developed throughout the book. This example, which originated in industrial tutorials and academic courses, represents schematically the flood risk in an industrial facility: hydraulics, sedimentology, climatology, civil engineering and economics are the phenomenological components, all of them simplified into easily understandable closed-form formulae. Numerical implementation details are provided to the reader as a support for tutorial exercises. Note that many algorithms introduced by the book, and illustrated in the tutorial example, can be found within the Open TURNS open source development platform (www.openturns.org) which was launched in close connection with the dissemination course and executive education programmes led by the author since 2005.
The Annexes successively provide: a refresher on statistical modelling for non-specialists; theoretical comments on the probabilistic foundations of the models, and reflections on the origins of macroscopic uncertainty; numerical results and further details of the pedagogical example; detailed mathematical derivations of some important results and tools for the algorithms.
Last but not least, most of the chapters of the book end with a self-contained series of exercises of various levels of complexity which are intended to serve both as (i) study questions for Master/PhD level or advanced executive education programmes as well as (ii) starting ideas, reflections and open challenges for further research, often illustrated by the closed-form example with full numerical details (see Annex). As will be discussed throughout the book as well as in the concluding Chapter 9, many scientific challenges remain open in the field; incidentally, note that the pedagogical example has already served as a benchmark case for a number of recent research papers.
Notes
1. The expression quantity of interest can be used in the context of any modelling activity (de Rocquigny, Devictor, and Tarantola, 2008) while the wording of a risk measure would probably better fit the context of decision-making under risk or risk assessment. Yet the two expressions refer to a strictly similar mathematical quantity (Chapter 2) and are thus completely synonymous throughout the book.
References
Ang, A.H.S. and Tang, W.H. (1984) Probability Concepts in Engineering, Planning and Design, vol. 2, John Wiley & Sons, Ltd, Chichester.
Apel, H., Thieken, A.H. et al. (2004) Flood risk assessment and associated uncertainty. Natural Hazard and Earth System Sciences, 4, 295–308.
Aven, T. (2003) Foundations of Risk Analysis, John Wiley & Sons, Ltd.
Beck, M.B. (1987) Water quality modelling: A review of the analysis of uncertainty. Water Resources Research, 23(8), 1393–1442.
Bedford, T.J. and Cooke, R.M. (2001) Probabilistic Risk Analysis – Foundations and Methods, Cambridge University Press.
Bernardara, P., de Rocquigny, E., Goutal, N. et al. (2010) Flood risk and uncertainty analysis: joint assessment of the hydrological & hydraulic components. Canadian Journal of Civil Engineering, 37(7), 968–979.
Bernier, J. (1987) Elements of Bayesian analysis of uncertainty in reliability and risk models. In Duckstein, L. and Plate, E.J. (eds), Engineering Reliability and Risk in Water Resources, NATO ASI Series E: Applied Sciences, 124, 405–422.
Beven, K.J. and Binley, A.M. (1992) The future of distributed models: model calibration and uncertainty prediction. Hydrological Processes, 6, 279–298.
de Groot, M.H. (1970) Optimal Statistical Decisions, McGraw-Hill, New York.
de Rocquigny, E., Devictor, N. and Tarantola, S. (eds) (2008) Uncertainty in Industrial Practice, A Guide to Quantitative Uncertainty Management, Wiley.
Goutal, N., Arnaud, A., Dugachard, M., de Rocquigny, E. and Bernardara, P. (2008) Discharge and Strickler coefficient uncertainty propagation in a one-dimensional free surface hydraulic model. Geophysical Research Abstracts, 10, EGU2008-A-11844.
Granger Morgan, M. and Henrion, M. (1990) Uncertainty – A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis, Cambridge University Press.
Hamby, D.M. (1994) A review of techniques for parameter sensitivity analysis of environmental models. Environmental Monitoring and Assessment, 32(2), 135–154.
Helton, J.C. (1993) Uncertainty and sensitivity analysis techniques for use in performance assessment for radioactive waste disposal. Reliability Engineering & System Safety, 42, 327–367.
Helton, J.C., Burmaster, D.E. et al. (1996) Treatment of aleatory and epistemic uncertainty. Special Issue of Reliability Engineering & System Safety, 54(2–3).
Helton, J.C. and Oberkampf, W.L. (2004) Alternative representations of epistemic uncertainty. Special Issue of Reliability Engineering & System Safety, 85(1–3).
Kennedy, M.C. and O'Hagan, A. (2002) Bayesian calibration of computer models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 63(3), 425–464.
Kleijnen, J.P.C. (2007) Design and Analysis of Simulation Experiments, International Series in Operations Research & Management Science, Springer.
Knight, F.H. (1921) Risk, Uncertainty and Profit, Hart, Schaffner & Marx.
Krzystofowicz, R. (1983) Why should a forecaster and a decision maker use Bayes theorem. Water Resources Research, 19(2), 327–336.
Oberkampf, W.L., DeLand, S.M., Rutherford, B.M. et al. (2002) Error and uncertainty in modelling and simulation. Special Issue of Reliability Engineering & System Safety, 75(3), 333–357.
Saltelli, A., Ratto, M., Andres, T. et al. (2008) Global Sensitivity Analysis: The Primer, Wiley.
Singpurwalla, N.D. (2006) Risk and Reliability: A Bayesian Perspective, John Wiley & Sons, Chichester.
Taleb, N.N. (2007) The Black Swan: The Impact of the Highly Improbable, Random House.
Wallin, K. (1991) Irradiation damage effects on the fracture toughness transition curve shape for reactor pressure vessel steels. Joint FEFG/ICF International Conference on Fracture of Engineering Materials and Structures, Singapore.
Zio, E. (2009) Computational Methods for Reliability and Risk Analysis, Series on Quality, Reliability and Engineering Statistics, vol. 14, World Scientific Publishing Co. Pte. Ltd., Singapore.