Entropy Theory and its Application in Environmental and Water Engineering responds to the need for a book that deals with the basic concepts of entropy theory from a hydrologic and water engineering perspective and with the application of these concepts to a range of water engineering problems. The range of applications of entropy is constantly expanding, and new areas finding a use for the theory continue to emerge. Because the application of entropy concepts and techniques varies across subject areas, this book relates them directly to practical problems of environmental and water engineering.
The book presents and explains the Principle of Maximum Entropy (POME) and the Principle of Minimum Cross Entropy (POMCE) and their applications to different types of probability distributions. Spatial and inverse spatial entropy are important for urban planning and are presented with clarity. Maximum entropy spectral analysis and minimum cross entropy spectral analysis are powerful techniques for addressing a variety of problems faced by environmental and water scientists and engineers and are described here with illustrative examples.
Giving a thorough introduction to the use of entropy to measure unpredictability in environmental and water systems, this book adds an essential statistical method to the toolkit of postgraduates, researchers, academic hydrologists, water resource managers, environmental scientists, and engineers. It will also be a valuable resource for professionals in these areas, governmental organizations, and private companies, as well as for students in earth sciences, civil and agricultural engineering, and agricultural and rangeland sciences.
Page count: 927
Publication year: 2013
Table of Contents
Title Page
Copyright
Dedication
Preface
Acknowledgments
Chapter 1: Introduction
1.1 Systems and their characteristics
1.2 Informational entropies
1.3 Entropy, information, and uncertainty
1.4 Types of uncertainty
1.5 Entropy and related concepts
Question
References
Additional References
Chapter 2: Entropy Theory
2.1 Formulation of entropy
2.2 Shannon entropy
2.3 Connotations of information and entropy
2.4 Discrete entropy: univariate case and marginal entropy
2.5 Discrete entropy: bivariate case
2.6 Dimensionless entropies
2.7 Bayes theorem
2.8 Informational correlation coefficient
2.9 Coefficient of nontransferred information
2.10 Discrete entropy: multidimensional case
2.11 Continuous entropy
2.12 Stochastic processes and entropy
2.13 Effect of proportional class interval
2.14 Effect of the form of probability distribution
2.15 Data with zero values
2.16 Effect of measurement units
2.17 Effect of averaging data
2.18 Effect of measurement error
2.19 Entropy in frequency domain
2.20 Principle of maximum entropy
2.21 Concentration theorem
2.22 Principle of minimum cross entropy
2.23 Relation between entropy and error probability
2.24 Various interpretations of entropy
2.25 Relation between entropy and variance
2.26 Entropy power
2.27 Relative frequency
2.28 Application of entropy theory
Questions
References
Additional Reading
Chapter 3: Principle of Maximum Entropy
3.1 Formulation
3.2 POME formalism for discrete variables
3.3 POME formalism for continuous variables
3.4 POME formalism for two variables
3.5 Effect of constraints on entropy
3.6 Invariance of total entropy
Questions
References
Additional Reading
Chapter 4: Derivation of POME-Based Distributions
4.1 Discrete variable and discrete distributions
4.2 Continuous variable and continuous distributions
Questions
References
Additional Reading
Chapter 5: Multivariate Probability Distributions
5.1 Multivariate normal distributions
5.2 Multivariate exponential distributions
5.3 Multivariate distributions using the entropy-copula method
5.4 Copula entropy
Question
References
Additional Reading
Chapter 6: Principle of Minimum Cross-Entropy
6.1 Concept and formulation of POMCE
6.2 Properties of POMCE
6.3 POMCE formalism for discrete variables
6.4 POMCE formulation for continuous variables
6.5 Relation to POME
6.6 Relation to mutual information
6.7 Relation to variational distance
6.8 Lin's directed divergence measure
6.9 Upper bounds for cross-entropy
Question
References
Additional Reading
Chapter 7: Derivation of POMCE-Based Distributions
7.1 Discrete variable and mean E[x] as a constraint
7.2 Discrete variable taking on an infinite set of values
7.3 Continuous variable: general formulation
Question
References
Chapter 8: Parameter Estimation
8.1 Ordinary entropy-based parameter estimation method
8.2 Parameter-space expansion method
8.3 Contrast with method of maximum likelihood estimation (MLE)
8.4 Parameter estimation by numerical methods
Questions
References
Additional Reading
Chapter 9: Spatial Entropy
9.1 Organization of spatial data
9.2 Spatial entropy statistics
9.3 One-dimensional aggregation
9.4 Another approach to spatial representation
9.5 Two-dimensional aggregation
9.6 Entropy maximization for modeling spatial phenomena
9.7 Cluster analysis by entropy maximization
9.8 Spatial visualization and mapping
9.9 Scale and entropy
9.10 Spatial probability distributions
9.11 Scaling: rank size rule and Zipf's law
Questions
References
Further Reading
Chapter 10: Inverse Spatial Entropy
10.1 Definition
10.2 Principle of entropy decomposition
10.3 Measures of information gain
10.4 Aggregation properties
10.5 Spatial interpretations
10.6 Hierarchical decomposition
10.7 Comparative measures of spatial decomposition
Questions
References
Chapter 11: Entropy Spectral Analyses
11.1 Characteristics of time series
11.2 Spectral analysis
11.3 Spectral analysis using maximum entropy
11.4 Spectral estimation using configurational entropy
11.5 Spectral estimation by mutual information principle
References
Additional Reading
Chapter 12: Minimum Cross Entropy Spectral Analysis
12.1 Cross-entropy
12.2 Minimum cross-entropy spectral analysis (MCESA)
12.3 Minimum cross-entropy power spectrum given auto-correlation
12.4 Cross-entropy between input and output of linear filter
12.5 Comparison
12.6 Towards efficient algorithms
12.7 General method for minimum cross-entropy spectral estimation
References
Additional References
Chapter 13: Evaluation and Design of Sampling and Measurement Networks
13.1 Design considerations
13.2 Information-related approaches
13.3 Entropy measures
13.4 Directional information transfer index
13.5 Total correlation
13.6 Maximum information minimum redundancy (MIMR)
Question
References
Additional Reading
Chapter 14: Selection of Variables and Models
14.1 Methods for selection
14.2 Kullback-Leibler (KL) distance
14.3 Variable selection
14.4 Transitivity
14.5 Logit model
14.6 Risk and vulnerability assessment
Questions
References
Additional Reading
Chapter 15: Neural Networks
15.1 Single neuron
15.2 Neural network training
15.3 Principle of maximum information preservation
15.4 A single neuron corrupted by processing noise
15.5 A single neuron corrupted by additive input noise
15.6 Redundancy and diversity
15.7 Decision trees and entropy nets
Questions
References
Chapter 16: System Complexity
16.1 Ferdinand's measure of complexity
16.2 Kapur's complexity analysis
16.3 Cornacchio's generalized complexity measures
16.4 Kapur's simplification
16.5 Kapur's measure
16.6 Hypothesis testing
16.7 Other complexity measures
Questions
References
Additional References
Author Index
Subject Index
This edition first published 2013 © 2013 by John Wiley and Sons, Ltd
Wiley-Blackwell is an imprint of John Wiley & Sons, formed by the merger of Wiley's global Scientific, Technical and Medical business with Blackwell Publishing.
Registered office: John Wiley & Sons, Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK
Editorial offices: 9600 Garsington Road, Oxford, OX4 2DQ, UK
The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK
111 River Street, Hoboken, NJ 07030-5774, USA
For details of our global editorial offices, for customer services and for information about how to apply for permission to reuse the copyright material in this book please see our website at www.wiley.com/wiley-blackwell.
The right of the author to be identified as the author of this work has been asserted in accordance with the UK Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by the UK Copyright, Designs and Patents Act 1988, without the prior permission of the publisher.
Designations used by companies to distinguish their products are often claimed as trademarks. All brand names and product names used in this book are trade names, service marks, trademarks or registered trademarks of their respective owners. The publisher is not associated with any product or vendor mentioned in this book. This publication is designed to provide accurate and authoritative information in regard to the subject matter covered. It is sold on the understanding that the publisher is not engaged in rendering professional services. If professional advice or other expert assistance is required, the services of a competent professional should be sought.
Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. It is sold on the understanding that the publisher is not engaged in rendering professional services and neither the publisher nor the author shall be liable for damages arising herefrom. If professional advice or other expert assistance is required, the services of a competent professional should be sought.
Library of Congress Cataloging-in-Publication Data
Singh, V. P. (Vijay P.)
Entropy theory and its application in environmental and water engineering / Vijay P. Singh.
pages cm
Includes bibliographical references and indexes.
ISBN 978-1-119-97656-1 (cloth)
1. Hydraulic engineering—Mathematics. 2. Water—Thermal properties—Mathematical models. 3. Hydraulics—Mathematics. 4. Maximum entropy method—Congresses. 5. Entropy. I. Title.
TC157.8.S46 2013
627.01′53673—dc23
2012028077
Dedicated to
My wife Anita,
Since the pioneering work of Shannon in 1948 on the development of informational entropy theory, and the landmark contributions of Kullback and Leibler in 1951 leading to the principle of minimum cross-entropy, of Lindley in 1956 leading to mutual information, and of Jaynes in 1957–58 leading to the principle of maximum entropy and the concentration theorem, entropy theory has been applied to a wide spectrum of areas, including biology, genetics, chemistry, physics and quantum mechanics, statistical mechanics, thermodynamics, electronics and communication engineering, image processing, photogrammetry, map construction, management sciences, operations research, pattern recognition and identification, topology, economics, psychology, social sciences, ecology, data acquisition, storage and retrieval, fluid mechanics, turbulence modeling, geology and geomorphology, geophysics, geography, geotechnical engineering, hydraulics, hydrology, reliability analysis, reservoir engineering, and transportation engineering. New areas of application have continued to unfold. Entropy theory is indeed versatile and its application is widespread.
In the area of hydrologic and environmental sciences and water engineering, a range of applications of entropy have been reported during the past four and a half decades, and new topics applying entropy emerge each year. There are many books on entropy in the fields of statistics, communication engineering, economics, biology, and reliability analysis. These books were written with different objectives in mind and address different kinds of problems, so applying the entropy concepts and techniques they discuss to hydrologic science and water engineering problems is not always straightforward. There is therefore a need for a book that deals with the basic concepts of entropy theory from a hydrologic and water engineering perspective and with the application of these concepts to a range of water engineering problems. Currently no book is devoted to the basic aspects of entropy theory and its application in hydrologic and environmental sciences and water engineering. This book attempts to fill that need.
Much of the material in the book is derived from lecture notes prepared for a course on entropy theory and its application in water engineering, taught to graduate students in biological and agricultural engineering, civil and environmental engineering, and hydrologic science and water management at Texas A&M University, College Station, Texas. Comments, critiques, and discussions offered by the students have, to some extent, influenced the style of presentation in the book.
The book is divided into 16 chapters. The first chapter introduces the concept of entropy. After a short discussion of systems and their characteristics, the chapter discusses different types of entropies and the connection between information, uncertainty, and entropy, and concludes with a brief treatment of entropy-related concepts. Chapter 2 presents the entropy theory, including the formulation of entropy and the connotations of information and entropy. It then describes discrete entropy for the univariate, bivariate, and multidimensional cases, and extends the discussion to continuous entropy for the univariate, bivariate, and multivariate cases. It also treats different aspects that influence entropy. Reflecting on the various interpretations of entropy, the chapter provides hints of different types of applications.
The principle of maximum entropy (POME) is the subject matter of Chapter 3, including the formulation of POME and the development of the POME formalism for discrete variables, continuous variables, and two variables. The chapter concludes with a discussion of the effect of constraints on entropy and invariance of entropy. The derivation of POME-based discrete and continuous probability distributions under different constraints constitutes the discussion in Chapter 4. The discussion is extended to multivariate distributions in Chapter 5. First, the discussion is restricted to normal and exponential distributions and then extended to multivariate distributions by combining the entropy theory with the copula method.
Chapter 6 deals with the principle of minimum cross-entropy (POMCE). Beginning with the formulation of POMCE, it discusses the properties and formalism of POMCE for discrete and continuous variables and its relation to POME, mutual information, and variational distance. The discussion of POMCE is extended to deriving discrete and continuous probability distributions under different constraints and priors in Chapter 7. Chapter 8 presents entropy-based methods for parameter estimation, including the ordinary entropy-based method, the parameter-space expansion method, and a numerical method.
Spatial entropy is the subject matter of Chapter 9. Beginning with a discussion of the organization of spatial data and spatial entropy statistics, it goes on to discuss one-dimensional and two-dimensional aggregation, entropy maximization for modeling spatial phenomena, cluster analysis, spatial visualization and mapping, scale and entropy, and spatial probability distributions. Inverse spatial entropy is dealt with in Chapter 10, including the principle of entropy decomposition, measures of information gain, aggregation properties, spatial interpretations, hierarchical decomposition, and comparative measures of spatial decomposition.
Maximum entropy-based spectral analysis is presented in Chapter 11. It first presents the characteristics of time series, and then discusses spectral analyses using the Burg entropy, configurational entropy, and the mutual information principle. Chapter 12 discusses minimum cross-entropy spectral analysis. Presenting the power spectrum probability density function first, it discusses the minimum cross-entropy-based power spectrum given autocorrelation and the cross-entropy between the input and output of a linear filter, and concludes with a general method for minimum cross-entropy spectral estimation.
Chapter 13 presents the evaluation and design of sampling and measurement networks. It first discusses design considerations and information-related approaches, and then goes on to discuss entropy measures and their application, the directional information transfer index, total correlation, and maximum information minimum redundancy (MIMR).
Selection of variables and models constitutes the subject matter of Chapter 14. It presents methods of selection, the Kullback–Leibler (KL) distance, variable selection, transitivity, the logit model, and risk and vulnerability assessment. Chapter 15 is on neural networks, comprising the single neuron, neural network training, the principle of maximum information preservation, redundancy and diversity, and decision trees and entropy nets. System complexity is treated in Chapter 16. The complexity measures discussed include Ferdinand's measure of complexity, Kapur's complexity measure, Cornacchio's generalized complexity measures, and other complexity measures.
Vijay P. Singh
College Station, Texas
Nobody can write a book on entropy without being indebted to C.E. Shannon, E.T. Jaynes, S. Kullback, and R.A. Leibler for their pioneering contributions. In addition, there are a multitude of scientists and engineers who have contributed to the development of entropy theory and its application in a variety of disciplines, including hydrologic science and engineering, hydraulic engineering, geomorphology, environmental engineering, and water resources engineering—some of the areas of interest to me. This book draws upon the fruits of their labor. I have tried to make my acknowledgments in each chapter as specific as possible. Any omission on my part has been entirely inadvertent and I offer my apologies in advance. I would be grateful if readers would bring to my attention any discrepancies, errors, or misprints.
Over the years I have had the privilege of collaborating on many aspects of entropy-related applications with Professor Mauro Fiorentino from the University of Basilicata, Potenza, Italy; Professor Nilgun B. Harmancioglu from Dokuz Eylul University, Izmir, Turkey; and Professor A.K. Rajagopal from the Naval Research Laboratory, Washington, DC. I learnt much from these colleagues and friends.
During the course of two and a half decades I have had a number of graduate students who worked on entropy-based modeling in hydrology, hydraulics, and water resources. I would particularly like to mention Dr. Felix C. Kristanovich, now at Environ International Corporation, Seattle, Washington, and Mr. Kulwant Singh at the University of Houston, Texas. They worked with me in the late 1980s on entropy-based distributions and spectral analyses. Several of my current graduate students have helped me with the preparation of notes, especially in the solution of example problems, drawing of figures, and review of written material. Specifically, I would like to express my gratitude to Mr. Zengchao Hao for help with Chapters 2, 4, 5, and 11; Mr. Li Chao for help with Chapters 2, 9, 10, and 13; Ms. Huijuan Cui for help with Chapters 11 and 12; Mr. D. Long for help with Chapters 8 and 9; Mr. Juik Koh for help with Chapter 16; and Mr. C. Prakash Khedun for help with text formatting, drawings, and examples. I am very grateful to these students. In addition, Dr. L. Zhang from the University of Akron, Akron, Ohio, reviewed the first five chapters and offered many comments. Dr. M. Ozger from the Technical University of Istanbul, Turkey, and Professor G. Tayfur from the Izmir Institute of Technology, Izmir, Turkey, helped with Chapter 15 on neural networks.
My family members—brothers and sisters in India—have been a continuous source of inspiration. My wife Anita, son Vinay, daughter-in-law Sonali, grandson Ronin, and daughter Arti have been most supportive and allowed me to work during nights, weekends, and holidays, often away from them. They provided encouragement, showed patience, and helped in myriad ways. Most importantly, they were always there whenever I needed them, and I am deeply grateful. Without their support and affection, this book would not have come to fruition.
Vijay P. Singh
College Station, Texas
Beginning with a short introduction of systems and system states, this chapter presents concepts of thermodynamic entropy and statistical-mechanical entropy, and definitions of informational entropies, including the Shannon entropy, exponential entropy, Tsallis entropy, and Renyi entropy. Then, it provides a short discussion of entropy-related concepts and potential for their application.
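As a rough illustration of how these informational entropies differ, the sketch below computes the discrete Shannon, Rényi, and Tsallis entropies for a hypothetical four-outcome distribution; the probability values are illustrative only, not taken from the book:

```python
import math

def shannon(p):
    # H = -sum p_i ln p_i, in nats (natural logarithm)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    # H_alpha = ln(sum p_i^alpha) / (1 - alpha), alpha != 1;
    # tends to the Shannon entropy as alpha -> 1
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def tsallis(p, q):
    # S_q = (1 - sum p_i^q) / (q - 1); also tends to Shannon as q -> 1
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p_uniform = [0.25] * 4           # maximum-uncertainty case
p_skewed = [0.7, 0.1, 0.1, 0.1]  # one outcome dominates

# The uniform distribution carries the most uncertainty under each measure
print(shannon(p_uniform), shannon(p_skewed))  # ln 4 ~ 1.3863 vs ~ 0.9404
print(renyi(p_uniform, 2.0))                  # also ln 4 for the uniform case
print(tsallis(p_uniform, 2.0))                # (1 - 1/4) / 1 = 0.75
```

For the uniform distribution the Shannon and Rényi entropies coincide at ln 4, reflecting maximum uncertainty among four equally likely outcomes; the skewed distribution scores lower under all three measures.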
In thermodynamics a system is defined to be any part of the universe that is made up of a large number of particles. The remainder of the universe then is referred to as surroundings. Thermodynamics distinguishes four classes of systems, depending on the constraints imposed on them. The classification of systems is based on the transfer of (i) matter, (ii) heat, and/or (iii) energy across the system boundaries (Denbigh, 1989). The four classes of systems, as shown in Figure 1.1, are: (1) Isolated systems: These systems do not permit exchange of matter or energy across their boundaries. (2) Adiabatically isolated systems: These systems do not permit transfer of heat (also of matter) but permit transfer of energy across the boundaries. (3) Closed systems: These systems do not permit transfer of matter but permit transfer of energy as work or transfer of heat. (4) Open systems: These systems are defined by their geometrical boundaries which permit exchange of energy and heat together with the molecules of some chemical substances.
Figure 1.1 Classification of systems.
The second law of thermodynamics states that the entropy of a system can only increase or remain constant; this law applies only to isolated or adiabatically isolated systems. Yet the vast majority of systems belong to class (4): truly isolated or closed systems are rare in nature.
There are two states of a system: microstate and macrostate. A system and its surroundings can be isolated from each other, and for such a system there is no interchange of heat or matter with its surroundings. Such a system eventually reaches a state of equilibrium in a thermodynamic sense, meaning no significant change in the state of the system will occur. The state of the system here refers to the macrostate, not microstate at the atomic scale, because the microstate of such a system will continuously change. The macrostate is a thermodynamic state which can be completely described by observing thermodynamic variables, such as pressure, volume, temperature, and so on. Thus, in classical thermodynamics, a system is described by its macroscopic state entailing experimentally observable properties and the effects of heat and work on the interaction between the system and its surroundings. Thermodynamics does not distinguish between various microstates in which the system can exist, and hence does not deal with the mechanisms operating at the atomic scale (Fast, 1968). For a given thermodynamic state there can be many microstates. Thermodynamic states are distinguished when there are measurable changes in thermodynamic variables.
Whenever a system undergoes a change, because of the introduction or extraction of heat or for any other reason, the change of state can be of two types: reversible and irreversible. As the name suggests, any change occurring during a reversible process in the system and its surroundings can be undone by reversing the process; for example, changes in the system state caused by the addition of heat can be restored by the extraction of heat. In an irreversible change of state, by contrast, the original state of the system cannot be regained without making changes in the surroundings. Natural processes are irreversible; for a process to be reversible, it must occur infinitely slowly.
It may be worthwhile to visit the first law of thermodynamics, also called the law of conservation of energy, which is based on the transformation of work and heat into one another. Consider a system which is not isolated from its surroundings, and let a quantity of heat $\delta Q$ be introduced to the system. This heat performs work, denoted as $\delta W$. If the internal energy of the system is denoted by $U$, then $\delta Q$ will lead to an increase in $U$ and to the performance of work $\delta W$, that is, $\delta Q = dU + \delta W$. The work performed may be of a mechanical, electrical, chemical, or magnetic nature, and the internal energy is the sum of the kinetic and potential energies of all the particles that the system is made up of. If the system passes from an initial state 1 to a final state 2, then $U_2 - U_1 = \int_1^2 \delta Q - \int_1^2 \delta W$. It should be noted that the integral $\int_1^2 dU$ depends only on the initial and final states, but the integrals $\int_1^2 \delta Q$ and $\int_1^2 \delta W$ also depend on the path followed. Since the system is not isolated and is interactive, there will be exchanges of heat and work with the surroundings. If the system finally returns to its original state, then the sum of the integral of heat and the integral of work will be zero, meaning the integral of internal energy will also be zero, that is, $\oint \delta Q - \oint \delta W = 0$ or $\oint dU = 0$. Were this not the case, energy would either be created or destroyed. The internal energy of a system depends on pressure, temperature, volume, chemical composition, and structure, which define the system state, and does not depend on the prior history.
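The path-dependence of heat and work, versus the path-independence of internal energy, can be checked with a small numerical sketch. It assumes one mole of a monatomic ideal gas carried between the same two equilibrium states along two different reversible paths; all numbers are illustrative:

```python
import math

R = 8.314   # universal gas constant, J/(mol K)
n = 1.0     # moles of a monatomic ideal gas (U = 1.5 n R T)

T1, V1 = 300.0, 0.01   # initial state (K, m^3)
T2, V2 = 600.0, 0.02   # final state

def delta_U(Ta, Tb):
    # internal energy of an ideal gas depends only on temperature
    return 1.5 * n * R * (Tb - Ta)

def W_isothermal(T, Va, Vb):
    # work done by the gas in a reversible isothermal expansion
    return n * R * T * math.log(Vb / Va)

# Path A: isothermal expansion at T1, then constant-volume heating to T2
W_A = W_isothermal(T1, V1, V2)      # the isochoric step does no work
Q_A = delta_U(T1, T2) + W_A         # first law: Q = dU + W

# Path B: constant-volume heating to T2, then isothermal expansion at T2
W_B = W_isothermal(T2, V1, V2)
Q_B = delta_U(T1, T2) + W_B

# Q and W each depend on the path; their difference Q - W does not
print(W_A, W_B)                 # different work along the two paths
print(Q_A - W_A, Q_B - W_B)     # both equal delta_U ~ 3741.3 J
```

The heat absorbed and the work performed differ between the two paths, but their difference, the change in internal energy, is the same: internal energy is a state function while heat and work are not.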