Markov Chains: Theory and Applications
Markov chains are a fundamental class of stochastic processes. They are widely used to solve problems in a large number of domains such as operational research, computer science, communication networks and manufacturing systems. The success of Markov chains is mainly due to their simplicity of use, the large number of available theoretical results and the quality of algorithms developed for the numerical evaluation of many metrics of interest.
The author presents the theory of both discrete-time and continuous-time homogeneous Markov chains. He carefully examines the explosion phenomenon, the Kolmogorov equations, the convergence to equilibrium and the passage time distributions to a state and to a subset of states. These results are applied to birth-and-death processes. He then proposes a detailed study of the uniformization technique by means of Banach algebra. This technique is used for the transient analysis of several queuing systems.
Page count: 494
Year of publication: 2013
Contents
Preface
Chapter 1. Discrete-Time Markov Chains
1.1. Definitions and properties
1.2. Strong Markov property
1.3. Recurrent and transient states
1.4. State classification
1.5. Visits to a state
1.6. State space decomposition
1.7. Irreducible and recurrent Markov chains
1.8. Aperiodic Markov chains
1.9. Convergence to equilibrium
1.10. Ergodic theorem
1.11. First passage times and number of visits
1.12. Finite Markov chains
1.13. Absorbing Markov chains
1.14. Examples
1.15. Bibliographical notes
Chapter 2. Continuous-Time Markov Chains
2.1. Definitions and properties
2.2. Transition functions and infinitesimal generator
2.3. Kolmogorov’s backward equation
2.4. Kolmogorov’s forward equation
2.5. Existence and uniqueness of the solutions
2.6. Recurrent and transient states
2.7. State classification
2.8. Explosion
2.9. Irreducible and recurrent Markov chains
2.10. Convergence to equilibrium
2.11. Ergodic theorem
2.12. First passage times
2.13. Absorbing Markov chains
2.14. Bibliographical notes
Chapter 3. Birth-and-Death Processes
3.1. Discrete-time birth-and-death processes
3.2. Absorbing discrete-time birth-and-death processes
3.3. Periodic discrete-time birth-and-death processes
3.4. Continuous-time pure birth processes
3.5. Continuous-time birth-and-death processes
3.6. Absorbing continuous-time birth-and-death processes
3.7. Bibliographical notes
Chapter 4. Uniformization
4.1. Introduction
4.2. Banach spaces and algebra
4.3. Infinite matrices and vectors
4.4. Poisson process
4.5. Uniformizable Markov chains
4.6. First passage time to a subset of states
4.7. Finite Markov chains
4.8. Transient regime
4.9. Bibliographical notes
Chapter 5. Queues
5.1. The M/M/1 queue
5.2. The M/M/c queue
5.3. The M/M/∞ queue
5.4. Phase-type distributions
5.5. Markovian arrival processes
5.6. Batch Markovian arrival process
5.7. Block-structured Markov chains
5.8. Applications
5.9. Bibliographical notes
Appendix. Basic Results
Bibliography
Index
I dedicate this book especially to two exceptional people, my father and my mother.
First published 2013 in Great Britain and the United States by ISTE Ltd and John Wiley & Sons, Inc.
Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms and licenses issued by the CLA. Enquiries concerning reproduction outside these terms should be sent to the publishers at the undermentioned address:
ISTE Ltd, 27-37 St George’s Road, London SW19 4EU, UK — www.iste.co.uk
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA — www.wiley.com
© ISTE Ltd 2013
The rights of Bruno Sericola to be identified as the author of this work have been asserted by him in accordance with the Copyright, Designs and Patents Act 1988.
Library of Congress Control Number: 2013936313
British Library Cataloguing-in-Publication Data
A CIP record for this book is available from the British Library
ISBN: 978-1-84821-493-4
Preface
Markov chains are a fundamental class of stochastic processes. They are very important and widely used to solve problems in a large number of domains such as operational research, computer science and distributed systems, communication networks, biology, physics, chemistry, economics, finance and social sciences. The success of Markov chains is mainly due to their simplicity of use, the large number of available theoretical results and the quality of algorithms developed for the numerical evaluation of various metrics associated with them.
The Markov property means that, for such a stochastic process, if the state of the process is known at a given time, then its past and future with respect to this time are independent. In other words, once the state of the process is known at a given time, predicting its future requires no information about its past. This property allows a considerable reduction in the number of parameters needed to represent the evolution of a system modeled by such a process. It is simple enough to make the modeling of systems natural and intuitive, but also rich enough to take general probability distributions into account in a very precise manner.
This flexibility in modeling also allows us to consider phenomena such as synchronization or, more generally, stochastic dependencies between components of a system or between a system and its environment. However, this flexibility of use may lead to either an increase in the number of states of the process in the finite case or an increase in its structural complexity in the infinite case.
This book is devoted to the study of discrete-time and continuous-time Markov chains on a countable state space. The study is both theoretical and practical, including applications to birth-and-death processes and queuing theory. It is addressed to all researchers and engineers who need stochastic models to evaluate and predict the behavior of the systems they develop, and it is particularly appropriate for academics, students, engineers and researchers in computer science, communication networks and applied probability.
The book is structured as follows. Chapter 1 deals with discrete-time Markov chains. We describe not only their stationary behavior, through the study of convergence to equilibrium and the ergodic theorem, but also their transient behavior, together with the study of the first passage times to a state and to a subset of states. We also consider in this chapter finite Markov chains as well as absorbing Markov chains. Finally, three examples are proposed and covered thoroughly.
Chapter 2 discusses continuous-time Markov chains. We detail precisely the way the backward and forward Kolmogorov equations are obtained and examine the existence and uniqueness of their solutions. We also treat in this chapter the phenomenon of explosion, which occurs when the Markov chain undergoes infinitely many jumps in a finite interval of time, and we show how to obtain the main results related to this phenomenon. As in the discrete-time case, we describe the stationary behavior of these chains through the study of convergence to equilibrium and the ergodic theorem. We finally propose an analysis of the first passage times to a state and to a subset of states, as well as a study of absorbing Markov chains.
Chapter 3 is devoted to the particular case of birth-and-death processes. These processes are characterized by a tridiagonal transition probability matrix in the discrete-time case, and by a tridiagonal transition rate matrix in the continuous-time case. In this chapter, we apply the results obtained in Chapters 1 and 2 concerning passage times and the average number of visits. We examine how to obtain the explosion conditions as a function of the transition rates, and we show that the existence of an invariant probability does not ensure that the chain is non-explosive. Finally, we give examples of positive recurrent and null recurrent continuous-time birth-and-death processes for which the embedded chains no longer possess these properties.
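As a minimal illustration of the continuous-time birth-and-death case (a sketch, not taken from the book), the stationary distribution of a chain with constant birth rate λ and death rate μ, truncated to the states 0, …, N, can be computed from the classical product-form weights (λ/μ)^n; the function name below is illustrative.

```python
# Sketch: stationary distribution of a birth-and-death chain with constant
# birth rate lam and death rate mu, truncated to the states 0..N.
# Classical product-form solution: pi_n is proportional to (lam/mu)**n.
def bd_stationary(lam, mu, N):
    rho = lam / mu
    weights = [rho ** n for n in range(N + 1)]
    total = sum(weights)
    return [w / total for w in weights]

pi = bd_stationary(lam=1.0, mu=2.0, N=50)
# For lam < mu and large N, this approaches the M/M/1 stationary
# distribution (1 - rho) * rho**n, so pi[0] is close to 0.5 here.
```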
Chapter 4 deals with uniformization, which, for a given continuous-time Markov chain, consists of constructing a stochastically equivalent chain such that the sojourn times in each state have the same exponential distribution. This equivalent chain connects continuous time to discrete time through the sole intermediary of the Poisson process, which we examine carefully. Nevertheless, not every Markov chain can be uniformized: this requires that the sequence of exit rates of the states be bounded. We are then placed within the framework of Banach spaces and algebras, which allow the manipulation of infinite matrices and vectors. Uniformization is of particular importance because it allows simple and accurate numerical evaluation of various metrics, such as the state probabilities of the given Markov chain and the distribution of the first passage times to a subset of states. We provide the associated computational algorithms both in the general case and in the particular case of uniformizable birth-and-death processes.
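The uniformization idea described above can be sketched for a finite chain as follows. This is an outline under simplifying assumptions (finite state space, crude truncation of the Poisson series), not the book's algorithm, and all names are invented.

```python
import math

def uniformized_transient(Q, alpha, t, eps=1e-12):
    """Approximate the transient distribution alpha * exp(Q t) by
    uniformization: pi(t) = sum_k Poisson(lam*t, k) * alpha * U^k,
    where U = I + Q/lam is stochastic when lam >= every exit rate."""
    n = len(Q)
    lam = max(-Q[i][i] for i in range(n)) or 1.0
    U = [[(1.0 if i == j else 0.0) + Q[i][j] / lam for j in range(n)]
         for i in range(n)]
    pi = [0.0] * n
    v = list(alpha)              # v holds alpha * U^k
    weight = math.exp(-lam * t)  # Poisson(lam*t) probability of k jumps
    k, acc = 0, 0.0
    while acc < 1.0 - eps and k < 10_000:
        pi = [pi[j] + weight * v[j] for j in range(n)]
        acc += weight
        v = [sum(v[i] * U[i][j] for i in range(n)) for j in range(n)]
        k += 1
        weight *= lam * t / k    # next Poisson term
    return pi

# Two-state chain flipping at rate 1: pi_0(t) = 0.5 + 0.5 * exp(-2 t).
pi = uniformized_transient([[-1.0, 1.0], [1.0, -1.0]], [1.0, 0.0], 1.0)
```

The truncation level is driven by the accumulated Poisson mass, which is also what makes the numerical error easy to control in practice.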
Chapter 5 discusses the transient behavior of Markovian queues, mainly the calculation of the state probabilities at a given time as well as the calculation of the distribution of the busy periods of the servers. We first consider the M/M/1 queue, for which we obtain simple formulas using generating functions. These techniques do not extend directly to the M/M/c queue, for which we recommend the algorithms proposed in Chapter 4 for uniformizable birth-and-death processes. The M/M/∞ queue does not lead to a uniformizable Markov chain, but its state probabilities at every instant are obtained in a simple manner; the distribution of the busy periods of the servers is more difficult to obtain. The other queues that we analyze are more general, but they lead to uniformizable Markov chains. Their sometimes complex structure generates block-structured Markov chains whose transient behavior is examined carefully. The treatment of these complex queues is motivated by their use in the performance evaluation of communication networks.
Each chapter ends with bibliographic notes allowing the reader to complete or pursue the study of certain specific aspects of his or her choice. Finally, an appendix summarizes the basic results of integration and probability theory used throughout the book.
There are many books on Markov chains, which generally deal with steady-state analysis. The uniqueness of this book lies in the fact that it offers, in addition, a detailed study of the first explosion time, backward and forward Kolmogorov equations, birth-and-death processes as well as of uniformizable Markov chains and the treatment of transient behavior with associated algorithms and applications to general queues.
I would like to end this preface by thanking Nikolaos Limnios, who heads this collection, for his proposal to carry out this work. I also thank very warmly the reviewers François Castella, Jean-Louis Marchand and Coralie Sericola for their valuable work and the great relevance of their numerous comments and suggestions. Last but not least, my thoughts go to my wife, my two children, my brother and all my family who supported me all along this work.
All the following Markov chains are considered homogeneous. The term Markov chain in this chapter will thus designate a homogeneous discrete-time Markov chain.
We then have, by definition, for all i, j ∈ S,
For all i ∈ S, we thus have:
[1.1]
X is, therefore, a Markov chain with initial distribution α and transition probability matrix P.
This result shows that a discrete-time Markov chain is completely determined by its initial distribution α and transition probability matrix P.
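To make this concrete, a chain specified by (α, P) can be simulated directly; the following sketch (not from the text, with an illustrative two-state matrix) samples a trajectory by inverse-transform draws.

```python
import random

def simulate_chain(alpha, P, n_steps, seed=42):
    """Sample a trajectory X_0, ..., X_{n_steps} of a Markov chain with
    initial distribution alpha and transition matrix P (states 0..len(P)-1)."""
    rng = random.Random(seed)

    def draw(dist):
        # Inverse-transform sampling from a finite distribution.
        u, acc = rng.random(), 0.0
        for state, p in enumerate(dist):
            acc += p
            if u < acc:
                return state
        return len(dist) - 1  # guard against floating-point rounding

    x = draw(alpha)
    path = [x]
    for _ in range(n_steps):
        x = draw(P[x])
        path.append(x)
    return path

# Illustrative two-state chain, started in state 0.
P = [[0.9, 0.1],
     [0.5, 0.5]]
path = simulate_chain([1.0, 0.0], P, 10)
```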
In the following sections, we will often use products of infinite matrices, as well as vector-matrix and matrix-vector products of infinite dimension. Recall that, in general, these products are not associative, except when the matrices and vectors involved have non-negative coefficients, which is the case in this chapter. More details on this subject are given in the first chapter of [KEM 66] and in section 4.3.
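A classical example (not from the text) shows why non-negativity matters here: take u the infinite row vector of ones, v the infinite column vector of ones, and M the upper bidiagonal matrix with M_{i,i} = 1 and M_{i,i+1} = −1. Then

```latex
Mv = 0 \quad\Longrightarrow\quad u(Mv) = 0,
\qquad\text{whereas}\qquad
uM = e_1^{\mathsf{T}} \quad\Longrightarrow\quad (uM)v = 1,
```

so (uM)v ≠ u(Mv). With non-negative coefficients, such infinite cancellations cannot occur and the iterated sums may be interchanged, so the products are associative.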
THEOREM 1.2.– If X is a Markov chain on the state space S, with initial distribution α and transition probability matrix P then, for all i, j ∈ S and for all n ≥ 0, we have:

1) ℙ(X_n = j | X_0 = i) = (P^n)_{i,j};

2) ℙ(X_n = j) = (αP^n)_j.
PROOF.–
1) For all m, n ≥ 0, writing P_{i,j}(n) = ℙ(X_{m+n} = j | X_m = i), we have:

P_{i,j}(n + m) = Σ_{k∈S} ℙ(X_{n+m} = j, X_n = k | X_0 = i)
= Σ_{k∈S} ℙ(X_{n+m} = j | X_n = k, X_0 = i) ℙ(X_n = k | X_0 = i)
= Σ_{k∈S} ℙ(X_{n+m} = j | X_n = k) ℙ(X_n = k | X_0 = i)
= Σ_{k∈S} P_{i,k}(n) P_{k,j}(m),

where the third equality uses the Markov property and the fourth uses the homogeneity of X,

that is, if P(n) denotes the matrix with coefficients P_{i,j}(n), then P(n + m) = P(n)P(m). Since P(1) = P, it follows that P(n) = P^n.
2) We obtain, using point 1,

ℙ(X_n = j) = Σ_{i∈S} ℙ(X_0 = i) ℙ(X_n = j | X_0 = i) = Σ_{i∈S} α(i)(P^n)_{i,j} = (αP^n)_j,

which completes the proof.
In particular, this result shows that if P is stochastic then Pn is also stochastic, for all n ≥ 2.
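To illustrate (a small sketch, not from the text, with plain lists standing in for matrices): powers of a stochastic matrix can be computed by repeated multiplication, and each row of P^n still sums to 1.

```python
def mat_mul(A, B):
    """Product of two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    """P^n by repeated multiplication; P^0 is the identity."""
    size = len(P)
    R = [[float(i == j) for j in range(size)] for i in range(size)]
    for _ in range(n):
        R = mat_mul(R, P)
    return R

P = [[0.9, 0.1],
     [0.5, 0.5]]
P5 = mat_pow(P, 5)
# Row i of P5 is the distribution of X_5 given X_0 = i; each row sums to 1.
```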
THEOREM 1.3.– If X is a Markov chain then, for all n ≥ 0, 0 ≤ k ≤ n, m ≥ 1, for all i_k,…, i_n ∈ S and j_1,…, j_m ∈ S, we have:

ℙ(X_{n+1} = j_1,…, X_{n+m} = j_m | X_k = i_k,…, X_n = i_n) = P_{i_n,j_1} P_{j_1,j_2} ⋯ P_{j_{m−1},j_m}.
PROOF.– Using theorem 1.1, we have:
which completes the proof.
The Markov property seen so far states that the past and the future are independent when the present is known at a given deterministic time n. The strong Markov property extends this independence to the case where the present is known at certain random times, called stopping times.
where, for a set E, P(E) denotes the set of all subsets of E, and S^{n+1} is the set of all (n + 1)-dimensional vectors whose entries are states of S. For all i ∈ S, we write δ_i for the probability distribution concentrated on the state i, defined by:

δ_i(j) = 1 if j = i and δ_i(j) = 0 if j ≠ i.
PROOF.– It is sufficient to prove the result when:
which completes the proof.
DEFINITION 1.3.– A random variable T with values in ℕ ∪ {+∞} is called a stopping time for X if, for all n ≥ 0, the event {T = n} is determined by X_0,…, X_n, that is, if there exists a function f_n from S^{n+1} to {0, 1} such that 1_{{T=n}} = f_n(X_0,…, X_n).
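A first passage time is the prototypical stopping time: whether T = n is decided by X_0, …, X_n alone. The sketch below (illustrative, not from the book) estimates its mean by simulation for a two-state chain in which the hitting time of state 1 from state 0 is geometric with parameter 1/2, hence has mean 2.

```python
import random

def first_passage_time(P, start, target, rng, cap=10_000):
    """Simulate T = min{n >= 1 : X_n = target} from X_0 = start.
    T is a stopping time: {T = n} depends only on X_0, ..., X_n."""
    x, n = start, 0
    while n < cap:
        u, acc = rng.random(), 0.0
        for state, p in enumerate(P[x]):
            acc += p
            if u < acc:
                x = state
                break
        n += 1
        if x == target:
            return n
    return cap  # truncation guard for the sketch

rng = random.Random(0)
P = [[0.5, 0.5],
     [0.5, 0.5]]
times = [first_passage_time(P, 0, 1, rng) for _ in range(20_000)]
mean = sum(times) / len(times)  # close to the exact value 2
```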
