A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noise. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation of all tracking algorithms within a Bayesian framework and describes effective numerical methods for evaluating density-weighted integrals, including linear and nonlinear Kalman filters for Gaussian-weighted integrals and particle filters for non-Gaussian cases. The author first emphasizes detailed derivations from first principles of each estimation method, then provides illustrative, detailed step-by-step instructions for each method that make coding of the tracking filters simple and easy to understand. Case studies are employed to showcase applications of the discussed topics. In addition, the book supplies block diagrams for each algorithm, allowing readers to develop their own MATLAB® toolbox of estimation methods. Bayesian Estimation and Tracking is an excellent book for courses on estimation and tracking methods at the graduate level. The book also serves as a valuable reference for research scientists, mathematicians, and engineers seeking a deeper understanding of the topics.
Contents
Cover
Title Page
Copyright
Dedication
Preface
Acknowledgments
List of Figures
List of Tables
Part I: Preliminaries
Chapter 1: Introduction
1.1 Bayesian Inference
1.2 Bayesian Hierarchy of Estimation Methods
1.3 Scope of this Text
1.4 Modeling and Simulation with Matlab®
References
Chapter 2: Preliminary Mathematical Concepts
2.1 A Very Brief Overview of Matrix Linear Algebra
2.2 Vector Point Generators
2.3 Approximating Nonlinear Multidimensional Functions with Multidimensional Arguments
2.4 Overview of Multivariate Statistics
References
Chapter 3: General Concepts of Bayesian Estimation
3.1 Bayesian Estimation
3.2 Point Estimators
3.3 Introduction to Recursive Bayesian Filtering of Probability Density Functions
3.4 Introduction to Recursive Bayesian Estimation of the State Mean and Covariance
3.5 Discussion of General Estimation Methods
References
Chapter 4: Case Studies: Preliminary Discussions
4.1 The Overall Simulation/Estimation/Evaluation Process
4.2 A Scenario Simulator for Tracking a Constant Velocity Target Through a DIFAR Buoy Field
4.3 DIFAR Buoy Signal Processing
4.4 The DIFAR Likelihood Function
References
Part II: The Gaussian Assumption: A Family of Kalman Filter Estimators
Chapter 5: The Gaussian Noise Case: Multidimensional Integration of Gaussian-Weighted Distributions
5.1 Summary of Important Results From Chapter 3
5.2 Derivation of the Kalman Filter Correction (Update) Equations Revisited
5.3 The General Bayesian Point Prediction Integrals for Gaussian Densities
References
Chapter 6: The Linear Class of Kalman Filters
6.1 Linear Dynamic Models
6.2 Linear Observation Models
6.3 The Linear Kalman Filter
6.4 Application of the LKF to DIFAR Buoy Bearing Estimation
References
Chapter 7: The Analytical Linearization Class of Kalman Filters: The Extended Kalman Filter
7.1 One-Dimensional Consideration
7.2 Multidimensional Consideration
7.3 An Alternate Derivation of the Multidimensional Covariance Prediction Equations
7.4 Application of the EKF to the DIFAR Ship Tracking Case Study
References
Chapter 8: The Sigma Point Class: The Finite Difference Kalman Filter
8.1 One-Dimensional Finite Difference Kalman Filter
8.2 Multidimensional Finite Difference Kalman Filters
8.3 An Alternate Derivation of the Multidimensional Finite Difference Covariance Prediction Equations
References
Chapter 9: The Sigma Point Class: The Unscented Kalman Filter
9.1 Introduction to Monomial Cubature Integration Rules
9.2 The Unscented Kalman Filter
9.3 Application of the UKF to the DIFAR Ship Tracking Case Study
References
Chapter 10: The Sigma Point Class: The Spherical Simplex Kalman Filter
10.1 One-Dimensional Spherical Simplex Sigma Points
10.2 Two-Dimensional Spherical Simplex Sigma Points
10.3 Higher Dimensional Spherical Simplex Sigma Points
10.4 The Spherical Simplex Kalman Filter
10.5 The Spherical Simplex Kalman Filter Process
10.6 Application of the SSKF to the DIFAR Ship Tracking Case Study
References
Chapter 11: The Sigma Point Class: The Gauss–Hermite Kalman Filter
11.1 One-Dimensional Gauss–Hermite Quadrature
11.2 One-Dimensional Gauss–Hermite Kalman Filter
11.3 Multidimensional Gauss–Hermite Kalman Filter
11.4 Sparse Grid Approximation for High Dimension/High Polynomial Order
11.5 Application of the GHKF to the DIFAR Ship Tracking Case Study
References
Chapter 12: The Monte Carlo Kalman Filter
12.1 The Monte Carlo Kalman Filter
References
Chapter 13: Summary of Gaussian Kalman Filters
13.1 Analytical Kalman Filters
13.2 Sigma Point Kalman Filters
13.3 A More Practical Approach to Utilizing the Family of Kalman Filters
References
Chapter 14: Performance Measures for the Family of Kalman Filters
14.1 Error Ellipses
14.2 Root Mean Squared Errors
14.3 Divergent Tracks
14.4 Cramer–Rao Lower Bound
14.5 Performance of Kalman Class DIFAR Track Estimators
References
Part III: Monte Carlo Methods
Chapter 15: Introduction to Monte Carlo Methods
15.1 Approximating a Density From a Set of Monte Carlo Samples
15.2 General Concepts of Importance Sampling
15.3 Summary
References
Chapter 16: Sequential Importance Sampling Particle Filters
16.1 General Concept of Sequential Importance Sampling
16.2 Resampling and Regularization (Move) for SIS Particle Filters
16.3 The Bootstrap Particle Filter
16.4 The Optimal SIS Particle Filter
16.5 The SIS Auxiliary Particle Filter
16.6 Approximations to the SIS Auxiliary Particle Filter
16.7 Reducing the Computational Load Through Rao-Blackwellization
References
Chapter 17: The Generalized Monte Carlo Particle Filter
17.1 The Gaussian Particle Filter
17.2 The Combination Particle Filter
17.3 Performance Comparison of All DIFAR Tracking Filters
References
Part IV: Additional Case Studies
Chapter 18: A Spherical Constant Velocity Model for Target Tracking in Three Dimensions
18.1 Tracking a Target in Cartesian Coordinates
18.2 Tracking a Target in Spherical Coordinates
18.3 Implementation of Cartesian and Spherical Tracking Filters
18.4 Performance Comparison for Various Estimation Methods
18.5 Some Observations and Future Considerations
Appendix 18.A Three-Dimensional Constant Turn Rate Kinematics
Appendix 18.B Three-Dimensional Coordinate Transformations
References
Chapter 19: Tracking a Falling Rigid Body Using Photogrammetry
19.1 Introduction
19.2 The Process (Dynamic) Model for Rigid Body Motion
19.3 Components of the Observation Model
19.4 Estimation Methods
19.5 The Generation of Synthetic Data
19.6 Performance Comparison Analysis
Appendix 19.A Quaternions, Axis-Angle Vectors, and Rotations
References
Chapter 20: Sensor Fusion using Photogrammetric and Inertial Measurements
20.1 Introduction
20.2 The Process (Dynamic) Model for Rigid Body Motion
20.3 The Sensor Fusion Observational Model
20.4 The Generation of Synthetic Data
20.5 Estimation Methods
20.6 Performance Comparison Analysis
20.7 Conclusions
20.8 Future Work
References
Index
Copyright 2012 by John Wiley & Sons, Inc. All rights reserved
Published by John Wiley & Sons, Inc., Hoboken, New Jersey
Published simultaneously in Canada
No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.
Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.
Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.
Library of Congress Cataloging-in-Publication Data:
Haug, Anton J., 1941–
Bayesian estimation and tracking: a practical guide / Anton J. Haug.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-470-62170-7 (hardback)
1. Bayesian statistical decision theory. 2. Automatic
tracking–Mathematics. 3. Estimation theory. I. Title.
QA279.5.H38 2012
519.5′42–dc23
2011044308
Dedication
To my wife, who inspires us all to achieve at the highest limit of our abilities.
To my children, whose achievements make me proud to be their father.
To my grandchildren, who will always keep me young.
And in memory of my parents, Rose and David, whose love was without bound.
Preface
This book presents a complete development of Bayesian estimation filters from first principles. We consider both linear and nonlinear dynamic systems driven by Gaussian or non-Gaussian noises. Although the dynamic systems are assumed to be continuous, because the observations related to those systems occur at discrete times, only discrete filters are discussed. The primary goal is to present a comprehensive overview of most of the Bayesian estimation methods developed over the past 60 years in a unified approach that shows how each arises from the basic ideas underlying the Bayesian paradigms related to conditional densities.
The prerequisites for understanding the material presented in this book include a basic understanding of linear algebra, Bayesian probability theory, and numerical methods for finite differences and interpolation. Chapter 2 includes a review of all of these topics and is needed for an understanding of the remaining material in the book.
Many of the topics covered in this book grew out of a one semester course taught in the graduate mathematics department at the University of Maryland, College Park. The main goal of that course was to have the students develop their own Matlab® toolbox of target tracking methods. Several very specific tracking problems were presented to the students, and all homework problems consisted of coding each specific tracking (estimation) method into one or more Matlab® subroutines. In general, the subroutines the students developed were stand-alone and could be applied to any of a variety of tracking problems without much difficulty. Since the homework problems were dependent on our specific tracking problem (bearings-only tracking), I decided that this book would not contain any problem sets, allowing anyone using this book in a course to tailor their homework to a tracking problem of their choice. In addition, this book contains four fairly complicated case studies that contain enough material that any instructor can use one of them as the basis of their coding homework problems. The first case study is used as an example throughout most of Parts II and III of this book while Part IV consists of the remaining three case studies with a separate chapter devoted to each.
This book has two emphases. First, detailed derivations from first principles of each estimation method (tracking filter) are presented. Second, through numerous tables and figures, very detailed step-by-step instructions for each method are given that make coding of the tracking filter simple and easy to understand.
It is shown that recursive Bayesian estimation can be developed as the solution to a series of conditional density-weighted integrals of a transition or transformation function. The transition function is one that transitions a dynamic state vector from one time step to the next while the transformation function is one that transforms a state vector into an observation vector. There are a variety of numerical methods for solving these integrals, and each leads to a different estimation method. Each chapter in Parts II and III of this book considers one or more numerical approximations to solving these integrals leading to the class of Kalman filter methods for Gaussian-weighted integrals in Part II and the class of particle filters for density-weighted integrals with unknown densities in Part III.
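For concreteness, the two density-weighted recursion steps referred to above can be written in standard Bayesian filtering notation (the state x_k, observation z_k, and density symbols below are notational placeholders; the book develops its own notation in Chapter 3). The prediction step integrates the transition density against the previous posterior,

\[ p(x_k \mid z_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid z_{1:k-1})\, dx_{k-1}, \]

and the correction step applies Bayes' rule with the observation likelihood,

\[ p(x_k \mid z_{1:k}) = \frac{p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})}{\int p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})\, dx_k}. \]

Here p(x_k | x_{k-1}) embodies the transition function and p(z_k | x_k) the transformation (observation) function described above.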
This book is an outgrowth of many years of research in the field and it is hoped that it will be a significant contribution to the Bayesian estimation and tracking literature. I also hope that it will open up the field in new directions based on the many comments made about how these methods can be enhanced and applied in new ways and to new problems.
Anton J. Haug
Acknowledgments
The author would like to acknowledge the many people who have contributed over the years to his knowledge base on Bayesian estimation and tracking through stimulating conversations.
Thanks are due to my colleagues from my previous position at MITRE, including G. Jacyna, D. Collella, and C. Christou.
Thanks are also due to my colleagues at the Johns Hopkins University Applied Physics Laboratory (JHUAPL), including L. Williams for her inspiring help on the case study in Chapter 18 and for general discussions on the material in the book; C. Davis for discussions on aspects of target tracking and for proofreading parts of the manuscript; and W. Martin for assessing the readability of the entire manuscript. Special thanks go to B. Beltran for the significant contribution of a Matlab® subroutine that computes the Gauss–Hermite sigma points and weights for any state vector dimension. I would also like to thank the JHUAPL Janney Publication Program committee for a one man-month work grant that allowed me to finish the manuscript in a timely fashion, and M. Whisnant for reviewing and commenting on the manuscript before granting a public release.
Finally, I would like to thank John Wiley & Sons for the opportunity to offer this book to the estimation and tracking community. I would also like to thank the anonymous reviewers of my initial book proposal submission for their many very helpful comments. I especially appreciate the role of S. Steitz-Filler, the Mathematics and Statistics Editor at John Wiley & Sons, for her patience over the four years that it took to write this book. She and her colleagues provided valuable advice, support, and technical assistance in transforming my initial manuscript submission into a book that I can be proud of.
List of Figures
1.1 Hierarchy of Bayesian Estimation Tracker Filters.
2.1 Example of Two-Dimensional Cartesian Grid of Axial Vector Points.
2.2 Example of Three-Dimensional Cartesian Grid of Axial Vector Points.
2.3 Two-Dimensional Points for Multidimensional Stirling's Approximation.
2.4 Simplex Figures in up to Three Dimensions.
2.5 One-Dimensional Gaussian pdf Example.
2.6 Example of a Two-Dimensional Gaussian PDF.
2.7 An Example of a One-Dimensional Gaussian CDF.
2.8 An Example of a Two-Dimensional Gaussian CDF.
2.9 Effects of an Affine Transformation on a Gaussian Probability Density Function.
3.1 Depiction of One Step in the Recursive Bayesian Posterior Density Estimation Procedure.
3.2 General Block Diagram for Recursive Point Estimation Process.
4.1 Methodology for Development and Evaluation of Tracking Algorithms.
4.2 DIFAR Buoy Geometry.
4.3 Ships Track with DIFAR Buoy Field and the Associated Bearings for Two Buoys.
4.4 DIFAR Sensor Signal Processing.
4.5 The Spread of Bearing Observations from a DIFAR Buoy from 100 Monte Carlo Runs for Four SNRs.
4.6 DIFAR Likelihood Density for a BT of 10 and SNR from −10 to +10 dB.
5.1 Block Diagram of the Process Flow for the Bayesian Estimator with Gaussian Probability Densities.
6.1 Block Diagram of the LKF Process for a Single DIFAR Buoy.
6.2 Comparison of the LKF Bearing Estimates with the True Bearing for One DIFAR Buoy.
7.1 The Geometry Used for Initialization.
7.2 A Comparison of the Estimated Track of a Ship Transiting the Buoy Field with the True Track.
7.3 Comparison of the EKF Tracker Outputs for Six SNRs.
10.1 Depiction of a Two-Dimensional Simplex with Four Vector Integration Points.
13.1 Graphical Presentation of the Linear Kalman Filter Process Flow.
13.2 A Graphical Representation of the EKF Process Flow.
13.3 Process Flow for General Sigma Point Kalman Filter.
14.1 Error Ellipse Rotation Angle.
14.2 An Example of Error Ellipses Overplotted on the DIFAR UKF Track Output.
14.3 Comparison of the RMS Errors for Five Different Track Estimation Algorithms with the Signal SNR at 20 dB.
14.4 Comparison of the RMS Errors for Five Different Track Estimation Algorithms with the Signal SNR at 15 dB.
14.5 Comparison of the RMS Errors for Five Different Track Estimation Algorithms with the Signal SNR at 10 dB.
14.6 Comparison of the RMS Errors for Five Different Track Estimation Algorithms with the Signal SNR at 5 dB.
14.7 Comparison of the RMS Errors for Five Different Track Estimation Algorithms with the Signal SNR at 0 dB.
15.1 Samples Drawn from a Two-Dimensional Gaussian Mixture Density.
15.2 Two-Dimensional Histogram Based on Samples from a Gaussian Mixture Distribution.
15.3 Gaussian Kernel Density Estimate for Sparse Sample Data.
15.4 Visualization of the Gaussian Kernel Estimate of a Density Generated from Random Samples Drawn for that Density.
15.5 Example of the Creation of .
16.1 Inverse of the Gaussian Cumulative Distribution.
16.2 Comparison of a Gaussian CDF with Its Inverse.
16.3 Principle of Resampling.
16.4 Target Track Generated from a DIFAR Buoy Field Using a Bootstrap Particle Filter.
16.5 A Comparison of Track Outputs at Six Different SNRs for the BPF tracker.
16.6 Track Estimation Results for the Auxiliary Particle Filter Applied to the DIFAR Tracking Case Study.
17.1 Process Flow Diagram for the Gaussian Particle Filter.
17.2 Combination Particle Filter that Uses an EKF as an Importance Density.
17.3 Combination Particle Filter that Uses a Sigma Point Kalman Filter as an Importance Density.
17.4 Monte Carlo Track Plots for the Sigma Point Gaussian Particle Filter for Six SNRs.
17.5 Comparison of the DIFAR Case-Study Root Mean Squared Position Errors for a Signal SNR of 20 dB.
17.6 Comparison of the DIFAR Case-Study Root Mean Squared Position Errors for a Signal SNR of 15 dB.
17.7 Comparison of the DIFAR Case-Study Root Mean Squared Position Errors for a Signal SNR of 10 dB.
18.1 A Simulated Radially Inbound Target Truth Track.
18.2 A Simulated Horizontally Looping Target Truth Track.
18.3 A Simulated Benchmark Target Truth Track.
18.4 Components of Both the Cartesian and Spherical Velocities for the Radially Inbound Trajectory.
18.5 Components of Both the Cartesian and Spherical Velocities for the Loop Trajectory.
18.6 Components of Both the Cartesian and Spherical Velocities for the Benchmark Trajectory.
18.7 Estimated Tracks for the Radially Inbound Target Trajectory for All Cartesian Tracking Methods.
18.8 Estimated Tracks for the Radially Inbound Target Trajectory for All Spherical Tracking Methods.
18.9 RMS Cartesian Position Errors of the Radially Inbound Target Trajectory for the Cartesian and Spherical Tracking Algorithms.
18.10 RMS Spherical Position Errors of the Radially Inbound Target Trajectory for the Cartesian and Spherical Tracking Algorithms.
18.11 Comparison of the Radially Inbound Trajectory RMS Cartesian Position Errors for Varying Values of q for Both Cartesian and Spherical EKF Filters.
18.12 Comparison of the Radially Inbound Trajectory RMS Spherical Position Errors for Varying Values of q for Both Cartesian and Spherical EKF Filters.
18.13 Estimated Tracks for the Looping Target Trajectory for All Cartesian Tracking Methods.
18.14 Estimated Tracks for the Looping Target Trajectory for All Spherical Tracking Methods.
18.15 RMS Cartesian and Spherical Position Errors of the Loop Target Trajectory for the Cartesian Tracking Algorithms.
18.16 RMS Cartesian and Spherical Position Errors of the Loop Target Trajectory for the Spherical Tracking Algorithms.
18.17 Comparison of RMS Cartesian Position Error Performance as a Function of q for the Loop Trajectory.
18.18 Comparison of RMS Spherical Position Error Performance as a Function of q for the Loop Trajectory.
18.19 Estimated Tracks for the Benchmark Target Trajectory for All Cartesian Tracking Methods.
18.20 Estimated Tracks for the Benchmark Target Trajectory for All Spherical Tracking Methods.
18.21 RMS Cartesian and Spherical Position Errors of the Benchmark Target Trajectory for the Cartesian Tracking Algorithms.
18.22 RMS Cartesian and Spherical Position Errors of the Benchmark Target Trajectory for the Spherical Tracking Algorithms.
18.23 Comparison of RMS Cartesian Position Error Performance as a Function of q for the Benchmark Trajectory.
18.24 Comparison of RMS Spherical Position Error Performance as a Function of q for the Benchmark Trajectory.
18.25 The East–North–Up Cartesian Coordinate System.
18.26 Depiction of a Constant Rate Turn in Both the Horizontal and Vertical Planes.
19.1 Photo of a US Navy F-18E Super Hornet Aircraft Preparing to Release Four Mk-62 Stores.
19.2 Close-up of a MK-62 Store Release as Viewed from a Camera Mounted Under the Aircraft's Tail.
19.3 Projection of a Feature Point onto a Camera's Image Plane.
19.4 Translational and Rotational Position Using a Second-Order Model with a UKF Filter.
19.5 Translational and Rotational Velocity Using a Second-Order Model with a UKF Filter.
19.6 Translational and Rotational Acceleration Using a Second-Order Model with a UKF Filter.
19.7 Lateral Position Estimates from Multiple Solvers Using Identical Inputs.
19.8 RMS Errors for Position and Orientation of All Tracking Filters Using Synthetic Data.
19.9 RMS Errors for Position and Orientation Velocities of All Tracking Filters Using Synthetic Data.
19.10 RMS Errors for Position and Orientation Accelerations of All Tracking Filters Using Synthetic Data.
20.1 A Free-Standing IMU with Transmitting Antenna Attached, and An IMU Mounted in the Forward Fuse Well of a 500 Pound Store.
20.2 Translational and Rotational Position Using the Sensor Fusion Estimator.
20.3 Translational and Rotational Velocity Using the Sensor Fusion Estimator.
20.4 Translational and Rotational Acceleration Using the Sensor Fusion Estimator.
20.5 Lateral Position Estimates from Multiple Estimation Filters Using Identical Measurement Inputs.
20.6 RMS Errors for Position and Orientation of All Tracking Filters Using Synthetic Data.
20.7 RMS Errors for Position and Orientation Velocities of All Tracking Filters Using Synthetic Data.
20.8 RMS Errors for Position and Orientation Accelerations of All Tracking Filters Using Synthetic Data.
List of Tables
4.1 Procedure for Generating a Vector of DIFAR Noisy Bearings.
6.1 Linear Kalman Filter Process.
7.1 One-Dimensional Extended Kalman Filter Process
7.2 Multidimensional Extended Kalman Filter Process.
8.1 One-Dimensional Finite Difference Kalman Filter Process.
8.2 Multidimensional Finite Difference Kalman Filter Process.
9.1 Multidimensional Unscented Kalman Filter Process.
9.2 Multidimensional Sigma Point Kalman Filter Process Applied to DIFAR Tracking.
10.1 Multidimensional Spherical Simplex Kalman Filter Process.
11.1 Multidimensional Gauss–Hermite Kalman Filter Process.
12.1 Multidimensional Monte Carlo Kalman Filter Process.
13.1 Summary Data for Sigma Point Kalman Filters: Part 1—Form of c to Be Used for Sigma Points.
13.2 Summary Data for Sigma Point Kalman Filters: Part 2—Form of Sigma Point Weights.
13.3 Comparison of the Number of Integration Points Required for the Various Sigma Point Kalman Filters.
13.4 Hierarchy of Dynamic and Observation Models.
15.1 Common Univariate Kernel Functions of Order 2.
15.2 One-Dimensional Kernel Sample Generation.
15.3 Multidimensional Kernel Sample Generation.
16.1 General Sequential Importance Sampling Particle Filter.
16.2 Sequential Importance Sampling Particle Filter with Resampling.
16.3 Sequential Importance Sampling Particle Filter with Resampling and Regularization.
16.4 Bootstrap SIS Particle Filter with Resampling and Regularization.
16.5 Optimal SIS Particle Filter with Resampling and Regularization.
16.6 Auxiliary Particle Filter Process Flow.
16.7 Unscented Particle Filter with Resampling and Regularization.
18.1 Nominal Values for Initialization State Vector and Covariance Components.
18.2 Initial Characteristics of Three Simulated Scenarios.
19.1 Synthetic Data RMS Positional Error Summary.
19.2 Synthetic Data RMS Velocity Error Summary.
19.3 Synthetic Data RMS Acceleration Error Summary.
20.1 Synthetic Data RMS Positional Error Summary.
20.2 Synthetic Data RMS Velocity Error Summary.
20.3 Synthetic Data RMS Acceleration Error Summary.
Part I
Preliminaries
Chapter 1
Introduction
Estimation and tracking of dynamic systems has been the research focus of many a mathematician since the dawn of statistical mathematics. Many estimation methods have been developed over the past 50 years that allow statistical inference (estimation) for dynamic systems that are linear and Gaussian. In addition, at the cost of increased computational complexity, several methods have shown success in estimation when applied to nonlinear Gaussian systems. However, real-world dynamic systems, both linear and nonlinear, usually exhibit behavior that results in an excess of outliers, indicative of non-Gaussian behavior. The toolbox of standard Gaussian estimation methods has proven inadequate for these problems, resulting in divergence of the estimation filters when applied to such real-world data.
With the advent of high-speed desktop computing, over the past decade the emphasis in mathematics has shifted to the study of dynamic systems that are non-Gaussian in nature. Much of the literature related to performing inference for non-Gaussian systems is highly mathematical in nature and is lacking in practical methodology that the average engineer can utilize without a lot of effort. In addition, several of the Gaussian methods related to estimation for nonlinear systems are presented ad hoc, without a cohesive derivation. Finally, there is a lack of continuity in the conceptual development to date between the Gaussian methods and their non-Gaussian counterparts.
In this book, we will endeavor to present a comprehensive study of the methods currently in use for statistical dynamic system estimation: linear and nonlinear, Gaussian and non-Gaussian. Using a Bayesian framework, we will present a conceptually cohesive roadmap that starts at first principles and leads directly to derivations of many of the Gaussian estimation methods currently in use. We will then extend these concepts into the non-Gaussian estimation realm, where the theory leads directly to working Monte Carlo methods for estimation. Although the Bayesian approach leads to the estimation of statistical densities, in most cases we will develop point estimation methods that can be obtained through the evaluation of density-weighted integrals. Thus, this book is all about numerical methods for evaluating density-weighted integrals for both Gaussian and non-Gaussian densities.
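As an illustration of the kind of integral meant here, the point estimates most commonly sought are the posterior mean and covariance of the state, which in standard filtering notation (a notational placeholder; the book's own notation is introduced in Chapter 3) take the form

\[ \hat{x}_{k|k} = \int x_k\, p(x_k \mid z_{1:k})\, dx_k, \qquad P_{k|k} = \int (x_k - \hat{x}_{k|k})(x_k - \hat{x}_{k|k})^{\top}\, p(x_k \mid z_{1:k})\, dx_k. \]

The Kalman family of Part II approximates such integrals under a Gaussian assumption on the weighting density, while the particle filters of Part III approximate them with weighted Monte Carlo samples.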
For each estimation method that we discuss and derive, we present both pseudo-code and a graphic block diagram that can be used as tools in developing a software-coded tracking toolbox. As an aid in understanding the methods presented, we also discuss what is required to develop simulations for several very specific real-world problems. These case-study problems will be addressed in great detail, with track estimation results presented for each. Since it is hard to compare tracking methods ad hoc, we also present multiple methods to evaluate the relative performance of the various tracking filters.
Inference methods consist of estimating the current values for a set of parameters based on a set of observations or measurements. The estimation procedure can follow one of two models. The first model assumes that the parameters to be estimated, usually unobservable, are nonrandom and constant during the observation window, but the observations are noisy and thus have random components. The second model assumes that the parameters are random variables that have a prior probability distribution and that the observations are noisy as well. When the first model is used for parameter estimation, the procedure is called non-Bayesian or Fisher estimation [1]. Parameter estimation using the second model is called Bayesian estimation.
Bayesian estimation is conceptually very simple. It begins with some initial prior belief, such as the statement "See that ship. It is about 1000 yards from shore and is moving approximately northeast at about 10 knots." Notice that the initial belief statement includes an indication that our initial guess of the position and velocity of the ship is uncertain or random and based on some prior probability distribution. Based on one's initial belief, one can then make the prediction "Since the ship appears to be moving at a constant velocity, it will be over there in about 10 minutes." This statement includes a mental model of the ship motion dynamics as well as some additional uncertainty. Suppose now that one has a small portable radar on hand. The radar can be used to measure (observe) the line-of-sight range and range rate of the ship to within some measure of uncertainty. Given the right mathematical model, one that links the observations to the Cartesian coordinates of the ship's position and velocity, a current radar measurement can be used to update the predicted ship's state (position and velocity).
The above paragraph contains the essence of recursive Bayesian estimation: begin with a prior belief about the state (here, the ship's position and velocity), predict the state forward in time using a dynamic model of its motion, and then update that prediction with a new, noisy observation.
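To make the recursion concrete, the following minimal MATLAB® sketch runs a single predict/update cycle for a constant-velocity model with a noisy position measurement. It is an illustrative toy rather than the range/range-rate radar geometry described above, and it is not code from the book; all variable names and numerical values are assumptions.

% One predict/update cycle of a linear Kalman filter (illustrative toy).
% State x = [position; velocity]; observation z = a noisy position.
dt = 1;                          % time step (s)
F  = [1 dt; 0 1];                % constant-velocity transition matrix
Q  = 0.01 * eye(2);              % process noise covariance (assumed)
H  = [1 0];                      % observation matrix: position only
R  = 0.5;                        % measurement noise variance (assumed)

x = [1000; 5];                   % prior belief: 1000 m away, moving at 5 m/s
P = diag([100 4]);               % prior covariance (assumed)

% Predict: propagate the prior belief through the dynamic model.
x_pred = F * x;
P_pred = F * P * F' + Q;

% Update: correct the prediction with a new noisy measurement.
z = 1006.2;                      % example measurement (assumed)
S = H * P_pred * H' + R;         % innovation covariance
K = P_pred * H' / S;             % Kalman gain
x = x_pred + K * (z - H * x_pred);
P = (eye(2) - K * H) * P_pred;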
This concept was first formalized in a paper by the Reverend Thomas Bayes, read to the Royal Society in 1763 by Richard Price several years after Bayes' death. An excellent review of the history and concepts associated with Bayesian statistical inference can be found in the paper by Stephen Brooks [2]. Brooks' paper also has some interesting examples that contrast the Bayesian method with the so-called "frequentist" method for statistical inference. Since this book is devoted completely to Bayesian methods, we will not address the frequentist approach further and refer the interested reader to Brooks' paper.
As noted above, in this book we will present a cohesive derivation of a subset of modern tracking filters. Figure 1.1 shows the hierarchy of tracking filters that will be addressed in this book. Along the left-hand side are all the Gaussian tracking filters and along the right-hand side are all of the Monte Carlo non-Gaussian filters. This figure will be our guide as we progress through our discussions on each tracking filter. We will use it to locate where we are in our developments. We may occasionally take a side trip into other interesting concepts, such as a discussion of performance measures, but for the most part we will stick to a systematic development from top to bottom and left to right. By the time we reach the bottom right, you the reader will have a comprehensive understanding of the interrelatedness of all of the Bayesian tracking filters.
Figure 1.1 Hierarchy of Bayesian estimation tracker filters.
The objective of this book is to give the reader a firm understanding of Bayesian estimation methods and their interrelatedness. Starting with the first principles of Bayesian theory, we show how each tracking filter is derived from a slight modification to a previous filter. Such a development gives the reader a broader understanding of the hierarchy of Bayesian estimation and tracking. Following the discussions about each tracking filter, the filter is put into both pseudo-code and process flow block diagram form for ease in future recall and reference.
In his seminal book on filtering theory [3], originally published in 1970, Jazwinski stated that "The need for this book is twofold. First, although linear estimation theory is relatively well known, it is largely scattered in the journal literature and has not been collected in a single source. Second, available literature on the continuous nonlinear theory is quite esoteric and controversial, and thus inaccessible to engineers uninitiated in measure theory and stochastic differential equations." A similar statement can be made about the current state of affairs in non-Gaussian Monte Carlo methods of estimation theory. Most of the published work is esoteric and inaccessible to engineers uninitiated in measure theory. The edited volume of invited papers by Doucet et al. [4] is a prime example: it is an excellent collection, but extremely esoteric in many of its stand-alone sections.
In this book, we will take Jazwinski's approach and remove much of the esoteric measure theoretic-based mathematics that makes understanding difficult for the average engineer. Hopefully, we have not replaced it with equally esoteric alternative mathematics.
This book is not an elementary book and is intended as a one semester graduate course or as a reference for anyone requiring or desiring a deeper understanding of estimation and tracking methods. Readers of this book should have a graduate level understanding of probability theory similar to that of the book by Papoulis [5]. The reader should also be familiar with matrix linear algebra and numerical methods including finite differences. In an attempt to reduce the steep prerequisite requirements for the reader, we have included several review sections in the next chapter on some of these mathematical topics. Even though some readers may want to skip these sections, the material presented is integral to an understanding of what is developed in Parts II and III of this book.
Part I consists of this introduction followed by a chapter that presents an overview of some mathematical principles required for an understanding of the estimation methods that follow. The third chapter introduces the concepts of recursive Bayesian estimation for a dynamic system that can be modeled as a potentially unobservable discrete Markov process. The observations (measurements) are related to the system states through an observation model, and the observations are considered to be discrete. Continuous estimation methods are generally not considered in this book. The last chapter of Part I is devoted to preliminary development of a case study that will be used as a working example throughout the book: the problem of tracking a ship through a distributed field of directional frequency analysis and recording (DIFAR) sonobuoys. Included in this case study are demonstrations of methods for developing complete simulations of the system dynamics along with the generation of noisy observations.
Part II is devoted to the development and application of estimation methods for the Gaussian noise case. In Chapter 5, the general Bayesian estimation methods developed in Chapter 3 are rewritten in terms of Gaussian probability densities. Methods for specific Gaussian Kalman filters are derived and codified in Chapters 6 through 12, including the linear Kalman filter (LKF), extended Kalman filter (EKF), finite difference Kalman filter (FDKF), unscented Kalman filter (UKF), spherical simplex Kalman filter (SSKF), Gauss–Hermite Kalman filter (GHKF), and the Monte Carlo Kalman filter (MCKF). With the exception of the MCKF, four of the latter five tracking filters can be lumped into the general category of sigma point Kalman filters, where deterministic vector integration points are used in the evaluation of the Gaussian-weighted integrals needed to estimate the mean and covariance matrix of the state vector. In the MCKF, the continuous Gaussian distribution is replaced by a sampled distribution, reducing the estimation integrals to sums while leaving the nonlinear functions intact. It will be shown in Chapter 13 that the latter five Kalman filter methods can be summarized into a single estimation methodology requiring just a change in the number and location of the vector points used and their associated weights.
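The computational core shared by the sigma point filters just listed can be sketched in a few lines of MATLAB®: a small set of deterministic points and weights approximates a Gaussian-weighted integral of a nonlinear function as a weighted sum. The sketch below uses the unscented point/weight rule with an assumed scaling parameter; the example function, mean, covariance, and parameter values are placeholders, and Chapters 8 through 12 derive the actual rules used by each filter.

% Sigma point approximation of E[f(x)] and Cov[f(x)] for x ~ N(xbar, P).
f    = @(x) [x(1)^2 + x(2); sin(x(2))];   % example nonlinear function (assumed)
xbar = [0.5; 1.0];                        % Gaussian mean (assumed)
P    = [1.0 0.2; 0.2 0.5];                % Gaussian covariance (assumed)

n     = numel(xbar);
kappa = 3 - n;                            % common unscented scaling choice
A     = chol((n + kappa) * P, 'lower');   % matrix square root of scaled covariance
X     = [xbar, xbar + A, xbar - A];       % 2n+1 sigma points (columns)
w     = [kappa, 0.5 * ones(1, 2*n)] / (n + kappa);   % weights sum to one

Y = zeros(2, 2*n + 1);
for i = 1:2*n + 1
    Y(:, i) = f(X(:, i));                 % propagate each sigma point through f
end
ybar = Y * w';                            % weighted sum approximates the mean integral
Pyy  = (Y - ybar) * diag(w) * (Y - ybar)';   % and the covariance integral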
An important aspect of estimation, usually ignored in most books on estimation, is the quantification of performance measures associated with the estimation methods. In Chapter 14 this topic is addressed, with sections on methods for computing and plotting error ellipses based on the estimated covariance matrices for use in real-system environments, as well as methods for computing and plotting root mean squared (RMS) errors and their Cramer–Rao lower bounds (CRLB) for use in Monte Carlo simulation environments. The final section of this chapter is devoted to application of these estimation methods to the DIFAR buoy tracking case study and includes a comparison of performance results as a function of decreasing input signal-to-noise ratio (SNR).
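As a small, concrete example of the Monte Carlo performance measures discussed in Chapter 14, a per-time-step RMS position error across runs can be computed roughly as in the MATLAB® fragment below. The toy truth and estimate arrays are fabricated placeholders used only to make the snippet self-contained.

% RMS position error versus time, averaged over Monte Carlo runs.
K = 100;  M = 50;                                  % time steps and Monte Carlo runs
true_pos = repmat(cumsum(ones(2, K), 2), 1, 1, M); % toy straight-line truth, [2 x K x M]
est_pos  = true_pos + 0.5 * randn(2, K, M);        % toy track estimates with noise

err2 = squeeze(sum((est_pos - true_pos).^2, 1));   % squared position error, [K x M]
rms  = sqrt(mean(err2, 2));                        % RMS error at each time step, [K x 1]
plot(1:K, rms); xlabel('time step'); ylabel('RMS position error');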
Estimation methods for use primarily with non-Gaussian probability densities are the topic addressed in Part III. For the MCKF introduced in Chapter 12 of Part II, the Gaussian density is approximated by a set of discrete Monte Carlo samples, reducing the mean and covariance estimation integrals to weighted sums, usually referred to as the sample mean and sample covariance, respectively. For Gaussian densities, the sample weight is always 1/N, where N is the number of samples used. Non-Gaussian densities present two problems. First, it is usually very difficult to generate a set of Monte Carlo samples directly from the density. Second, the first or second moment may not exist for the density, with the Cauchy density as a prime example. To address the sampling problem, Monte Carlo methods are introduced in Chapter 15 and the concept of importance sampling is developed, leading to estimation methods called particle filters, where the particles are the Monte Carlo sample points. Several problems arise when implementing these particle filters, and potential enhancements that correct for these problems are considered. For importance sampling, the weight for each sample is calculated as the ratio of the non-Gaussian density to the importance density at the sample point. Under certain assumptions, the weights can be calculated recursively, giving rise to the sequential importance sampling (SIS) class of particle filters, the topic of Chapter 16. In Chapter 17, the case where the weights are recalculated at every filter iteration step is addressed, leading to the Gaussian class of combination particle filters. Performance results for all of the particle filter track estimation methods applied to the DIFAR case study conclude Chapter 17.
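As a preview of the importance-sampling idea at the heart of Part III, the MATLAB® fragment below estimates the mean of a target density p that can be evaluated pointwise but is awkward to sample, by drawing from a convenient importance density q and weighting each sample by the ratio p/q. The particular mixture target, Gaussian importance density, and sample count are arbitrary choices made only for illustration.

% Importance sampling: estimate E_p[x] when p can be evaluated but not easily sampled.
gauss = @(x, m, s) exp(-0.5 * ((x - m) ./ s).^2) ./ (s * sqrt(2*pi));  % Gaussian pdf
p = @(x) 0.3 * gauss(x, -2, 1) + 0.7 * gauss(x, 3, 0.5);  % target: a Gaussian mixture
q = @(x) gauss(x, 0, 4);                                  % broad Gaussian importance density

N  = 5000;                     % number of particles
xs = 4 * randn(N, 1);          % draw particles from q (zero mean, standard deviation 4)
w  = p(xs) ./ q(xs);           % importance weights: target density over importance density
w  = w / sum(w);               % normalize the weights to sum to one

mean_est = sum(w .* xs);       % weighted sum approximates E_p[x] = 0.3*(-2) + 0.7*3 = 1.5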
Several recently published books provide additional insight into the topics presented in this book. For the Gaussian Kalman filters of Part II, the books by Bar-Shalom et al. [6] and Candy [7] are good companions. For the non-Gaussian filtering methods of Part III, the books by Doucet et al. [4] and Ristic et al. [8] are excellent references.
It is important to the learning process that the reader be given concrete examples of the application of estimation methods to a set of complex problems. This will be accomplished in this book through the use of simulations written in MATLAB®. We present a set of four case studies that increase in complexity from the first to the last. Each case study will include an outline of how to set up a simulation that models both the dynamics and observations of the system under study. We then show how to create a set of randomly generated observational data using a Monte Carlo methodology. This simulated observational data can then be used to exercise each tracking filter, producing sets of track data that can be compared across multiple track filters.
The first case study examines the problem of tracking a ship as it moves through a distributed field of DIFAR buoys. A DIFAR buoy uses the broadband noise signal radiated from the ship as its input and produces noisy observations of the bearing to the ship as an output. As we will show in Chapter 4, the probability density of the bearing estimates at the DIFAR buoy output is dependent on the SNR of the input signal. The density will be Gaussian for high SNR but will transition to a uniform distribution as the SNR falls. The purpose of this case study will be to examine what happens to the filter tracking performance for each track estimation method as the observation noise transitions from Gaussian to non-Gaussian.
This DIFAR case study will be the primary tool used throughout this book to illustrate each track estimation filter in turn. In Chapter 4, we show how to set up a simulation of the DIFAR buoy processing so as to produce simulated SNR dependent observation sets. Using these sets of bearing observations, in subsequent chapters we exercise each tracking algorithm to produce Monte Carlo sets of track estimates, allowing us to see the impact of the Gaussian to non-Gaussian observation noise transition on each tracking method.
In Part IV of this book, we present three additional case studies that illustrate the use of many of the tracking filters developed in Parts II and III. In Chapter 18, we address the important problem of tracking a maneuvering object in three-dimensional space. In this chapter, we introduce a new approach that uses a constant spherical velocity model rather than the more traditional constant Cartesian velocity model. We show that this spherical model improves performance for tracking a maneuvering object using most of the Gaussian tracking filters.
The third case study, found in Chapter 19, considers the rather complex problem of tracking the dynamics of a falling bomb through the use of video frames of multiple tracking points on both the plane dropping the bomb and the bomb itself. This is a particular example of a complex process called photogrammetry, in which the geometric and dynamic properties of an object are inferred from successive photographic image frames. Thus, this case study consists of a very complex nonlinear multidimensional observational process as well as a nonlinear multidimensional dynamic model. In addition, both the dynamic and observational models are of high dimension, a particularly taxing problem for tracking filters. This case study illustrates the effects of the so-called "curse of dimensionality," showing that it is computationally impractical to apply every tracking filter to problems of this size.
The final case study, the topic of Chapter 20, improves on the use of photogrammetric methods in estimation by showing how a separate estimator can be used to fuse data from additional sensors, such as multiple cameras, translational accelerometers, and angular rate gyroscopes. When used independently, each data source has its unique strengths and weaknesses. When several different sensors are used jointly in an estimator, the resulting solution is usually more accurate and reliable. The resulting analysis shows that estimator-aided sensor fusion can recover meaningful results from flight tests that would otherwise have been considered failures.
References
1. Fisher R. Statistical Methods and Scientific Inference, Revised Edition. Macmillan Pub. Co.; 1973.
2. Brooks SP. Bayesian computation: a statistical revolution. Phil. Trans. R. Soc. Lond. A 2003;361:2681–2697.
3. Jazwinski AH. Stochastic Processes and Filtering Theory. Academic Press; 1970. Republished in paperback by Dover Publications; 2007.
4. Doucet A, de Freitas JFG, Gordon NJ, editors. Sequential Monte Carlo Methods in Practice. New York, NY: Springer-Verlag; 2001.
5. Papoulis A. Probability, Random Variables, and Stochastic Processes. 4th ed. McGraw-Hill; 2002.
6. Bar-Shalom Y, Li XR, Kirubarajan T. Estimation with Applications to Tracking and Navigation: Theory, Algorithms and Software. Wiley; 2001.
7. Candy JV. Bayesian Signal Processing: Classical, Modern, and Particle Filtering Methods. Hoboken, NJ: Wiley; 2009.
8. Ristic B, Arulampalam S, Gordon N. Beyond the Kalman Filter: Particle Filters for Tracking Applications. Boston, MA: Artech House; 2004.
