
Advanced Textbooks in Control and Signal Processing

Springer-Verlag London Ltd.

Series Editors
Professor Michael J. Grimble, Professor of Industrial Systems and Director
Professor Michael A. Johnson, Professor of Control Systems and Deputy Director
Industrial Control Centre, Department of Electronic and Electrical Engineering, University of Strathclyde, Graham Hills Building, 50 George Street, Glasgow G1 1QE, U.K.

Other titles published in this series:

Genetic Algorithms: Concepts and Designs
K.F. Man, K.S. Tang and S. Kwong

Model Predictive Control
E.F. Camacho and C. Bordons

Introduction to Optimal Estimation
E.W. Kamen and J.K. Su

Discrete-time Signal Processing
D. Williamson

Neural Networks for Modelling and Control of Dynamic Systems
M. Nørgaard, O. Ravn, N.K. Poulsen and L.K. Hansen

Modelling and Control of Robot Manipulators (2nd Edition)
L. Sciavicco and B. Siciliano

Fault Detection and Diagnosis in Industrial Systems
L.H. Chiang, E.L. Russell and R.D. Braatz

Soft Computing
L. Fortuna, G. Rizzotto, M. Lavorgna, G. Nunnari, M.G. Xibilia and R. Caponetto

Statistical Signal Processing
T. Chonavel, translated by Janet Ormrod

Parallel Processing for Real-time Signal Processing and Control
M.O. Tokhi, M.A. Hossain and M.H. Shaheed
Publication due January 2003

T. Söderström

Discrete-time Stochastic Systems
Estimation and Control

Springer

T. Söderström, PhD
Department of Systems and Control, Information Technology, Uppsala University, PO Box 337, SE 751 05 Uppsala, Sweden

British Library Cataloguing in Publication Data
Söderström, Torsten

Discrete-time stochastic systems : estimation and control. - 2nd ed. - (Advanced textbooks in control and signal processing)
1. Stochastic systems  2. Discrete-time systems  3. Control theory
I. Title
003.8'5
ISBN 978-1-85233-649-3

Library of Congress Cataloging-in-Publication Data
Söderström, Torsten.

Discrete-time stochastic systems : estimation and control / T. Söderström. -- 2nd ed.
p. cm. -- (Advanced textbooks in control and signal processing, ISSN 1439-2232)

Includes bibliographical references and index.
ISBN 978-1-85233-649-3
ISBN 978-1-4471-0101-7 (eBook)
DOI 10.1007/978-1-4471-0101-7
1. Automatic control. 2. Control theory. 3. Discrete-time systems. I. Title. II. Series.

TJ213 .S517 2002 629.8--dc21 2002067022

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms of licences issued by the Copyright Licensing Agency. Enquiries concerning reproduction outside those terms should be sent to the publishers.

ISSN 1439-2232
ISBN 978-1-85233-649-3

http://www.springer.co.uk

© Springer-Verlag London 2002
Originally published by Springer-Verlag London Limited in 2002

The use of registered names, trademarks etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant laws and regulations and therefore free for general use.

The publisher makes no representation, express or implied, with regard to the accuracy of the information contained in this book and cannot accept any legal responsibility or liability for any errors or omissions that may be made.

Typesetting: Electronic text files prepared by author

69/3830-543210 Printed on acid-free paper SPIN 10878170

To Marianne, Johanna, Andreas and David

Series Editors' Foreword

The topics of control engineering and signal processing continue to flourish and develop. In common with general scientific investigation, new ideas, concepts and interpretations emerge quite spontaneously and these are then discussed, used, discarded or subsumed into the prevailing subject paradigm. Sometimes these innovative concepts coalesce into a new sub-discipline within the broad subject tapestry of control and signal processing. This preliminary battle between old and new usually takes place at conferences, through the Internet and in the journals of the discipline. After a little more maturity has been acquired by the new concepts, archival publication as a scientific or engineering monograph may occur.

A new concept in control and signal processing is known to have arrived when sufficient material has evolved for the topic to be taught as a specialised tutorial workshop or as a course to undergraduate, graduate or industrial engineers. Advanced Textbooks in Control and Signal Processing are designed as a vehicle for the systematic presentation of course material for both popular and innovative topics in the discipline. It is hoped that prospective authors will welcome the opportunity to publish a structured and systematic presentation of some of the newer emerging control and signal processing technologies.

The essentials for any advanced course on control include a thorough understanding of state-space systems, both continuous- and discrete-time, the concepts of stochastic systems and insights into their optimal control systems. This textbook on Discrete-time Stochastic Systems by Torsten Söderström provides an invaluable introduction to these topics. It is a revised edition of an earlier Prentice Hall textbook which has benefited from a decade of classroom experience with Professor Söderström's course at Uppsala University, Sweden.

Apart from being used to support a full course, the text also has some interesting and useful features for the individual reader. The chapters are exceptionally well structured and can be used in a reference book fashion to instruct or review different technical topics, for example, spectral factorization. Unlike many linear stochastic control textbooks, Professor Söderström has given both time-domain and polynomial methods, proofs and techniques for topics like linear filtering and stochastic control systems. There are strong and fascinating links between these two approaches and it is invaluable to have them presented together in a single course textbook.

Every course lecturer likes to point their students to some topics which are a little more challenging and which might lead on to an interest deepening into research. Professor Söderström has included a chapter on nonlinear filtering which demonstrates how linear methods can be extended to deal with the more difficult nonlinear system problems. Each chapter is also accompanied by a bibliographical list of books and references to further reading for the interested reader.

The Advanced Textbooks in Control and Signal Processing series seeks to create a set of books that are essential to a fundamental knowledge of the control and signal processing area. Professor Söderström's text is a welcome complement to other books in the series, like Kamen and Su's Introduction to Optimal Estimation (ISBN 1-85233-133-X) and Williamson's Discrete-time Signal Processing (ISBN 1-85233-161-5), and we hope you will add Discrete-time Stochastic Systems to your library.

M.J. Grimble and M.A. Johnson
Industrial Control Centre
Glasgow, Scotland, U.K.

April 2002

Preface

This book has been written for graduate courses in stochastic dynamic systems. It has emerged from various lecture notes (in Swedish and in English) that I have compiled and used in different courses at Uppsala University since 1978.

The current text is a second edition of a book originally published by Prentice Hall International in 1994. All chapters have been revised. A number of typographical and other errors have been corrected. Various new material and results, including further problems, have been added.

The reader is assumed to be somewhat familiar with dynamic systems and stochastic processes. In particular, it is assumed that the reader has a working knowledge of the following areas (or is prepared to re-examine this background elsewhere, should it be necessary):

• Fundamentals of linear discrete-time systems (such as state space models and their relationships with transfer function operators and weighting functions).

• Fundamentals of probability theory (including Gaussian distributed random vectors and conditional probabilities).

• Fundamentals of linear algebra and matrix calculations.

• Fundamentals of stochastic processes (such as the concepts of covariance function and spectral density, particularly in discrete time).

In compiling the manuscript, I have taken inspiration from various sources, including other books. Some parts reflect my own findings and derivations. The bibliographical notes at the end of each chapter give hints for further reading. These notes have intentionally been kept brief and there is no ambition to supply comprehensive lists of references. The cited references do contain, in many cases, extensive publication lists. Many books deal with the fundamentals of linear stochastic systems, using analysis of state space models leading to the celebrated Kalman filter. Treatments using polynomial methods, as presented here, seem much less frequent in the literature. The same comment applies to extensions to nonlinear cases and higher-order statistics.

Most of the chapters contain problems to be treated as exercises by the reader. Many of these are of pen-and-pencil type, while others require numerical computation using computer programs. Some problems are straightforward illustrations of the results in the text. Several problems, however, are designed to present extensions and to give further insight. To achieve knowledge and understanding of the estimation and control of stochastic systems, it is of great importance that the user gains experience by his or her own work. I hope that the problem sections will stimulate the reader to gain such experience. When using the text for graduate courses I have let the students work with some of the problems as an integral part of the course examination.

The structure of the book is as follows.

Chapter 2 begins by giving a short review of probability theory. Some useful properties of conditional probabilities and densities, and of Gaussian variables are stated. One section is devoted to complex-valued Gaussian variables.

Various model descriptions of stochastic dynamic systems are illustrated in Chapter 3. Some basic definitions are reviewed briefly. The important concepts of a Markov process and a state vector in a stochastic setting are discussed. Properties of covariance functions, spectra, and higher-order moments are also presented.

Chapter 4 covers the analysis of linear stochastic systems, with an emphasis on second-order properties such as the propagation of the covariance function and the spectral density in a system. Spectral factorization, which is a very fundamental topic, is also treated. It is concerned with how to proceed from a specified spectrum to a filter description of a signal, so constructed that optimal prediction and control can be derived easily from that filter model.

The topic of Chapter 5 is optimal estimation, where "optimal" refers to mean square optimality (i.e. the estimation error variance is minimized). Under certain conditions more general performance measures are also minimized. The general theory is given, showing that the optimal estimate can often be described by a conditional expectation.

The celebrated Kalman filter is derived in Chapter 6, using the results of the previous chapter. Optimal prediction and smoothing algorithms (assuming that future data are available) are presented for a general linear state space model.

Optimal prediction for processes given in filter or polynomial form is presented in Chapter 7. The basic relations for Wiener filtering are also derived. A general (single input, single output) estimation problem is solved by applying the Wiener filter technique and using a polynomial formalism. Where a time-invariant input-output perspective is sufficient, this gives a convenient and interesting alternative to the state space methodology based on the Kalman filter. The solution is naturally the same in the time-invariant case using either approach.

Chapter 8 is devoted to an example in which a detailed treatment using both the state space and the polynomial approaches is examined. The optimal filters, error variances, frequency characteristics, and so on, are examined, and it is shown how they are influenced by the measurement noise, etc. The calculations illustrate the close connections between the polynomial and the state space approaches.

Many systems are inherently nonlinear. In Chapter 9, some nonlinear filters and nonlinear effects such as quantization are dealt with. The greater part of the chapter deals with the extended Kalman filter and some variants thereof.

The topic of Chapter 10 is the control of stochastic systems. It is first shown that the way in which uncertainties are introduced can make a distinct difference to the way in which the system should be optimally controlled. Next, the general optimal control problem is handled by using the principle of dynamic programming. As this approach, while theoretically very interesting, requires extreme amounts of computation, some suboptimal schemes are also discussed.

Finally, optimal control for linear systems that are perturbed by process and measurement noise is the topic of Chapter 11. This problem, known as linear quadratic Gaussian control, has an elegant solution based on the so-called separation theorem, and this is also described. Also, the use of polynomial formalism to derive some simple optimal controllers and some comparisons with state space formalism are included.

It is worth stressing here that the Glossary (pp. xiii-xv) contains items that appear frequently in the book.

Several people have contributed directly or indirectly to these lecture notes. As mentioned above, I have been teaching stochastic systems regularly in Uppsala and elsewhere for more than a decade. The feedback I have received from the many students over the years has been very valuable in compiling the manuscript. For the second edition I thank all those who have pointed out unclear points and errors in the first version. Special thanks go to Fredrik Sandquist, who detected a tricky mistake, to Dr Erik G. Larsson who pointed out a flaw in a proof, and to Professor Torbjörn Wigren who gave many suggestions for improving the chapter on nonlinear estimation and also provided some further exercises. Dr Egil Sviestins and Dr Niclas Bergman have contributed with valuable comments on nonlinear estimation.

Several students who recently used the text in a graduate course have pointed out various typos or unclear points. I am grateful to Emad Abd-ElRady, Richard Abrahamsson, Bharath Bhikkaji, Hong Cui, Mats Ekman, Kjartan Halvorsen, Bengt Johansson, Erik K. Larsson, Kaushik Mahata, Hans Norlander and Erik Ohlander for numerous valuable comments. Needless to say, the responsibility for any remaining errors rests upon me.

Last, but not least, I also acknowledge the nice and smooth cooperation with Springer-Verlag, and the persons who in some way or another have been involved in the production of this book: Professor Michael Grimble, Professor Michael Johnson, Ms Catherine Drury, Mr Frank Holzwarth, Mr Oliver Jackson and Mr Peter Lewis.


To all the above people I express my sincere thanks.

In control, feedback is an essential concept. This is also true for writing books. I welcome comments by the readers, and can be reached at the email address [email protected].

Torsten Söderström
Department of Systems and Control
Uppsala University
PO Box 337, SE 751 05 Uppsala
Sweden

Spring 2002

Glossary

Notation

C  complex plane
CN  complex Gaussian distribution
E  expectation operator
e(t)  white noise (a sequence of independent random variables)
G(q)  transfer function operator
H(q)  transfer function operator, noise shaping filter
h  sampling interval
h_k  weighting function coefficient, H(q) = sum_{k=0}^inf h_k q^{-k}
I  identity matrix
I_n  (n|n) identity matrix
i  imaginary unit
K  optimal predictor gain (in Kalman filter)
K_f  optimal filter gain (in Kalman filter)
K_p  optimal predictor gain (in Kalman filter)
L  optimal feedback gain
L (script)  Laplace transform
(m|n)  matrix is m by n
N(m, P)  normal (Gaussian) distribution of mean value m and covariance matrix P
n  model order
P  probability
P  transition matrix for a Markov chain
P(t)  covariance matrix of state or state prediction error
p  differentiation operator
p(x|y)  conditional probability of x given y
q  shift operator, qx(t) = x(t + 1)
q^{-1}  backward shift operator, q^{-1}x(t) = x(t - 1)
R  real axis
r(t, s)  covariance function
r(tau)  covariance function
r_yu(tau)  cross-covariance function between y(t) and u(t)
S(t)  solution to Riccati equation for optimal control
A^T  transpose of the matrix A
t  time variable (integer-valued for discrete-time models)
tr  trace (of a matrix)
u(t)  input signal (possibly vector-valued)
V  loss function, performance index
v(t)  white noise (a sequence of independent random variables)
x(t)  state vector
Y^t  all available output measurements at time t, Y^t = {y(t), y(t - 1), ...}
y(t)  output signal (possibly vector-valued)
yhat(t|t-1)  optimal one-step predictor of y(t)
ytilde(t)  prediction error, innovation
Z  z transform
gamma(x; m, P)  pdf of normal (Gaussian) distribution of mean value m and covariance matrix P
Delta  difference
delta_{t,s}  Kronecker delta (= 1 if s = t, else = 0)
deltabar(m, n)  extended Kronecker delta (= 1 if m = n = 0, else = 0)
delta(tau)  Dirac function
epsilon(t)  prediction error
Lambda  covariance matrix of innovations
lambda^2  variance of white noise
rho  control weighting
Phi  Euler matrix
phi_cl(z)  closed loop characteristic polynomial
phi_ol(z)  open loop characteristic polynomial
phi(z)  spectrum
phi(omega)  spectral density
phi_u(omega)  spectral density of the signal u(t)
phi_yu(omega)  cross-spectral density between the signals y(t) and u(t)
phibar(z)  positive real part of spectrum
phi_c(omega)  characteristic function
tau  time lag (in covariance function)
omega  angular frequency

Abbreviations

AR  autoregressive
AR(n)  AR of order n
ARE  algebraic Riccati equation
ARMA  autoregressive moving average
ARMA(n, m)  ARMA where AR and MA parts have orders n and m, respectively
ARMAX  autoregressive moving average with exogenous input
cov  covariance matrix
deg  degree
FIR  finite impulse response
GMV  generalized minimum variance
GPC  generalized predictive control
iid  independent and identically distributed
IIR  infinite impulse response
IMM  interacting multiple models
LLMS  linear least mean square
LQG  linear quadratic Gaussian
LTR  loop transfer recovery
MA  moving average
MA(n)  MA of order n
ML  maximum likelihood
pdf  probability density function
SISO  single input, single output
tr  trace (of a matrix)


Notational Conventions

H^{-1}(q) = [H(q)]^{-1}
x^T(t) = [x(t)]^T
A^{-T} = [A^{-1}]^T
A >= B  the difference matrix A - B is nonnegative definite
A > B  the difference matrix A - B is positive definite
:=  defined as
~  distributed as
[.]_+  causal part of a transfer function
wbar  complex conjugate of w
w^T  transpose of w
w*  conjugate transpose of w, w* = wbar^T
w^{-*}  (w^{-1})* = (w*)^{-1}

Conventions for Polynomials

A(z) = z^n + a_1 z^{n-1} + ... + a_n
A*(z) = [A(z)]* = z*^n + a_1* z*^(n-1) + ... + a_n*
A*(z^{-*}) = z^{-n} + a_1* z^{-(n-1)} + ... + a_n*
A(z^{-1}) = z^{-n} + a_1 z^{-(n-1)} + ... + a_n

If A(z) has real-valued coefficients

A*(z^{-*}) = A(z^{-1})

On the unit circle

z* = z^{-1},  A*(z) = A*(z^{-*})
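As a quick sanity check of these conventions (a first-order example of my own, not from the text), take A(z) = z + a with a real coefficient a:

```latex
\begin{align*}
A(z)          &= z + a, \qquad a \in \mathbb{R},\\
A^{*}(z)      &= z^{*} + a^{*} = z^{*} + a,\\
A^{*}(z^{-*}) &= (z^{-*})^{*} + a = z^{-1} + a,\\
A(z^{-1})     &= z^{-1} + a.
\end{align*}
% Hence A^*(z^{-*}) = A(z^{-1}), as stated for real coefficients, and on
% the unit circle, where z^* = z^{-1}, also A^*(z) = z^{-1} + a = A(z^{-1}).
```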

Contents

1. Introduction  1
   1.1 What is a Stochastic System?  1
   Bibliography  5

2. Some Probability Theory  7
   2.1 Introduction  7
   2.2 Random Variables and Distributions  7
       2.2.1 Basic Concepts  7
       2.2.2 Gaussian Distributions  11
       2.2.3 Correlation and Dependence  12
   2.3 Conditional Distributions  12
   2.4 The Conditional Mean for Gaussian Variables  14
   2.5 Complex-Valued Gaussian Variables  17
       2.5.1 The Scalar Case  17
       2.5.2 The Multivariable Case  19
       2.5.3 The Rayleigh Distribution  24
   Exercises  26
   Bibliography  27

3. Models  29
   3.1 Introduction  29
   3.2 Stochastic Processes  30
   3.3 Markov Processes and the Concept of State  33
   3.4 Covariance Function and Spectrum  36
   3.5 Bispectrum  46
   3.A Appendix. Linear Complex-Valued Signals and Systems  48
       3.A.1 Complex-Valued Model of a Narrow-Band Signal  48
       3.A.2 Linear Complex-Valued Systems  49
   3.B Appendix. Markov Chains  51
   Exercises  55
   Bibliography  58


4. Analysis  59
   4.1 Introduction  59
   4.2 Linear Filtering  59
       4.2.1 Transfer Function Models  59
       4.2.2 State Space Models  61
       4.2.3 Yule-Walker Equations  67
   4.3 Spectral Factorization  71
       4.3.1 Transfer Function Models  71
       4.3.2 State Space Models  74
       4.3.3 An Example  78
   4.4 Continuous-time Models  80
       4.4.1 Covariance Function and Spectra  80
       4.4.2 Spectral Factorization  81
       4.4.3 White Noise  82
       4.4.4 Wiener Processes  83
       4.4.5 State Space Models  84
   4.5 Sampling Stochastic Models  86
       4.5.1 Introduction  86
       4.5.2 State Space Models  87
       4.5.3 Aliasing  88
   4.6 The Positive Real Part of the Spectrum  90
       4.6.1 ARMA Processes  90
       4.6.2 State Space Models  95
       4.6.3 Continuous-time Processes  98
   4.7 Effect of Linear Filtering on the Bispectrum  99
   4.8 Algorithms for Covariance Calculations and Sampling  103
       4.8.1 ARMA Covariance Function  103
       4.8.2 ARMA Cross-Covariance Function  105
       4.8.3 Continuous-Time Covariance Function  107
       4.8.4 Sampling  108
       4.8.5 Solving the Lyapunov Equation  110
   4.A Appendix. Auxiliary Lemmas  111
   Exercises  114
   Bibliography  121

5. Optimal Estimation  123
   5.1 Introduction  123
   5.2 The Conditional Mean  123
   5.3 The Linear Least Mean Square Estimate  126
   5.4 Propagation of the Conditional Probability Density Function  128
   5.5 Relation to Maximum Likelihood Estimation  130
   5.A Appendix. A Lemma for Optimality of the Conditional Mean  133
   Exercises  134
   Bibliography  135


6. Optimal State Estimation for Linear Systems  137
   6.1 Introduction  137
   6.2 The Linear Least Mean Square One-Step Prediction and Filter Estimates  138
   6.3 The Conditional Mean  145
   6.4 Optimal Filtering and Prediction  147
   6.5 Smoothing  148
       6.5.1 Fixed Point Smoothing  149
       6.5.2 Fixed Lag Smoothing  150
   6.6 Maximum a posteriori Estimates  153
   6.7 The Stationary Case  155
   6.8 Algorithms for Solving the Algebraic Riccati Equation  159
       6.8.1 Introduction  159
       6.8.2 An Algorithm Based on the Euler Matrix  161
   6.A Appendix. Proofs  165
       6.A.1 The Matrix Inversion Lemma  165
       6.A.2 Proof of Theorem 6.1  166
       6.A.3 Two Determinant Results  171
   Exercises  171
   Bibliography  182

7. Optimal Estimation for Linear Systems by Polynomial Methods  185
   7.1 Introduction  185
   7.2 Optimal Prediction  185
       7.2.1 Introduction  185
       7.2.2 Optimal Prediction of ARMA Processes  187
       7.2.3 A General Case  191
       7.2.4 Prediction of Nonstationary Processes  193
   7.3 Wiener Filters  194
       7.3.1 Statement of the Problem  194
       7.3.2 The Unrealizable Wiener Filter  196
       7.3.3 The Realizable Wiener Filter  197
       7.3.4 Illustration  199
       7.3.5 Algorithmic Aspects  200
       7.3.6 The Causal Part of a Filter, Partial Fraction Decomposition and a Diophantine Equation  202
   7.4 Minimum Variance Filters  205
       7.4.1 Introduction  205
       7.4.2 Solution  206
       7.4.3 The Estimation Error  207
       7.4.4 Extensions  209
       7.4.5 Illustrations  211
   7.5 Robustness Against Modelling Errors  215
   Exercises  218
   Bibliography  220

8. Illustration of Optimal Linear Estimation  223
   8.1 Introduction  223
   8.2 Spectral Factorization  223
   8.3 Optimal Prediction  225
   8.4 Optimal Filtering  227
   8.5 Optimal Smoothing  228
   8.6 Estimation Error Variance  232
   8.7 Weighting Pattern  234
   8.8 Frequency Characteristics  235
   Exercises  242

9. Nonlinear Filtering  245
   9.1 Introduction  245
   9.2 Extended Kalman Filters  245
       9.2.1 The Basic Algorithm  245
       9.2.2 An Iterated Extended Kalman Filter  247
       9.2.3 A Second-order Extended Kalman Filter  248
       9.2.4 An Example  250
   9.3 Gaussian Sum Estimators  254
   9.4 The Multiple Model Approach  257
       9.4.1 Introduction  257
       9.4.2 Fixed Models  257
       9.4.3 Switching Models  259
       9.4.4 Interacting Multiple Models Algorithm  260
   9.5 Monte Carlo Methods for Propagating the Conditional Probability Density Functions  265
   9.6 Quantized Measurements  269
   9.7 Median Filters  270
       9.7.1 Introduction  270
       9.7.2 Step Response  271
       9.7.3 Response to Sinusoids  272
       9.7.4 Effect on Noise  273
   9.A Appendix. Auxiliary Results  277
       9.A.1 Analysis of the Sheppard Correction  277
       9.A.2 Some Probability Density Functions  280
   Exercises  281
   Bibliography  294

10. Introduction to Optimal Stochastic Control  297
    10.1 Introduction  297
    10.2 Some Simple Examples  297
         10.2.1 Introduction  297
         10.2.2 Deterministic System  297
         10.2.3 Random Time Constant  298
         10.2.4 Noisy Observations  299
         10.2.5 Process Noise  300
         10.2.6 Unknown Time Constants and Measurement Noise  300
         10.2.7 Unknown Gain  301
    10.3 Mathematical Preliminaries  303
    10.4 Dynamic Programming  304
         10.4.1 Deterministic Systems  305
         10.4.2 Stochastic Systems  306
    10.5 Some Stochastic Controllers  311
         10.5.1 Dual Control  313
         10.5.2 Certainty Equivalence Control  313
         10.5.3 Cautious Control  314
    Exercises  316
    Bibliography  317

11. Linear Quadratic Gaussian Control  319
    11.1 Introduction  319
    11.2 The Optimal Controllers  320
         11.2.1 Optimal Control of Deterministic Systems  320
         11.2.2 Optimal Control with Complete State Information  323
         11.2.3 Optimal Control with Incomplete State Information  324
    11.3 Duality Between Estimation and Control  326
    11.4 Closed Loop System Properties  328
         11.4.1 Representations of the Regulator  328
         11.4.2 Representations of the Closed Loop System  329
         11.4.3 The Closed Loop Poles  331
    11.5 Linear Quadratic Gaussian Design by Polynomial Methods  332
         11.5.1 Problem Formulation  332
         11.5.2 Minimum Variance Control  333
         11.5.3 The General Case  337
    11.6 Controller Design by Linear Quadratic Gaussian Theory  344
         11.6.1 Introduction  344
         11.6.2 Choice of Observer Poles  351
    11.A Appendix. Derivation of the Optimal Linear Quadratic Gaussian Feedback and the Riccati Equation from the Bellman Equation  360
    Exercises  362
    Bibliography  364

Answers to Selected Exercises 367

Index 373