Princeton MAE 345 - State and Parameter Estimation


State and Parameter Estimation
Robert Stengel
Robotics and Intelligent Systems, MAE 345, Princeton University, 2013

Learning Objectives
• Compute least-squares estimates of a constant vector
  – Unweighted and weighted batch processing of noisy data
  – Recursive processing to incorporate new data
• Estimate the state of an uncertain linear dynamic system with incomplete, noisy measurements
  – Discrete-time Kalman filter
  – Continuous-time Kalman-Bucy filter
• Estimate the parameters of a linear system model
  – Weighted least-squares estimation
  – Extended Kalman-Bucy filter

Copyright 2013 by Robert Stengel. All rights reserved. For educational use only.
http://www.princeton.edu/~stengel/MAE345.html

Estimate of a Constant Vector by Inverse Transformation
• Given measurements, y, of a constant vector, x, estimate x
• Assume that the output, y, is a perfect measurement and H is invertible:
    y = H x
  – y: (n x 1) output vector
  – H: (n x n) output matrix
  – x: (n x 1) vector to be estimated
• The estimate is based on inverse transformation:
    \hat{x} = H^{-1} y

Imperfect Measurement of a Constant Vector
• Given noisy measurements, z, of a constant vector, x, the effects of error can be reduced if the measurement is redundant
• Noise-free output:
    y = H x
• Measurement of the output with error:
    z = y + n = H x + n
  – z: (k x 1) measurement vector
  – n: (k x 1) error vector
  – y: (k x 1) output vector
  – H: (k x n) output matrix, k > n
  – x: (n x 1) vector to be estimated

Least-Squares Estimate of a Constant Vector
• Measurement-error residual:
    \epsilon = z - H \hat{x} = z - \hat{y},   dim(\epsilon) = (k x 1)
• The squared measurement error (a quadratic norm) is the cost function, J; the control parameter is \hat{x}, the estimate of x, with dim(\hat{x}) = (n x 1):
    J = (1/2) \epsilon^T \epsilon = (1/2) (z - H \hat{x})^T (z - H \hat{x})
      = (1/2) (z^T z - \hat{x}^T H^T z - z^T H \hat{x} + \hat{x}^T H^T H \hat{x})
• The derivative of a scalar, J, with respect to a vector, \hat{x} (i.e., the gradient), is defined to be a row vector:
    \partial J / \partial \hat{x} = [\partial J / \partial \hat{x}_1   \partial J / \partial \hat{x}_2   ...   \partial J / \partial \hat{x}_n]
• Necessary condition for a minimum:
    \partial J / \partial \hat{x} = 0 = (1/2) [0 - (H^T z)^T - z^T H + (H^T H \hat{x})^T + \hat{x}^T H^T H]
• The 2nd and 4th terms are transposes of the 3rd and 5th terms, so
    \partial J / \partial \hat{x} = -z^T H + \hat{x}^T H^T H = 0

Optimal Estimate of x
• Rearranging:
    \hat{x}^T H^T H = z^T H
• Then the optimal estimate is
    \hat{x}^T = z^T H (H^T H)^{-1}   (row)
    \hat{x} = (H^T H)^{-1} H^T z   (column)
• (H^T H)^{-1} H^T is called the pseudoinverse of H

Is the Least-Squares Solution a Minimum or a Maximum?
• Gradient:
    \partial J / \partial \hat{x} = -z^T H + \hat{x}^T H^T H = 0
• Hessian matrix:
    \partial^2 J / \partial \hat{x}^2 = H^T H > 0,   dim = (n x n)
• The Hessian is positive definite, so the solution is a minimum

Estimation of a Scalar Constant: Average Weight of the Jelly Beans
• Measurements are equally uncertain; express them as
    z_i = x + n_i,   i = 1 to k
  or z = H x + n, with output matrix H = [1 1 ... 1]^T
• Optimal estimate:
    \hat{x} = (H^T H)^{-1} H^T z,   (1 x 1) = [(1 x k)(k x 1)]^{-1} (1 x k)(k x 1)
    \hat{x} = (k)^{-1} (z_1 + z_2 + ... + z_k) = (1/k) \sum_{i=1}^{k} z_i
• The optimal estimate is the sample mean value, i.e., the average of the measurements

Measurements of Differing Quality
• Original cost function, J, and optimal estimate of x:
    J = (1/2) \epsilon^T \epsilon = (1/2) (z - H \hat{x})^T (z - H \hat{x})
    \hat{x} = (H^T H)^{-1} H^T z
• Suppose some elements of the measurement, z = H x + n, are more uncertain than others
• Give the more uncertain measurements less weight in arriving at the minimum-cost estimate

Error-Weighted Cost Function
• The weighted cost function, J, with error-weighting matrix R^{-1}, reduces the significance of the poorer measurements:
    J = (1/2) \epsilon^T R^{-1} \epsilon = (1/2) (z - H \hat{x})^T R^{-1} (z - H \hat{x})
      = (1/2) (z^T R^{-1} z - \hat{x}^T H^T R^{-1} z - z^T R^{-1} H \hat{x} + \hat{x}^T H^T R^{-1} H \hat{x})
• Measurement uncertainty matrix, R (large is worse), and its inverse:
    R = diag(large error, small error, ..., medium error)
    R^{-1} = diag(low weight, high weight, ..., medium weight)

Weighted Least-Squares Estimate of a Constant Vector
• Necessary condition for a minimum of the weighted cost function:
    \partial J / \partial \hat{x} = 0 = (1/2) [0 - (H^T R^{-1} z)^T - z^T R^{-1} H + (H^T R^{-1} H \hat{x})^T + \hat{x}^T H^T R^{-1} H]
    \hat{x}^T H^T R^{-1} H = z^T R^{-1} H
• The weighted optimal estimate is
    \hat{x} = (H^T R^{-1} H)^{-1} H^T R^{-1} z

Weighted Estimate of Average Jelly Bean Weight
• Error-weighting matrix based on the measurement-noise standard deviations:
    R^{-1} = A = diag(1/\sigma_{n_1}^2, 1/\sigma_{n_2}^2, ..., 1/\sigma_{n_k}^2) = diag(a_{11}, a_{22}, ..., a_{kk})
• Optimal estimate of the average jelly bean weight:
    \hat{x} = (H^T R^{-1} H)^{-1} H^T R^{-1} z = \sum_{i=1}^{k} a_{ii} z_i / \sum_{i=1}^{k} a_{ii}

Recursive Least-Squares Estimation
• The prior unweighted and weighted least-squares estimators use a batch-processing approach
  – All information is gathered prior to processing
  – All information is processed at once
• Recursive approach
  – An optimal estimate has been made from the prior measurement set
  – A new measurement set is obtained
  – The optimal estimate is improved by an incremental change (or correction) to the prior optimal estimate

Addition of New Measurement
[The text preview ends here.]
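The batch estimate \hat{x} = (H^T H)^{-1} H^T z can be checked numerically with the jelly-bean example, where H is a column of ones and the estimate reduces to the sample mean. A minimal NumPy sketch (measurement values assumed for illustration):

```python
import numpy as np

# k noisy measurements of a scalar constant x: z_i = x + n_i,
# so the output matrix H is a (k x 1) column of ones.
z = np.array([2.1, 1.9, 2.0, 2.2, 1.8])
H = np.ones((len(z), 1))

# Least-squares estimate: x_hat = (H^T H)^{-1} H^T z
# ((H^T H)^{-1} H^T is the pseudoinverse of H).
x_hat = np.linalg.inv(H.T @ H) @ H.T @ z

# For H = column of ones, this is the sample mean, z.mean() = 2.0 here.
```

In practice `np.linalg.lstsq(H, z, rcond=None)` computes the same estimate without forming the inverse explicitly.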
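The weighted estimator \hat{x} = (H^T R^{-1} H)^{-1} H^T R^{-1} z can be sketched the same way; for the scalar jelly-bean case it reduces to the weighted mean \sum a_{ii} z_i / \sum a_{ii}. The standard deviations below are assumed values for illustration:

```python
import numpy as np

z = np.array([2.1, 1.9, 2.0, 2.2, 1.8])
H = np.ones((len(z), 1))

# Assumed measurement-noise standard deviations: sigma_i large -> z_i less trusted.
# R = diag(sigma_i^2), so R^{-1} = A with a_ii = 1/sigma_i^2.
sigma = np.array([0.1, 0.5, 0.1, 0.5, 0.1])
R_inv = np.diag(1.0 / sigma**2)

# Weighted least-squares estimate: x_hat = (H^T R^{-1} H)^{-1} H^T R^{-1} z
x_hat_w = np.linalg.inv(H.T @ R_inv @ H) @ H.T @ R_inv @ z

# Equivalent to the weighted mean: sum(a_ii * z_i) / sum(a_ii)
```

Note that the noisier measurements (sigma = 0.5) pull the estimate toward the more accurate ones, unlike the unweighted average.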
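The preview breaks off before the recursive update equations, but the recursive idea described above (correct a prior optimal estimate incrementally as each new measurement arrives) can be sketched with the standard recursive least-squares correction. The initialization below (zero prior estimate, large prior covariance P) is an assumption for illustration, not taken from the slides:

```python
import numpy as np

# Scalar measurements z_i = H_i x + n_i, with assumed variances sigma2_i.
z = np.array([2.1, 1.9, 2.0, 2.2, 1.8])
sigma2 = np.array([0.01, 0.25, 0.01, 0.25, 0.01])

x_hat = np.zeros(1)      # prior estimate (assumed)
P = np.eye(1) * 1e6      # large prior uncertainty -> prior carries little weight

for zi, ri in zip(z, sigma2):
    Hi = np.ones((1, 1))                     # scalar measurement of scalar x
    # Gain trades prior uncertainty P against measurement variance ri
    K = P @ Hi.T @ np.linalg.inv(Hi @ P @ Hi.T + ri * np.eye(1))
    # Incremental correction of the prior estimate by the new residual
    x_hat = x_hat + K @ (np.array([zi]) - Hi @ x_hat)
    P = (np.eye(1) - K @ Hi) @ P             # uncertainty shrinks with each update
```

With a diffuse prior, the recursive result converges to the batch weighted estimate, which is the point of the recursive formulation: the same optimum, without reprocessing old data.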

