EE363 Winter 2008-09   Lecture 9

The Extended Kalman filter

• Nonlinear filtering
• Extended Kalman filter
• Linearization and random variables


Nonlinear filtering

• nonlinear Markov model:

      x_{t+1} = f(x_t, w_t),   y_t = g(x_t, v_t)

  – f is the (possibly nonlinear) dynamics function
  – g is the (possibly nonlinear) measurement or output function
  – w_0, w_1, …, v_0, v_1, … are independent
  – even if w, v are Gaussian, x and y need not be

• nonlinear filtering problem: find, e.g.,

      x̂_{t|t−1} = E(x_t | y_0, …, y_{t−1}),   x̂_{t|t} = E(x_t | y_0, …, y_t)

• the general nonlinear filtering solution involves a PDE, and is not practical


Extended Kalman filter

• the extended Kalman filter (EKF) is a heuristic for the nonlinear filtering problem
• it often works well (when tuned properly), but sometimes not
• widely used in practice
• based on
  – linearizing the dynamics and output functions at the current estimate
  – propagating an approximation of the conditional expectation and covariance


Linearization and random variables

• consider φ : R^n → R^m
• suppose E x = x̄, E(x − x̄)(x − x̄)^T = Σ_x, and y = φ(x)
• if Σ_x is small and φ is not too nonlinear,

      y ≈ ỹ = φ(x̄) + Dφ(x̄)(x − x̄)

• this gives an approximation for the mean and covariance of a nonlinear function of a random variable:

      ȳ ≈ φ(x̄),   Σ_y ≈ Dφ(x̄) Σ_x Dφ(x̄)^T

• if Σ_x is not small compared to the 'curvature' of φ, these estimates are poor

• a good estimate can be found by Monte Carlo simulation:

      ȳ ≈ ȳ_mc = (1/N) Σ_{i=1}^N φ(x^{(i)})

      Σ_y ≈ (1/N) Σ_{i=1}^N (φ(x^{(i)}) − ȳ_mc)(φ(x^{(i)}) − ȳ_mc)^T

  where x^{(1)}, …
, x^{(N)} are samples from the distribution of x, and N is large

• another method: use the Monte Carlo formulas with a small number of nonrandom samples chosen as 'typical', e.g., the 90% confidence ellipsoid semi-axis endpoints

      x^{(i)} = x̄ ± β v_i,   where Σ_x = V Λ V^T and the v_i are the columns of V


Example

x ~ N(0, 1),   y = exp(x)

(for this case we can compute the mean and variance of y exactly)

                                          ȳ                       σ_y
  exact values                    e^{1/2} = 1.649    (e^2 − e)^{1/2} = 2.161
  linearization                   1.000               1.000
  Monte Carlo (N = 10)            1.385               1.068
  Monte Carlo (N = 100)           1.430               1.776
  sigma points (x = x̄, x̄ ± 1.5σ_x)  1.902             2.268


Extended Kalman filter

• initialization: x̂_{0|−1} = x̄_0,   Σ_{0|−1} = Σ_0

• measurement update
  – linearize the output function at x = x̂_{t|t−1}:

        C = (∂g/∂x)(x̂_{t|t−1}, 0)

        V = (∂g/∂v)(x̂_{t|t−1}, 0) Σ_v (∂g/∂v)(x̂_{t|t−1}, 0)^T

  – measurement update based on the linearization:

        x̂_{t|t} = x̂_{t|t−1} + Σ_{t|t−1} C^T (C Σ_{t|t−1} C^T + V)^{−1} (y_t − g(x̂_{t|t−1}, 0))

        Σ_{t|t} = Σ_{t|t−1} − Σ_{t|t−1} C^T (C Σ_{t|t−1} C^T + V)^{−1} C Σ_{t|t−1}

• time update
  – linearize the dynamics function at x = x̂_{t|t}:

        A = (∂f/∂x)(x̂_{t|t}, 0)

        W = (∂f/∂w)(x̂_{t|t}, 0) Σ_w (∂f/∂w)(x̂_{t|t}, 0)^T

  – time update based on the linearization:

        x̂_{t+1|t} = f(x̂_{t|t}, 0),   Σ_{t+1|t} = A Σ_{t|t} A^T + W

• replacing the linearization with Monte Carlo yields the particle filter
• replacing the linearization with sigma-point estimates yields the unscented Kalman filter (UKF)


Example

• p_t, u_t ∈ R^2 are the position and velocity of a vehicle, with (p_0, u_0) ~ N(0, I)

• vehicle dynamics:

      p_{t+1} = p_t + 0.1 u_t,   u_{t+1} = [ 0.85  0.15 ; −0.1  0.85 ] u_t + w_t

  where the w_t are IID N(0, I)

• measurements: noisy measurements of the distance to 9 points p^i ∈ R^2:

      (y_t)_i = ‖p_t − p^i‖ + (v_t)_i,   i = 1, …
, 9, where the (v_t)_i are IID N(0, 0.3^2)


EKF results

• EKF initialized with x̂_{0|−1} = 0, Σ_{0|−1} = I, where x = (p, u)

• [figure: the p^i shown as stars, p_t as a dotted curve, and p̂_{t|t} as a solid curve, in the (p_1, p_2) plane]

Current position estimation error

[figure: ‖p̂_{t|t} − p_t‖ versus t]

Current position estimation predicted error

[figure: ((Σ_{t|t})_{11} + (Σ_{t|t})_{22})^{1/2} versus t]
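The exact, linearization, and Monte Carlo values in the y = exp(x) example can be checked numerically. A minimal sketch in Python/NumPy (not from the slides; the seed and sample size are arbitrary choices):

```python
import numpy as np

# Linearization vs. Monte Carlo for y = exp(x), x ~ N(0, 1).
# Exact values: E y = e^{1/2}, std(y) = (e^2 - e)^{1/2}.
xbar, sigma_x = 0.0, 1.0
phi = np.exp
dphi = np.exp  # derivative of exp is exp

y_exact_mean = np.exp(0.5)              # = 1.6487...
y_exact_std = np.sqrt(np.e**2 - np.e)   # = 2.1612...

# linearization: ybar ~ phi(xbar), sigma_y ~ |Dphi(xbar)| sigma_x
y_lin_mean = phi(xbar)             # 1.000
y_lin_std = dphi(xbar) * sigma_x   # 1.000

# Monte Carlo estimate with a large N
rng = np.random.default_rng(0)
ys = phi(rng.normal(xbar, sigma_x, 100_000))
y_mc_mean, y_mc_std = ys.mean(), ys.std()

print(f"exact:         {y_exact_mean:.3f}  {y_exact_std:.3f}")
print(f"linearization: {y_lin_mean:.3f}  {y_lin_std:.3f}")
print(f"Monte Carlo:   {y_mc_mean:.3f}  {y_mc_std:.3f}")
```

As on the slide, the linearized estimates are far off because Σ_x is not small compared to the curvature of exp, while Monte Carlo with large N approaches the exact values.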
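The vehicle example can be reproduced with a short EKF simulation. A sketch under stated assumptions: the slides plot 9 landmarks but do not list their coordinates, so the positions below are made up, as are the seed and horizon. Since the dynamics are linear, only the measurement function g needs linearizing:

```python
import numpy as np

rng = np.random.default_rng(1)

# Vehicle model from the example: state x = (p, u) in R^4.
# p_{t+1} = p_t + 0.1 u_t,  u_{t+1} = M u_t + w_t,  w_t ~ N(0, I)
M = np.array([[0.85, 0.15], [-0.1, 0.85]])
A = np.block([[np.eye(2), 0.1 * np.eye(2)],
              [np.zeros((2, 2)), M]])
W = np.block([[np.zeros((2, 2)), np.zeros((2, 2))],
              [np.zeros((2, 2)), np.eye(2)]])  # process noise enters velocity only

# 9 landmark positions p^i (assumed, not given on the slides)
landmarks = np.array([[i, j] for i in (-2.0, 0.0, 2.0) for j in (1.0, 4.0, 7.0)])
sigma_v = 0.3
V = sigma_v**2 * np.eye(9)

def g(x):
    """Measurements: distances from position p = x[:2] to each landmark."""
    return np.linalg.norm(landmarks - x[:2], axis=1)

def Dg(x):
    """Jacobian of g: rows are unit vectors from each landmark to p, zeros wrt u."""
    d = x[:2] - landmarks                          # shape (9, 2)
    r = np.linalg.norm(d, axis=1, keepdims=True)
    return np.hstack([d / r, np.zeros((9, 2))])    # shape (9, 4)

# simulate the true trajectory and run the EKF
T = 100
x = rng.normal(size=4)                 # (p_0, u_0) ~ N(0, I)
xhat, Sigma = np.zeros(4), np.eye(4)   # xhat_{0|-1} = 0, Sigma_{0|-1} = I
errors = []
for t in range(T):
    y = g(x) + sigma_v * rng.normal(size=9)
    # measurement update, linearizing g at xhat_{t|t-1}
    C = Dg(xhat)
    K = Sigma @ C.T @ np.linalg.inv(C @ Sigma @ C.T + V)
    xhat = xhat + K @ (y - g(xhat))
    Sigma = Sigma - K @ C @ Sigma
    errors.append(np.linalg.norm(xhat[:2] - x[:2]))
    # time update (dynamics are linear, so this is exact)
    xhat = A @ xhat
    Sigma = A @ Sigma @ A.T + W
    x = A @ x + np.concatenate([np.zeros(2), rng.normal(size=2)])

print(f"mean position error: {np.mean(errors):.3f}")
```

With 9 range measurements per step and 0.3 measurement noise, the position error drops quickly from its prior level and stays small, matching the qualitative behavior in the EKF results plots.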