
Kalman Filters & Switching Kalman Filter
Graphical Models – 10708
Carlos Guestrin, Carnegie Mellon University
November 20th, 2006
Readings: K&F 4.5, 12.2, 12.3, 12.4

Contents:
- Kalman filter: The Kalman Filter; Multivariate Gaussian; Conditioning a Gaussian; Gaussian is a "Linear Model"; Conditional Linear Gaussian (CLG) – general case; Understanding a linear Gaussian – the 2d case; Tracking with a Gaussian; Making observations; Operations in Kalman filter; Exponential family representation of Gaussian: Canonical Form; Conditioning in canonical form; Prediction & roll-up in canonical form
- Non-linear observations: What if observations are not CLG?; Linearization: incorporating non-linear evidence; Linearization as integration; Linearization as numerical integration; Operations in non-linear Kalman filter; What you need to know about Kalman Filters
- Switching Kalman filter: What if the person chooses different motion models?; The moonwalk; Switching Kalman filter; Inference in switching KF – one step; Multi-step inference; Visualizing growth in number of Gaussians; Computational complexity of inference in switching Kalman filters; Bounding number of Gaussians; Collapsing Gaussians – single Gaussian from a mixture; Collapsing a mixture of Gaussians into a smaller mixture; Operations in non-linear switching Kalman filter; Assumed density filtering
- Beyond a single Gaussian: When the non-linear KF is not good enough; Reparameterized KF for SLAT; When a single Gaussian ain't good enough; Approximating the non-linear KF with a mixture of Gaussians; What you need to know

Slide 3: The Kalman Filter
- An HMM with Gaussian distributions
- Has been around for at least 50 years; possibly the most used graphical model ever
- It's what does your cruise control, tracks missiles, controls robots, ...
- And it's so simple, which possibly explains why it's so widely used
- Many interesting models build on it; it is an example of a Gaussian BN (more on this later)

Slide 6: Multivariate Gaussian
- Parameterized by a mean vector μ and a covariance matrix Σ

Slide 7: Conditioning a Gaussian
- Joint Gaussian: p(X, Y) ~ N(μ; Σ)
- Conditional linear Gaussian: p(Y | X) ~ N(μ_{Y|X}; σ²)

Slide 8: Gaussian is a "Linear Model"
- Conditional linear Gaussian: p(Y | X) ~ N(β₀ + βX; σ²)

Slide 9: Conditioning a Gaussian
- Joint Gaussian: p(X, Y) ~ N(μ; Σ)
- Conditional linear Gaussian: p(Y | X) ~ N(μ_{Y|X}; Σ_{YY|X})

Slide 10: Conditional Linear Gaussian (CLG) – general case
- Conditional linear Gaussian: p(Y | X) ~ N(β₀ + BX; Σ_{YY|X})

Slide 11: Understanding a linear Gaussian – the 2d case
- Variance increases over time (motion noise adds up)
- The object doesn't necessarily move in a straight line

Slide 12: Tracking with a Gaussian 1
- Prior: p(X₀) ~ N(μ₀, Σ₀)
- Transition model: p(X_{i+1} | X_i) ~ N(B X_i + β; Σ_{X_{i+1}|X_i})

Slide 13: Tracking with Gaussians 2 – Making observations
- We have p(X_i); the detector observes O_i = o_i; we want to compute p(X_i | O_i = o_i)
- Use Bayes' rule; this requires a CLG observation model p(O_i | X_i) ~ N(W X_i + v; Σ_{O_i|X_i})

Slide 14: Operations in Kalman filter
- Start with p(X₀); at each time step t:
  - Condition on the observation
  - Prediction (multiply in the transition model)
  - Roll-up (marginalize out the previous time step)
- This is one implementation of the KF; there are others, e.g., the information filter
- (Diagram: a DBN over states X₁, ..., X₅ with observations O₁ = o₁, ..., O₅ = o₅)

Slide 15: Exponential family representation of Gaussian: Canonical Form

Slide 16: Canonical form
- The standard and canonical forms are related
- Conditioning is easy in canonical form; marginalization is easy in standard form

Slide 17: Conditioning in canonical form
- First multiply; then condition on the value B = y

Slide 18: Operations in Kalman filter (recap of slide 14)

Slide 19: Prediction & roll-up in canonical form
- First multiply; then marginalize out X_t

Slide 20: Announcements
- Lectures for the rest of the semester:
  - Special time: Monday Nov 27, 5:30–7pm, Wean 4615A: Dynamic BNs
  - Wed 11/30, regular class time: Causality (Richard Scheines)
  - Friday 12/1, regular class time: finish Dynamic BNs & overview of advanced topics
- Deadlines & presentations:
  - Project poster presentations: Dec 1st, 3–6pm (NSH Atrium); popular vote for best poster
  - Project write-up: Dec 8th by 2pm, by email; 8-page limit, strictly enforced
  - Final: out Dec 1st, due Dec 15th by 2pm (strict deadline)

Slide 21: What if observations are not CLG?
- Often observations are not CLG; the model is CLG only if O_i = β X_i + β₀ + ε
- Consider a motion detector: O_i = 1 if the person is likely to be in the region
- The posterior is then not Gaussian

Slide 22: Linearization: incorporating non-linear evidence
- p(O_i | X_i) is not CLG, but we can find a Gaussian approximation of p(X_i, O_i) = p(X_i) p(O_i | X_i)
- Instantiate the evidence O_i = o_i and obtain a Gaussian for p(X_i | O_i = o_i)
- Why do we hope this would be any good? Locally, a Gaussian may be OK

Slide 23: Linearization as integration
- The Gaussian approximation of p(X_i, O_i) = p(X_i) p(O_i | X_i) requires computing the moments E[O_i], E[O_i²], and E[O_i X_i]
- Note: each integral is the product of a Gaussian with an arbitrary function

Slide 24: Linearization as numerical integration
- Integrating the product of a Gaussian with an arbitrary function is done effectively with Gaussian quadrature methods
- Approximate the integral as a weighted sum over integration points; Gaussian quadrature defines the locations of the points and their weights
- Exact if the arbitrary function is a polynomial of bounded degree
- The number of integration points is exponential in the number of dimensions d; exactness on monomials requires exponentially fewer points
- With 2d+1 points, this method is equivalent to the unscented Kalman filter, and it generalizes to many more points

Slide 25: Operations in non-linear Kalman filter
- Start with p(X₀); at each time step t:
  - Condition on the observation (using numerical integration)
  - Prediction (multiply in the transition model, using numerical integration)
  - Roll-up (marginalize out the previous time step)

Slide 26: What you need to know about Kalman Filters
- Kalman filter: probably the most used BN; assumes Gaussian distributions; equivalent to a linear system; computations are simple matrix operations
- Non-linear Kalman filter: often the observation or motion model is not CLG; use numerical integration to find a Gaussian approximation

Slide 27: What if the person chooses different motion models?
- With some probability, the person moves more or less...
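
The Gaussian conditioning operation the slides describe (slides 7–10) follows the standard block formulas μ_{Y|X} = μ_Y + Σ_{YX} Σ_{XX}⁻¹ (x − μ_X) and Σ_{YY|X} = Σ_{YY} − Σ_{YX} Σ_{XX}⁻¹ Σ_{XY}. A minimal numpy sketch (the function name and index-list interface are mine, not from the slides):

```python
import numpy as np

def condition_gaussian(mu, Sigma, x, idx_x, idx_y):
    """Condition a joint Gaussian N(mu, Sigma) on X = x.

    Returns the mean and covariance of p(Y | X = x):
      mu_{Y|X}     = mu_Y + Sigma_YX Sigma_XX^{-1} (x - mu_X)
      Sigma_{YY|X} = Sigma_YY - Sigma_YX Sigma_XX^{-1} Sigma_XY
    """
    mu_x, mu_y = mu[idx_x], mu[idx_y]
    Sxx = Sigma[np.ix_(idx_x, idx_x)]
    Syx = Sigma[np.ix_(idx_y, idx_x)]
    Syy = Sigma[np.ix_(idx_y, idx_y)]
    K = Syx @ np.linalg.inv(Sxx)      # regression coefficients (the B of the CLG)
    mu_cond = mu_y + K @ (x - mu_x)
    Sigma_cond = Syy - K @ Syx.T
    return mu_cond, Sigma_cond
```

Note that the resulting p(Y | X = x) has a mean linear in x and a covariance that does not depend on x at all, which is exactly the CLG form p(Y | X) ~ N(β₀ + BX; Σ_{YY|X}).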
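
The condition / predict / roll-up loop of slides 12–14 can be written in moment (standard) form as one step of the classic Kalman recursion. A hedged sketch, assuming the CLG transition model N(B X_t + β; Q) and observation model N(W X_t + v; R) from the slides (argument names are mine):

```python
import numpy as np

def kalman_step(mu, Sigma, o, B, beta, Q, W, v, R):
    """One Kalman filter step on the belief N(mu, Sigma).

    Prediction + roll-up: push the belief through X_{t+1} ~ N(B X_t + beta, Q),
    marginalizing out the previous time step.
    Conditioning: incorporate the observation O ~ N(W X + v, R) via Bayes rule,
    implemented with the Kalman gain.
    """
    # Prediction & roll-up
    mu_pred = B @ mu + beta
    S_pred = B @ Sigma @ B.T + Q
    # Condition on the observation o
    S_o = W @ S_pred @ W.T + R              # predicted observation covariance
    K = S_pred @ W.T @ np.linalg.inv(S_o)   # Kalman gain
    mu_new = mu_pred + K @ (o - (W @ mu_pred + v))
    S_new = S_pred - K @ W @ S_pred
    return mu_new, S_new
```

As the slides emphasize, everything is simple matrix operations; repeating `kalman_step` over t gives the filtered posterior at each time step.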
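
For the canonical (information) form of slides 15–17: a Gaussian N(μ, Σ) is represented by K = Σ⁻¹ and h = Σ⁻¹μ, and conditioning becomes trivial: to fix a block of variables to a value y, keep the sub-block of K for the remaining variables and shift h. A small sketch under those standard conversions (function names are mine):

```python
import numpy as np

def to_canonical(mu, Sigma):
    """Standard form -> canonical form: K = Sigma^{-1}, h = Sigma^{-1} mu."""
    K = np.linalg.inv(Sigma)
    return K, K @ mu

def condition_canonical(K, h, idx_a, idx_b, y):
    """Condition on X_b = y in canonical form: keep the A block of K
    and shift the potential vector, h_A' = h_A - K_AB y."""
    K_aa = K[np.ix_(idx_a, idx_a)]
    K_ab = K[np.ix_(idx_a, idx_b)]
    return K_aa, h[idx_a] - K_ab @ y
```

This is why the slides note that conditioning is easy in canonical form (a block selection and a subtraction), while marginalization is easy in standard form; multiplying two canonical-form Gaussians is just adding their (K, h) parameters.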
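
The integrals needed for linearization (slides 23–24) are all expectations of an arbitrary function under a Gaussian, which is what Gaussian quadrature handles. A one-dimensional sketch using Gauss-Hermite quadrature, one instance of the quadrature family the slides refer to (the function name is mine):

```python
import numpy as np

def gauss_expectation(g, m, s, n=5):
    """Approximate E[g(X)] for X ~ N(m, s^2) via n-point Gauss-Hermite
    quadrature: a weighted sum of g at fixed integration points.
    Exact when g is a polynomial of degree <= 2n - 1."""
    x, w = np.polynomial.hermite.hermgauss(n)       # nodes and weights
    # Change of variables from the Hermite weight exp(-x^2) to N(m, s^2)
    return np.sum(w * g(m + np.sqrt(2.0) * s * x)) / np.sqrt(np.pi)
```

Applying this to g(x) = h(x), h(x)², and x·h(x) for a non-linear observation function h yields the moments E[O_i], E[O_i²], E[O_i X_i] needed for the Gaussian approximation of p(X_i, O_i); the multi-dimensional analogue with 2d+1 well-chosen points is what the slides identify with the unscented Kalman filter.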
