CMU CS 10708 - Dynamic models 1



Dynamic models 1: Kalman filters, linearization, switching KFs, assumed density filters
Probabilistic Graphical Models – 10708
Carlos Guestrin, Carnegie Mellon University
November 16th, 2005

Contents: Announcement; Adventures of our BN hero; The Kalman filter; Example of KF – SLAT (Simultaneous Localization and Tracking); Multivariate Gaussian; Conditioning a Gaussian; Gaussian is a "linear model"; Conditional linear Gaussian (CLG) – general case; Understanding a linear Gaussian – the 2d case; Tracking with a Gaussian; Tracking with Gaussians – making observations; Operations in the Kalman filter; Canonical form; Conditioning in canonical form; Prediction & roll-up in canonical form; What if observations are not CLG?; Linearization: incorporating non-linear evidence; Linearization as integration; Linearization as numerical integration; Operations in the non-linear Kalman filter; What if the person chooses different motion models? (the moonwalk); Switching Kalman filter; Inference in a switching KF – one step; Multi-step inference; Visualizing the growth in the number of Gaussians; Computational complexity of inference in switching Kalman filters; Bounding the number of Gaussians; Collapsing Gaussians – a single Gaussian from a mixture; Collapsing a mixture of Gaussians into a smaller mixture; Operations in the non-linear switching Kalman filter; Assumed density filtering; When the non-linear KF is not good enough; Reparameterized KF for SLAT; When a single Gaussian ain't good enough; Approximating the non-linear KF with a mixture of Gaussians; What you need to know. Readings: Koller & Friedman, Chapter 16; Jordan, Chapters 13 and 15; Uri Lerner's thesis, Chapters 3 and 9.

Announcement
- Special recitation lectures: Pradeep will give two special lectures, Nov. 22 & Dec. 1, 5-6pm, during recitation.
- Covering variational methods, loopy BP, and their relationship. Don't miss them!

Adventures of our BN hero
- Compact representation for probability distributions
- Fast inference
- Fast learning
- Approximate inference
- But... who are the most popular kids? 1. Naïve Bayes; 2 and 3. Hidden Markov models (HMMs) and Kalman filters

The Kalman filter
- An HMM with Gaussian distributions
- Has been around for at least 50 years
- Possibly the most used graphical model ever
- It's what runs your cruise control, tracks missiles, controls robots, ...
- And it's so simple... which possibly explains why it's used so much
- Many interesting models build on it
- Review and extensions today

Example of KF – SLAT: Simultaneous Localization and Tracking [Funiak, Guestrin, Paskin, Sukthankar '05]
- Place some cameras around an environment; we don't know where they are
- We could measure all the locations directly, but that requires lots of grad student (Stano) time
- Intuition: a person walks around; if camera 1 sees the person and then camera 2 sees the person, we learn about the relative positions of the cameras

Multivariate Gaussian
- Parameterized by a mean vector µ and a covariance matrix Σ

Conditioning a Gaussian (scalar case)
- Joint Gaussian: p(X,Y) ~ N(µ; Σ)
- Conditional linear Gaussian: p(Y|X) ~ N(µ_{Y|X}; σ^2)

Gaussian is a "linear model"
- Conditional linear Gaussian: p(Y|X) ~ N(β_0 + βX; σ^2)

Conditioning a Gaussian (vector case)
- Joint Gaussian: p(X,Y) ~ N(µ; Σ)
- Conditional linear Gaussian: p(Y|X) ~ N(µ_{Y|X}; Σ_{YY|X}), with µ_{Y|X} = µ_Y + Σ_{YX} Σ_{XX}^{-1} (x - µ_X) and Σ_{YY|X} = Σ_{YY} - Σ_{YX} Σ_{XX}^{-1} Σ_{XY}

Conditional linear Gaussian (CLG) – general case
- Conditional linear Gaussian: p(Y|X) ~ N(β_0 + B X; Σ_{YY|X})
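The conditioning rule above is easy to exercise numerically. Here is a minimal NumPy sketch (my own illustration, not from the slides; the function and variable names are assumptions) that conditions a joint Gaussian over (X, Y) on an observed value of X using the standard formulas.

```python
import numpy as np

def condition_gaussian(mu, Sigma, x_obs, nx):
    """Condition a joint Gaussian N(mu, Sigma) over (X, Y) on X = x_obs.

    The first nx entries of mu/Sigma correspond to X, the rest to Y.
    Returns the mean and covariance of p(Y | X = x_obs).
    """
    mu_x, mu_y = mu[:nx], mu[nx:]
    Sxx, Sxy = Sigma[:nx, :nx], Sigma[:nx, nx:]
    Syx, Syy = Sigma[nx:, :nx], Sigma[nx:, nx:]

    # gain = Sigma_YX Sigma_XX^{-1}, computed via a linear solve.
    gain = Syx @ np.linalg.solve(Sxx, np.eye(nx))
    mu_y_given_x = mu_y + gain @ (x_obs - mu_x)
    Sigma_y_given_x = Syy - gain @ Sxy
    return mu_y_given_x, Sigma_y_given_x

# Tiny 2-d example (made-up numbers): X and Y jointly Gaussian, correlated.
mu = np.array([0.0, 1.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 2.0]])
m, S = condition_gaussian(mu, Sigma, x_obs=np.array([0.5]), nx=1)
print(m, S)   # mean shifts toward the observation, variance shrinks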
Understanding a linear Gaussian – the 2d case
- Variance increases over time (motion noise adds up)
- The object doesn't necessarily move in a straight line

Tracking with a Gaussian 1
- Prior: p(X_0) ~ N(µ_0, Σ_0)
- Transition model: p(X_{i+1} | X_i) ~ N(B X_i + β; Σ_{X_{i+1}|X_i})

Tracking with Gaussians 2 – making observations
- We have p(X_i); the detector observes O_i = o_i; we want to compute p(X_i | O_i = o_i)
- Use Bayes rule; this requires a CLG observation model: p(O_i | X_i) ~ N(W X_i + v; Σ_{O_i|X_i})

Operations in the Kalman filter (see the predict/update sketch after these slides)
- Compute the belief p(X_t | o_1, ..., o_t), starting from the prior p(X_0)
- At each time step t: condition on the observation; prediction (multiply in the transition model); roll-up (marginalize out the previous time step)
- I'll describe one implementation of the KF; there are others, e.g. the information filter
- (Slide figure: the chain X_1, ..., X_5 with observations O_1 = o_1, ..., O_5 = o_5.)

Canonical form
- The standard form and the canonical form are related
- Conditioning is easy in canonical form; marginalization is easy in standard form (see the canonical-form sketch below)

Conditioning in canonical form
- First multiply the factors; then condition on the value B = y

Prediction & roll-up in canonical form
- First multiply in the transition factor; then marginalize out X_t

What if observations are not CLG?
- Often observations are not CLG; CLG means O_i = B X_i + β_0 + ε
- Consider a motion detector: O_i = 1 if the person is likely to be in the region
- The posterior is then not Gaussian

Linearization: incorporating non-linear evidence
- p(O_i | X_i) is not CLG, but...
- Find a Gaussian approximation of p(X_i, O_i) = p(X_i) p(O_i | X_i)
- Instantiate the evidence O_i = o_i and obtain a Gaussian for p(X_i | O_i = o_i)
- Why do we hope this would be any good? Locally, a Gaussian may be OK

Linearization as integration
- Gaussian approximation of p(X_i, O_i) = p(X_i) p(O_i | X_i)
- Need to compute the moments E[O_i], E[O_i^2], E[O_i X_i]
- Note: each integral is the product of a Gaussian with an arbitrary function

Linearization as numerical integration (see the sigma-point sketch below)
- The product of a Gaussian with an arbitrary function can be integrated effectively with Gaussian quadrature
- Approximate the integral as a weighted sum over integration points; Gaussian quadrature defines the locations of the points and the weights
- Exact if the arbitrary function is a polynomial of bounded degree
- The number of integration points is exponential in the number of dimensions d; exactness on monomials requires exponentially fewer points
- With 2d+1 points, this method is equivalent to the unscented Kalman filter; it generalizes to many more points

Operations in the non-linear Kalman filter
- Compute the belief p(X_t | o_1, ..., o_t), starting from the prior p(X_0)
- At each time step t: condition on the observation (using numerical integration); prediction (multiply in the transition model); ...
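As a companion to the "Tracking with Gaussians" and "Operations in the Kalman filter" slides, here is one standard-form Kalman filter step in NumPy. It is a sketch, not the lecture's implementation: B, β, W, v follow the slide notation, while Q and R (the transition and observation noise covariances), the function names, and the example numbers are my own assumptions.

```python
import numpy as np

def kf_step(mu, Sigma, o, B, beta, Q, W, v, R):
    """One Kalman-filter step, following the slide ordering:
    condition on the current observation, then prediction + roll-up.

    Transition model:  p(X_{i+1} | X_i) = N(B x_i + beta, Q)
    Observation model: p(O_i | X_i)     = N(W x_i + v,    R)
    """
    # Condition on the observation o_i (measurement update, gain form).
    S = W @ Sigma @ W.T + R                      # innovation covariance
    K = Sigma @ W.T @ np.linalg.inv(S)           # Kalman gain
    mu_post = mu + K @ (o - (W @ mu + v))
    Sigma_post = (np.eye(len(mu)) - K @ W) @ Sigma

    # Prediction + roll-up: multiply in the transition model and
    # marginalize out the previous time step.
    mu_next = B @ mu_post + beta
    Sigma_next = B @ Sigma_post @ B.T + Q
    return mu_next, Sigma_next

# Illustrative 1-d random walk with noisy position measurements.
B, beta, Q = np.eye(1), np.zeros(1), 0.1 * np.eye(1)
W, v, R = np.eye(1), np.zeros(1), 0.5 * np.eye(1)
mu, Sigma = np.zeros(1), np.eye(1)
for o in [0.9, 1.1, 1.0]:
    mu, Sigma = kf_step(mu, Sigma, np.array([o]), B, beta, Q, W, v, R)
print(mu, Sigma)
```

The gain-form update shown here is the textbook moment-form implementation; the slides note that other implementations (e.g. the information filter) exist.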

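To see why "conditioning is easy in canonical form; marginalization is easy in standard form", here is a small sketch (assumed code, not from the lecture) using the information parameterization K = Σ^{-1}, h = Σ^{-1} µ: conditioning needs only a sub-block of K and one matrix-vector product, while marginalizing requires inverting a block (a Schur complement).

```python
import numpy as np

def to_canonical(mu, Sigma):
    """Standard (moment) form -> canonical (information) form."""
    K = np.linalg.inv(Sigma)
    return K, K @ mu

def condition_canonical(K, h, idx_keep, idx_obs, y):
    """Condition on X[idx_obs] = y; cheap: sub-block plus one product."""
    K_new = K[np.ix_(idx_keep, idx_keep)]
    h_new = h[idx_keep] - K[np.ix_(idx_keep, idx_obs)] @ y
    return K_new, h_new

def marginalize_canonical(K, h, idx_keep, idx_out):
    """Marginalize out X[idx_out]; needs a Schur complement (a block inverse)."""
    Kaa = K[np.ix_(idx_keep, idx_keep)]
    Kab = K[np.ix_(idx_keep, idx_out)]
    Kbb_inv = np.linalg.inv(K[np.ix_(idx_out, idx_out)])
    K_new = Kaa - Kab @ Kbb_inv @ Kab.T
    h_new = h[idx_keep] - Kab @ Kbb_inv @ h[idx_out]
    return K_new, h_new

# Same 2-d joint Gaussian as in the conditioning sketch above.
mu = np.array([0.0, 1.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 2.0]])
K, h = to_canonical(mu, Sigma)
Kc, hc = condition_canonical(K, h, idx_keep=[1], idx_obs=[0], y=np.array([0.5]))
print(np.linalg.inv(Kc), np.linalg.solve(Kc, hc))  # matches moment-form conditioning
Km, hm = marginalize_canonical(K, h, idx_keep=[1], idx_out=[0])
print(np.linalg.inv(Km), np.linalg.solve(Km, hm))  # recovers the marginal N(1.0, 2.0)
```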

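The "Linearization as numerical integration" slide notes that with 2d+1 integration points the approach matches the unscented Kalman filter. The sketch below illustrates that flavor of the idea (my own code, with a hypothetical range-sensor g and made-up numbers, not the lecture's method): generate 2d+1 sigma points for the current Gaussian belief, push them through the non-linear observation function, estimate E[O], Var[O], and Cov[X, O], and condition the resulting joint Gaussian on the observed value.

```python
import numpy as np

def sigma_points(mu, Sigma, kappa=1.0):
    """2d+1 sigma points and weights for N(mu, Sigma) (unscented transform)."""
    d = len(mu)
    L = np.linalg.cholesky((d + kappa) * Sigma)
    pts = [mu] + [mu + L[:, i] for i in range(d)] + [mu - L[:, i] for i in range(d)]
    w = np.full(2 * d + 1, 1.0 / (2 * (d + kappa)))
    w[0] = kappa / (d + kappa)
    return np.array(pts), w

def condition_on_nonlinear_obs(mu, Sigma, g, obs_var, o):
    """Gaussian-approximate p(X, O) with scalar O = g(X) + noise, condition on O = o."""
    pts, w = sigma_points(mu, Sigma)
    gx = np.array([g(p) for p in pts])           # g evaluated at each sigma point
    o_mean = w @ gx                              # E[O]
    o_var = w @ (gx - o_mean) ** 2 + obs_var     # Var[O] (plus observation noise)
    xo_cov = (w * (gx - o_mean)) @ (pts - mu)    # Cov[X, O]
    # Condition the joint Gaussian over (X, O) on O = o.
    gain = xo_cov / o_var
    mu_post = mu + gain * (o - o_mean)
    Sigma_post = Sigma - np.outer(gain, xo_cov)
    return mu_post, Sigma_post

# Hypothetical non-linear sensor: distance of a 2-d position from the origin.
g = lambda x: np.linalg.norm(x)
mu, Sigma = np.array([1.0, 1.0]), 0.5 * np.eye(2)
print(condition_on_nonlinear_obs(mu, Sigma, g, obs_var=0.1, o=1.2))
```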