CMU CS 10708 - Kalman Filters Gaussian MNs

Slide 1: Kalman Filters & Gaussian MNs
Graphical Models – 10708
Carlos Guestrin, Carnegie Mellon University
December 1st, 2008
Readings: K&F 6.1, 6.2, 6.3, 14.1, 14.2, 14.3, 14.4

Slide 2: Multivariate Gaussian
- Mean vector μ
- Covariance matrix Σ

Slide 3: Conditioning a Gaussian
- Joint Gaussian: p(X, Y) ~ N(μ; Σ)
- Conditional linear Gaussian: p(Y | X) ~ N(μ_{Y|X}; σ²_{Y|X})

Slide 4: Gaussian is a "Linear Model"
- Conditional linear Gaussian: p(Y | X) ~ N(β₀ + βX; σ²)

Slide 5: Conditioning a Gaussian
- Joint Gaussian: p(X, Y) ~ N(μ; Σ)
- Conditional linear Gaussian: p(Y | X) ~ N(μ_{Y|X}; Σ_{YY|X})

Slide 6: Conditional Linear Gaussian (CLG) – general case
- Conditional linear Gaussian: p(Y | X) ~ N(β₀ + B X; Σ_{YY|X})

Slide 7: Understanding a linear Gaussian – the 2d case
- Variance increases over time (motion noise adds up)
- The object doesn't necessarily move in a straight line

Slide 8: Tracking with a Gaussian 1
- Prior: p(X₀) ~ N(μ₀; Σ₀)
- Transition model: p(X_{i+1} | X_i) ~ N(B X_i + β; Σ_{X_{i+1}|X_i})

Slide 9: Tracking with Gaussians 2 – Making observations
- We have p(X_i)
- The detector observes O_i = o_i
- We want to compute p(X_i | O_i = o_i)
- Use Bayes rule
- Requires a CLG observation model: p(O_i | X_i) ~ N(W X_i + v; Σ_{O_i|X_i})

Slide 10: Operations in Kalman filter
- Compute the belief state p(X_t | o_1, …, o_t)
- Start with the prior p(X_0)
- At each time step t:
  - Condition on the observation
  - Prediction (multiply in the transition model)
  - Roll-up (marginalize out the previous time step)
- This is one implementation of the KF; there are others, e.g. the information filter
[Figure: chain X_1, …, X_5 with observations O_1, …, O_5]

Slide 11: Exponential family representation of Gaussian: Canonical form

Slide 12: Canonical form
- The standard form and the canonical form are related
- Conditioning is easy in canonical form
- Marginalization is easy in standard form

Slide 13: Conditioning in canonical form
- First multiply
- Then condition on the value B = y

Slide 14: Operations in Kalman filter
- Compute the belief state p(X_t | o_1, …, o_t)
- Start with the prior p(X_0)
- At each time step t:
  - Condition on the observation
  - Prediction (multiply in the transition model)
  - Roll-up (marginalize out the previous time step)
[Figure: chain X_1, …, X_5 with observations O_1, …, O_5]

Slide 15: Prediction & roll-up in canonical form
- First multiply
- Then marginalize X_t
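The canonical-form bookkeeping on slides 11–15 comes down to a few lines of linear algebra. The sketch below is not code from the lecture; it assumes the standard parameterization K = Σ⁻¹, h = Σ⁻¹μ, and the function names are mine. It shows why multiplication and conditioning are trivial in canonical form while marginalization is trivial in standard (moment) form.

```python
import numpy as np

def to_canonical(mu, Sigma):
    """Moment form N(mu, Sigma) -> canonical form (K, h), with K = Sigma^-1, h = K mu."""
    K = np.linalg.inv(Sigma)
    return K, K @ mu

def to_moment(K, h):
    """Canonical form (K, h) -> moment form (mu, Sigma)."""
    Sigma = np.linalg.inv(K)
    return Sigma @ h, Sigma

def multiply(K1, h1, K2, h2):
    """Multiply two canonical-form factors over the same variables: parameters just add."""
    return K1 + K2, h1 + h2

def condition(K, h, idx_a, idx_b, y):
    """Condition a canonical-form Gaussian over (A, B) on evidence B = y.
    Only slicing and one matrix-vector product are needed."""
    K_aa = K[np.ix_(idx_a, idx_a)]
    K_ab = K[np.ix_(idx_a, idx_b)]
    return K_aa, h[idx_a] - K_ab @ y

def marginalize(mu, Sigma, idx_a):
    """Marginalize a moment-form Gaussian onto the A block: just keep the A entries."""
    return mu[idx_a], Sigma[np.ix_(idx_a, idx_a)]
```

Marginalizing directly in canonical form would instead require a Schur complement of K rather than a simple slice, which is the trade-off slide 12 points to.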
Slide 16: What if observations are not CLG?
- Often observations are not CLG
- CLG if O_i = β X_i + β₀ + ε
- Consider a motion detector: O_i = 1 if a person is likely to be in the region
- The posterior is not Gaussian

Slide 17: Linearization: incorporating non-linear evidence
- p(O_i | X_i) is not CLG, but…
- Find a Gaussian approximation of p(X_i, O_i) = p(X_i) p(O_i | X_i)
- Instantiate the evidence O_i = o_i and obtain a Gaussian for p(X_i | O_i = o_i)
- Why do we hope this would be any good? Locally, a Gaussian may be OK

Slide 18: Linearization as integration
- Gaussian approximation of p(X_i, O_i) = p(X_i) p(O_i | X_i)
- Need to compute the moments E[O_i], E[O_i²], E[O_i X_i]
- Note: each integral is the product of a Gaussian with an arbitrary function

Slide 19: Linearization as numerical integration
- Product of a Gaussian with an arbitrary function
- Effective numerical integration with Gaussian quadrature methods
- Approximate the integral as a weighted sum over integration points
- Gaussian quadrature defines the locations of the points and the weights
- Exact if the arbitrary function is a polynomial of bounded degree
- The number of integration points is exponential in the number of dimensions d
- Requiring exactness only on monomials needs exponentially fewer points
- With 2d+1 points, this method is equivalent to the unscented Kalman filter (see the sigma-point sketch below)
- Generalizes to many more points

Slide 20: Operations in non-linear Kalman filter
- Compute the belief state p(X_t | o_1, …, o_t)
- Start with the prior p(X_0)
- At each time step t:
  - Condition on the observation (use numerical integration)
  - Prediction (multiply in the transition model, use numerical integration)
  - Roll-up (marginalize out the previous time step)
[Figure: chain X_1, …, X_5 with observations O_1, …, O_5]

Slide 21: Canonical form & Markov nets

Slide 22: What you need to know about Gaussians, Kalman filters, Gaussian MNs
- Kalman filter:
  - Probably the most widely used BN
  - Assumes Gaussian distributions
  - Equivalent to a linear system
  - Simple matrix operations for the computations
- Non-linear Kalman filter:
  - Usually the observation or motion model is not CLG
  - Use numerical integration to find a Gaussian approximation
- Gaussian Markov nets:
  - Sparsity in the precision matrix is equivalent to the graph structure (see the precision-matrix sketch below)
- Continuous and discrete (hybrid) models:
  - Much harder, but doable and interesting (see …)
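The moment computations on slides 18–20 can be carried out with the 2d+1-point rule mentioned on slide 19. The sketch below is not lecture code: the function names, the κ = 1 scaling, and the example observation function are my assumptions, written in the style of the unscented transform.

```python
import numpy as np

def sigma_points(mu, Sigma, kappa=1.0):
    """2d+1 sigma points and weights for X ~ N(mu, Sigma), unscented-transform style."""
    d = mu.shape[0]
    S = np.linalg.cholesky((d + kappa) * Sigma)   # matrix square root via Cholesky
    pts = [mu] + [mu + S[:, i] for i in range(d)] + [mu - S[:, i] for i in range(d)]
    wts = np.full(2 * d + 1, 1.0 / (2 * (d + kappa)))
    wts[0] = kappa / (d + kappa)                  # weights sum to 1
    return np.array(pts), wts

def gaussian_expectation(g, mu, Sigma, kappa=1.0):
    """Approximate E[g(X)] for X ~ N(mu, Sigma) as a weighted sum over sigma points."""
    pts, wts = sigma_points(mu, Sigma, kappa)
    return sum(w * np.asarray(g(x)) for w, x in zip(wts, pts))

# Hypothetical non-linear observation model O = g(X) = ||X||,
# used to approximate the three moments needed on slide 18.
mu, Sigma = np.array([1.0, 2.0]), 0.5 * np.eye(2)
g = lambda x: np.linalg.norm(x)
E_O  = gaussian_expectation(g, mu, Sigma)                       # E[O_i]
E_O2 = gaussian_expectation(lambda x: g(x) ** 2, mu, Sigma)     # E[O_i^2]
E_OX = gaussian_expectation(lambda x: g(x) * x, mu, Sigma)      # E[O_i X_i]
```

These weighted sums are the moments E[O_i], E[O_i²], E[O_i X_i] that define the joint Gaussian approximation of p(X_i, O_i), after which conditioning on O_i = o_i proceeds as in the linear case.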
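Slides 21–22 relate Gaussian Markov nets to the precision matrix: K[i, j] = 0 exactly when there is no edge between X_i and X_j, i.e. when the two are conditionally independent given all remaining variables. Here is a minimal sketch of reading the graph off K; the 3-node chain example is mine, not from the slides.

```python
import numpy as np

def markov_net_edges(K, tol=1e-10):
    """Edges of the Gaussian Markov net: {i, j} is an edge iff K[i, j] != 0.
    A zero entry K[i, j] means X_i is independent of X_j given all remaining variables."""
    d = K.shape[0]
    return [(i, j) for i in range(d) for j in range(i + 1, d) if abs(K[i, j]) > tol]

# Hypothetical 3-node chain X0 - X1 - X2: X0 and X2 are non-adjacent, so K[0, 2] = 0.
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
print(markov_net_edges(K))             # [(0, 1), (1, 2)]
print(np.round(np.linalg.inv(K), 2))   # the covariance Sigma = K^-1 is dense
```

Note that Σ = K⁻¹ is typically dense even when K is sparse: non-adjacent variables are still marginally dependent, but the zero in K records their conditional independence.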

