Berkeley COMPSCI 260A - Sequential Importance Sampling Resampling


Sequential Importance Sampling Resampling
Arnaud Doucet
Departments of Statistics & Computer Science
University of British Columbia

Sequential Importance Sampling

We use a structured IS distribution

    q_n(x_{1:n}) = q_{n-1}(x_{1:n-1}) \, q_n(x_n \mid x_{1:n-1})
                 = q_1(x_1) \, q_2(x_2 \mid x_1) \cdots q_n(x_n \mid x_{1:n-1}),

so if X^{(i)}_{1:n-1} \sim q_{n-1}(x_{1:n-1}), then we only need to sample

    X^{(i)}_n \mid X^{(i)}_{1:n-1} \sim q_n(x_n \mid X^{(i)}_{1:n-1})

to obtain X^{(i)}_{1:n} \sim q_n(x_{1:n}). The importance weights are updated according to

    w_n(x_{1:n}) = \frac{\gamma_n(x_{1:n})}{q_n(x_{1:n})}
                 = w_{n-1}(x_{1:n-1}) \underbrace{\frac{\gamma_n(x_{1:n})}{\gamma_{n-1}(x_{1:n-1}) \, q_n(x_n \mid x_{1:n-1})}}_{\alpha_n(x_{1:n})}.

The SIS algorithm:

At time n = 1, sample X^{(i)}_1 \sim q_1(\cdot) and set w_1(X^{(i)}_1) = \gamma_1(X^{(i)}_1) / q_1(X^{(i)}_1).

At time n \geq 2, sample X^{(i)}_n \sim q_n(\cdot \mid X^{(i)}_{1:n-1}) and compute w_n(X^{(i)}_{1:n}) = w_{n-1}(X^{(i)}_{1:n-1}) \, \alpha_n(X^{(i)}_{1:n}).

It follows that

    \widehat{\pi}_n(dx_{1:n}) = \sum_{i=1}^{N} W^{(i)}_n \, \delta_{X^{(i)}_{1:n}}(dx_{1:n}),
    \qquad
    \widehat{Z}_n = \frac{1}{N} \sum_{i=1}^{N} w_n(X^{(i)}_{1:n}).
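The SIS recursion above can be sketched in code. This is a minimal illustration on a hypothetical toy target of my choosing (not from the slides): the target gamma_n(x_{1:n}) is a product of standard normal densities, already normalized so that Z_n = 1, and the proposal q_n(x_n | x_{1:n-1}) is an independent N(0, 2^2). Weights are carried in log space to avoid underflow.

```python
import numpy as np

# SIS sketch on a hypothetical toy target (illustrative, not from the slides):
#   gamma_n(x_{1:n}) = prod_k N(x_k; 0, 1)  (normalized, so Z_n = 1),
#   proposal q_n(x_n | x_{1:n-1}) = N(0, sigma_q^2), sampled independently.

rng = np.random.default_rng(0)
N, n_steps = 50_000, 5
sigma_q = 2.0

def log_norm_pdf(x, sigma):
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * x**2 / sigma**2

log_w = np.zeros(N)                      # log w_0 = 0 for every particle
for n in range(n_steps):
    x = sigma_q * rng.standard_normal(N)             # X_n^(i) ~ q_n
    # incremental weight alpha_n = target increment / proposal density
    log_w += log_norm_pdf(x, 1.0) - log_norm_pdf(x, sigma_q)

Z_hat = np.exp(log_w).mean()             # hat{Z}_n = (1/N) sum_i w_n(X_{1:n}^(i))
W = np.exp(log_w - log_w.max())
W /= W.sum()                             # normalized weights W_n^(i)
print(Z_hat)                             # should be close to Z_n = 1
```

Because the target here is normalized, hat{Z}_n should concentrate around 1 as N grows, which makes the mechanics easy to sanity-check.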
() 3 / 30Sequential Importance SamplingAt time n = 1, sample X(i)1 q1()and set w1X(i)1=γ1X(i)1q1X(i)1.At time n  2sample X(i)n qnjX(i)1:n1compute wnX(i)1:n= wn1X(i)1:n1αnX(i)1:n.It follows thatbπn(dx1:n)=N∑i =1W(i)nδX(i)1:n(dx1:n),bZn=1NN∑i =1wnX(i)1:n.A .D. () 3 / 30Sequential Importance SamplingAt time n = 1, sample X(i)1 q1()and set w1X(i)1=γ1X(i)1q1X(i)1.At time n  2sample X(i)n qnjX(i)1:n1compute wnX(i)1:n= wn1X(i)1:n1αnX(i)1:n.It follows thatbπn(dx1:n)=N∑i =1W(i)nδX(i)1:n(dx1:n),bZn=1NN∑i =1wnX(i)1:n.A .D. () 3 / 30Sequential Importance Sampling for State-Space ModelsState-space modelsHidden Markov process: X1 µ, Xkj(Xk 1= xk 1) f(jxk 1)Observation process: Ykj(Xk= xk) g(jxk)Assume we have received y1:n, we are interested in sampling fromπn(x1:n)= p(x1:njy1:n)=p(x1:n, y1:n)p(y1:n)and estimating p(y1:n)whereγn(x1:n)= p(x1:n, y1:n)= µ(x1)n∏k =2f(xkjxk 1)n∏k =1g(ykjxk),Zn= p(y1:n)=ZZµ(x1)n∏k =2f(xkjxk 1)n∏k =1g(ykjxk)dx1:n.A .D. () 4 / 30Sequential Importance Sampling for State-Space ModelsState-space modelsHidden Markov process: X1 µ, Xkj(Xk 1= xk 1) f(jxk 1)Observation process: Ykj(Xk= xk) g(jxk)Assume we have received y1:n, we are interested in sampling fromπn(x1:n)= p(x1:njy1:n)=p(x1:n, y1:n)p(y1:n)and estimating p(y1:n)whereγn(x1:n)= p(x1:n, y1:n)= µ(x1)n∏k =2f(xkjxk 1)n∏k =1g(ykjxk),Zn= p(y1:n)=ZZµ(x1)n∏k =2f(xkjxk 1)n∏k =1g(ykjxk)dx1:n.A .D. () 4 / 30Locally Optimal Importance DistributionThe optimal IS distribution qn(xnjx1:n1)at time n minimizing thevariance of wn(x1:n)is given byqoptn(xnjx1:n1)= πn(xnjx1:n1)and yields an incremental importance weight of the formαn(x1:n)=γn(x1:n1)γn1(x1:n1)For state-space models, we haveqoptn(xnjx1:n1)= p(xnjyn, xn1)=g(ynjxn)f(xnjxn1)p(ynjxn1),αn(x1:n)= p(ynjxn1).A .D. 
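A common choice for state-space models is to use the prior f as the proposal, in which case the incremental weight reduces to alpha_n = g(y_n | x_n). The sketch below runs SIS on a hypothetical scalar linear-Gaussian model (my choice of parameters, not from the slides), and checks the estimate of log p(y_{1:T}) against the exact value from a Kalman filter, which is available in closed form for this model.

```python
import numpy as np

# SIS for a hypothetical scalar linear-Gaussian state-space model:
#   X_1 ~ N(0, 1),  X_k = a X_{k-1} + N(0, sv^2),  Y_k = X_k + N(0, sw^2).
# Proposal q_n = f (the prior), so alpha_n = g(y_n | x_n).

rng = np.random.default_rng(1)
a, sv, sw, T, N = 0.9, 1.0, 1.0, 10, 100_000

# simulate one observation sequence y_{1:T}
x_true = np.zeros(T); y = np.zeros(T)
x_true[0] = rng.standard_normal()
y[0] = x_true[0] + sw * rng.standard_normal()
for k in range(1, T):
    x_true[k] = a * x_true[k-1] + sv * rng.standard_normal()
    y[k] = x_true[k] + sw * rng.standard_normal()

def log_g(yk, xk):                       # log g(y_k | x_k)
    return -0.5 * np.log(2 * np.pi * sw**2) - 0.5 * (yk - xk)**2 / sw**2

# SIS with the prior as proposal
x = rng.standard_normal(N)               # X_1^(i) ~ mu
log_w = log_g(y[0], x)
for k in range(1, T):
    x = a * x + sv * rng.standard_normal(N)   # X_k^(i) ~ f(. | X_{k-1}^(i))
    log_w += log_g(y[k], x)                   # w_n = w_{n-1} * g(y_n | x_n)

m = log_w.max()
log_Z_sis = m + np.log(np.mean(np.exp(log_w - m)))   # log hat{p}(y_{1:T})

# exact log p(y_{1:T}) from the scalar Kalman filter, for comparison
mean, var, log_Z_kf = 0.0, 1.0, 0.0
for k in range(T):
    if k > 0:
        mean, var = a * mean, a**2 * var + sv**2     # predict
    S = var + sw**2                                  # innovation variance
    log_Z_kf += -0.5 * np.log(2 * np.pi * S) - 0.5 * (y[k] - mean)**2 / S
    K = var / S                                      # Kalman gain, update
    mean, var = mean + K * (y[k] - mean), (1 - K) * var

print(log_Z_sis, log_Z_kf)               # the two values should be close
```

The agreement with the Kalman evidence is a useful correctness check, though the weight variance grows with T when no resampling is performed.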
Locally Optimal Importance Distribution

The optimal IS distribution q_n(x_n \mid x_{1:n-1}) at time n minimizing the variance of w_n(x_{1:n}) is given by

    q^{\mathrm{opt}}_n(x_n \mid x_{1:n-1}) = \pi_n(x_n \mid x_{1:n-1})

and yields an incremental importance weight of the form

    \alpha_n(x_{1:n}) = \frac{\gamma_n(x_{1:n-1})}{\gamma_{n-1}(x_{1:n-1})}.

For state-space models, we have

    q^{\mathrm{opt}}_n(x_n \mid x_{1:n-1}) = p(x_n \mid y_n, x_{n-1}) = \frac{g(y_n \mid x_n) \, f(x_n \mid x_{n-1})}{p(y_n \mid x_{n-1})},
    \qquad
    \alpha_n(x_{1:n}) = p(y_n \mid x_{n-1}).

Summary

Sequential Importance Sampling is a special case of Importance Sampling. Importance Sampling only works decently for moderate-size problems. Today, we discuss how to partially fix this problem.

Resampling

Intuitive KEY idea: as the time index n increases, the variance of the unnormalized weights {w_n(X^{(i)}_{1:n})} tends to increase and all the mass becomes concentrated on a few random samples/particles.
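The optimal-proposal identity above can be verified numerically. For a hypothetical scalar linear-Gaussian model (my illustrative choice, not from the slides), p(x_n | y_n, x_{n-1}) is Gaussian in closed form, and the product of the optimal proposal and the incremental weight alpha_n = p(y_n | x_{n-1}) must exactly reproduce g(y_n | x_n) f(x_n | x_{n-1}):

```python
import numpy as np

# Check the identity g(y|x) f(x|x') = q_opt(x | y, x') * p(y | x') for a
# hypothetical scalar linear-Gaussian model:
#   X_k = a X_{k-1} + N(0, sv^2),  Y_k = X_k + N(0, sw^2).

a, sv, sw = 0.9, 1.0, 0.5

def norm_pdf(x, m, s2):
    return np.exp(-0.5 * (x - m)**2 / s2) / np.sqrt(2 * np.pi * s2)

def f(xn, xp):  return norm_pdf(xn, a * xp, sv**2)       # transition density
def g(yn, xn):  return norm_pdf(yn, xn, sw**2)           # observation density

def q_opt(xn, yn, xp):                                   # p(x_n | y_n, x_{n-1})
    s2 = 1.0 / (1.0 / sv**2 + 1.0 / sw**2)               # posterior variance
    m = s2 * (a * xp / sv**2 + yn / sw**2)               # posterior mean
    return norm_pdf(xn, m, s2)

def alpha(yn, xp):                                       # p(y_n | x_{n-1})
    return norm_pdf(yn, a * xp, sv**2 + sw**2)

rng = np.random.default_rng(2)
xn, yn, xp = rng.standard_normal(3)                      # arbitrary test point
lhs = g(yn, xn) * f(xn, xp)
rhs = q_opt(xn, yn, xp) * alpha(yn, xp)
print(abs(lhs - rhs))                    # ~0: the factorization is exact
```

Note that alpha depends on x_{n-1} and y_n but not on the new sample x_n, which is exactly why this proposal minimizes the variance of the incremental weight.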
We propose to reset the approximation by discarding, in a principled way, the particles with low weights W^{(i)}_n (relative to 1/N) and multiplying the particles with high weights W^{(i)}_n (relative to 1/N).

The main reason is that if a particle has a low weight at time n, it will typically still have a low weight at time n + 1 (though counterexamples are easy to construct).

You want to focus your computational efforts on the "promising" parts of the space.
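A standard way to decide when to reset is the effective sample size (ESS) of the normalized weights; when it drops below a threshold (N/2 is a common choice), one resamples N particles from the weighted empirical distribution and resets the weights to 1/N. A minimal sketch using multinomial resampling, with illustrative weights of my own construction:

```python
import numpy as np

# Resampling sketch: monitor ESS = 1 / sum_i (W_n^(i))^2 and, when it drops
# below a threshold, resample from the weighted particle cloud and reset the
# weights to 1/N. Multinomial resampling is shown; systematic or stratified
# schemes have lower variance.

def ess(W):
    return 1.0 / np.sum(W**2)

def multinomial_resample(particles, W, rng):
    idx = rng.choice(len(W), size=len(W), p=W)    # high-weight particles are
    return particles[idx], np.full(len(W), 1.0 / len(W))  # duplicated

rng = np.random.default_rng(3)
N = 1000
particles = rng.standard_normal(N)
log_w = -5.0 * particles**2              # sharply peaked weights -> degeneracy
W = np.exp(log_w - log_w.max())
W /= W.sum()

ess_before = ess(W)
print(ess_before)                        # well below N: mass on few particles
if ess_before < N / 2:                   # common resampling criterion
    particles, W = multinomial_resample(particles, W, rng)
ess_after = ess(W)
print(ess_after)                         # uniform weights: ESS = N
```

Resampling does not create new information: it trades weight degeneracy for duplicated particles, which is why it is applied only when the ESS criterion triggers.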

