
# UMB ENEE 620 - FINAL EXAM


ENEE620. Final examination, 5/20/2019. Instructor: A. Barg

READ THIS:
- The exam consists of five problems. Each problem is worth 10 points. Max score = 50 points.
- Your answers should be justified. Giving just the answer may result in no credit for the problem.
- You do not have to copy problem statements into your paper.
- If applicable, clearly identify the answer to the question.

**Problem 1.** We are given $n$ independent RVs $X_1,\dots,X_n$, each uniformly distributed on the unit interval $[0,1]$. Define $Y_n \triangleq \prod_{i=1}^n X_i$, $n = 1,2,\dots$.

(a) Find $EY_n$, $n = 1,2,\dots$.
(b) Find $E(Y_{n+1}\mid Y_n)$, $n = 1,2,\dots$.
(c) Find $E(Y_{n+1}\mid Y_1,\dots,Y_n)$, $n = 1,2,\dots$.
(d) Does the sequence $(Y_n,\ n=1,2,\dots)$ converge in probability; if yes, what is the limit? Does it also converge a.s.?

SOLUTION: (a) $EY_n = 2^{-n}$, since the $X_i$ are independent with $EX_i = 1/2$.

(b) Since $Y_{n+1} = Y_n X_{n+1}$, and $X_{n+1}$ is independent of $Y_n$, we obtain
$$E(Y_{n+1}\mid Y_n) = E(Y_n X_{n+1}\mid Y_n) = Y_n\, E(X_{n+1}\mid Y_n) = Y_n\, EX_{n+1} = \tfrac12 Y_n.$$

(c) In the same way,
$$E(Y_{n+1}\mid Y_1,\dots,Y_n) = E(Y_n X_{n+1}\mid Y_1,\dots,Y_n) = Y_n\, EX_{n+1} = \tfrac12 Y_n.$$

(d) Let $\epsilon > 0$; then by Markov's inequality
$$P(Y_n > \epsilon) \le EY_n/\epsilon = \epsilon^{-1} 2^{-n} \to 0.$$
Thus $Y_n \xrightarrow{p} 0$. Moreover, $\sum_{n=1}^{\infty} P(Y_n > \epsilon) < \infty$, so by the Borel–Cantelli lemma $P(Y_n > \epsilon \text{ i.o.}) = 0$, i.e., $Y_n \xrightarrow{a.s.} 0$.

**Problem 2.** (a) We consider a martingale $Z_n$, $n = 1,2,\dots$, with respect to the natural filtration $\mathcal F_n = \sigma(Z_1,\dots,Z_n)$. Show that for $1 \le k < n$
$$E(Z_n\mid Z_1,\dots,Z_k) = Z_k.$$
(b) Let $X_n$, $n = 1,2,\dots$, be a martingale with respect to the natural filtration $\sigma(X_1,\dots,X_n)$, and let $Y_n$, $n = 1,2,\dots$, be another, independent martingale with respect to the natural filtration $\sigma(Y_1,\dots,Y_n)$. Show that the sequence $Z_n = X_n + Y_n$ forms a martingale with respect to $\sigma(Z_1,\dots,Z_n)$, $n \ge 1$ (use the tower property of conditional expectations). If the independence assumption is removed, is this statement still true? Prove or give a counterexample.

SOLUTION: (a) By the martingale property and the tower property,
$$Z_k = E(Z_{k+1}\mid Z_k,\dots,Z_1) = E\big(E(Z_{k+2}\mid Z_{k+1},\dots,Z_1)\mid Z_k,\dots,Z_1\big) = E(Z_{k+2}\mid Z_k,\dots,Z_1),$$
and generally,
$$E(Z_n\mid Z_k,\dots,Z_1) = E\big(E(Z_n\mid Z_{n-1},\dots,Z_1)\mid Z_k,\dots,Z_1\big) = E(Z_{n-1}\mid Z_k,\dots,Z_1) = \cdots = E(Z_{k+1}\mid Z_k,\dots,Z_1) = Z_k.$$

(b) Since $X_n, Y_n$ are integrable by assumption, the RV $Z_n$ is also integrable. Further,
$$E(X_n\mid Z_1,\dots,Z_{n-1}) = E\big(E(X_n\mid Z_1,\dots,Z_{n-1},X_1,\dots,X_{n-1})\mid Z_1,\dots,Z_{n-1}\big) = E\big(E(X_n\mid X_1,\dots,X_{n-1})\mid Z_1,\dots,Z_{n-1}\big) = E(X_{n-1}\mid Z_1,\dots,Z_{n-1}),$$
where in the middle step we used independence (given $X_1^{n-1}$, $X_n$ is independent of $Z_1^{n-1}$). In the same way,
$$E(Y_n\mid Z_1,\dots,Z_{n-1}) = E(Y_{n-1}\mid Z_1,\dots,Z_{n-1}),$$
and adding, we obtain
$$E(X_n + Y_n\mid Z_1,\dots,Z_{n-1}) = E(X_{n-1} + Y_{n-1}\mid Z_1,\dots,Z_{n-1}) = X_{n-1} + Y_{n-1} = Z_{n-1},$$
as required.

If the independence assumption is removed, the claim is false. (Note that an affine dependence $X_n = aY_n + b$ does not give a counterexample: then $Z_n = (1+a)Y_n + b$ is still a martingale.) For a genuine counterexample, let $U, V$ be independent with $P(U = \pm 1) = P(V = \pm 1) = 1/2$, and set $X_1 = U$, $X_2 = U(1+V)$ and $Y_1 = V$, $Y_2 = V + U$. Each sequence is a martingale with respect to its natural filtration, but $Z_1 = U + V$, $Z_2 = 2U + V + UV$ satisfy $E(Z_2\mid Z_1 = 0) = -1 \ne 0$, so $Z$ is not a martingale.

**Problem 3.** Let $X_1, X_2, \dots$ be a sequence of i.i.d. random variables with common mean 1, and let $N(t)$, $t \ge 0$, be a Poisson process with rate 2 that is independent of the RVs $X_i$, $i \ge 1$. Define a random process $Y(t)$, $t \ge 0$, by setting $Y(0) = 0$ and for $t > 0$
$$Y(t) = \sum_{i=1}^{N(t)} X_i.$$
(a) Find the mean function $EY(t)$ of the process $Y$.
(b) Show that the process $Y$ has independent increments.

SOLUTION: (a)
$$EY(t) = E\Big(E\Big(\sum_{i=1}^{N(t)} X_i \,\Big|\, N(t)\Big)\Big) = E\big(N(t)\, EX_1\big) = 2t \cdot 1 = 2t.$$

(b) Independence of increments follows from the independence of increments of the Poisson process. Indeed, let us fix the values $0 = t_0 < t_1 < t_2 < \cdots < t_n$ and denote $\Delta_k \triangleq Y(t_k) - Y(t_{k-1})$, $k = 1,\dots,n$. Clearly $\Delta_k = \sum_{i=N(t_{k-1})+1}^{N(t_k)} X_i$. Given the increments $N(t_k) - N(t_{k-1})$, the RVs $\Delta_k$ are mutually conditionally independent, since each is a sum over a disjoint block of the i.i.d. sequence $X_i$, and the blocks are determined by the counts $N(t_1) - N(t_0), \dots, N(t_n) - N(t_{n-1})$. Since, by the properties of the Poisson process, these increments are independent, we conclude that $\Delta_1,\dots,\Delta_n$ are independent as well.

**Problem 4.**
Let $X$ be a Bernoulli RV, $P(X = 1) = P(X = -1) = 1/2$, and let
$$X_n \triangleq \begin{cases} X & \text{with probability } 1 - \frac1n,\\[2pt] e^n & \text{with probability } \frac1n.\end{cases}$$
Is it true that
(a) $X_n \xrightarrow{p} X$;
(b) $X_n \xrightarrow{d} X$ (to receive credit, justify your answer directly, without appealing to other modes of convergence);
(c) $\lim_{n\to\infty} E[(X_n - X)^2] = 0$?
In each case justification is required.

SOLUTION: (a) We write
$$P(|X_n - X| > \epsilon) = \Big(1 - \frac1n\Big) P(|X_n - X| > \epsilon \mid X_n = X) + \frac1n P(|X_n - X| > \epsilon \mid X_n = e^n) = \frac1n P(|e^n - X| > \epsilon) \le \frac1n,$$
where the first conditional probability is zero. This shows that $X_n \xrightarrow{p} X$.

(b) Part (a) already implies this; still, we set out to prove it independently. We have
$$F_X(x) = \begin{cases} 0 & x < -1\\ 1/2 & -1 \le x < 1\\ 1 & x \ge 1 \end{cases}; \qquad F_{X_n}(x) = \begin{cases} 0 & x < -1\\ \frac12\big(1 - \frac1n\big) & -1 \le x < 1\\ 1 - \frac1n & 1 \le x < e^n\\ 1 & x \ge e^n. \end{cases}$$
Thus $F_{X_n}(x) \to F_X(x)$ for all real $x$ at which $F_X$ is continuous, so $X_n \xrightarrow{d} X$.

(c)
$$E[(X_n - X)^2] = \Big(1 - \frac1n\Big) E[(X_n - X)^2 \mid X_n = X] + \frac1n E[(X_n - X)^2 \mid X_n = e^n] = \frac1n \Big(\frac{(e^n + 1)^2}{2} + \frac{(e^n - 1)^2}{2}\Big) \to \infty,$$
so this is false.

**Problem 5.** Assume that $X$ and $Y$ are jointly Gaussian processes with zero mean, autocorrelation functions $R_X(t) = R_Y(t) = e^{-|t|}$, and cross-correlation function $R_{XY}(t) = \frac12 e^{-|t-3|}$. Assume further that for any $n \ge 1$, any real $t_1,\dots,t_n$ and any $s > 0$, the distribution of the vector $(X_{t_1+s},\dots,X_{t_n+s},Y_{t_1+s},\dots,Y_{t_n+s})$ does not depend on $s$.
(a) Find the autocorrelation function of the random process $Z(t) = \frac12(X(t) + Y(t))$, $t \in \mathbb R$.
(b) Is $Z(t)$ a (strict-sense) stationary process?
(c) Find the variance of the RV $X(1) - 3Y(2)$. Then find $P(X(1) < 3Y(2) - 1)$ (express your answer in terms of the standard normal CDF $\Phi$).

SOLUTION: (a)
$$R_Z(s,t) = E\big[\tfrac14 (X(s) + Y(s))(X(t) + Y(t))\big] = \tfrac14\big(R_X(s-t) + R_Y(s-t) + R_{XY}(s-t) + R_{YX}(s-t)\big).$$
Thus $R_Z(s,t)$ depends only on $s - t$. Using $R_{YX}(t) = R_{XY}(-t)$, we obtain
$$R_Z(t) = \tfrac14\big(2e^{-|t|} + \tfrac12 e^{-|t-3|} + \tfrac12 e^{-|t+3|}\big).$$

(b) Since $EZ(t) = 0$ and $R_Z(s,t)$ depends only on $s - t$, we conclude that $Z$ is WSS.
Further, $Z$ by definition is a Gaussian process, and for such processes WSS implies strict-sense stationarity.

(c) We have $P(X(1) < 3Y(2) - 1) = P(X(1) - 3Y(2) < -1)$, and $X(1) - 3Y(2)$ is a zero-mean Gaussian RV with variance
$$\sigma^2 = \mathrm{Var}(X(1) - 3Y(2)) = R_X(0) - 6R_{XY}(-1) + 9R_Y(0) = 1 + 9 - 3e^{-4} = 10 - 3e^{-4}.$$
Then $\frac1\sigma(X(1) - 3Y(2)) \sim N(0,1)$, and
$$P\Big(\frac{X(1) - 3Y(2)}{\sigma} < -\frac1\sigma\Big) = \Phi\Big(-\frac1\sigma\Big), \qquad \sigma = \sqrt{10 - 3e^{-4}}.$$
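The answers in Problem 1 (a) and (d), $EY_n = 2^{-n}$ and $Y_n \to 0$, can be checked by a short Monte Carlo sketch; the function name, seed, and trial count below are illustrative choices, not part of the exam.

```python
import random

def simulate_Y(n, trials=200_000, seed=0):
    """Monte Carlo estimate of E[Y_n], where Y_n is a product of n
    independent Uniform(0,1) variables; the exact value is 2**-n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        y = 1.0
        for _ in range(n):
            y *= rng.random()  # one Uniform(0,1) factor X_i
        total += y
    return total / trials

for n in (1, 2, 5):
    print(n, simulate_Y(n), 2.0 ** -n)
```

The rapid decay of the estimates with $n$ also illustrates the convergence to 0 proved in part (d).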
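For Problem 2(b), one explicit two-step counterexample (a construction chosen here for illustration: $U, V$ independent fair signs, $X = (U,\, U(1+V))$, $Y = (V,\, V+U)$) can be verified by exact enumeration of the four equally likely outcomes.

```python
from itertools import product

# U, V: independent fair random signs; enumerate all four outcomes exactly.
# X1 = U,  X2 = U*(1+V)  -- a martingale in its own filtration
# Y1 = V,  Y2 = V + U    -- a martingale in its own filtration
# Z = X + Y: Z1 = U+V,  Z2 = 2U + V + U*V
outcomes = list(product((-1, 1), repeat=2))  # pairs (u, v), each with prob 1/4

def cond_mean(pairs):
    """E[second | first] under the uniform measure on the listed outcomes."""
    sums, counts = {}, {}
    for a, b in pairs:
        sums[a] = sums.get(a, 0) + b
        counts[a] = counts.get(a, 0) + 1
    return {a: sums[a] / counts[a] for a in sums}

X = [(u, u * (1 + v)) for u, v in outcomes]
Y = [(v, v + u) for u, v in outcomes]
Z = [(u + v, u * (1 + v) + v + u) for u, v in outcomes]

print(cond_mean(X))  # maps u -> u: martingale property holds
print(cond_mean(Y))  # maps v -> v: martingale property holds
print(cond_mean(Z))  # E[Z2 | Z1 = 0] = -1 != 0: Z is NOT a martingale
```

Exhaustive enumeration is preferable to simulation here because the martingale property is an exact identity on a four-point sample space.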
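The mean function $EY(t) = 2t$ from Problem 3(a) can also be checked numerically. The exam leaves the distribution of the $X_i$ general; the sketch below assumes $X_i \sim \mathrm{Exp}(1)$ as one concrete mean-1 choice, and simulates the rate-2 Poisson process via its exponential inter-arrival gaps.

```python
import random

def compound_poisson_mean(t, lam=2.0, trials=100_000, seed=1):
    """Estimate E[Y(t)] for Y(t) = sum_{i<=N(t)} X_i, where N is a Poisson
    process of rate lam and the X_i are i.i.d. with mean 1 (here Exp(1)).
    The tower rule gives E[Y(t)] = lam * t * E[X_1] = 2t."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Arrival times accumulate Exp(lam) gaps; count jumps up to time t.
        s, y = rng.expovariate(lam), 0.0
        while s <= t:
            y += rng.expovariate(1.0)  # one jump X_i ~ Exp(1), mean 1
            s += rng.expovariate(lam)
        total += y
    return total / trials

print(compound_poisson_mean(1.0))  # close to 2.0
print(compound_poisson_mean(3.0))  # close to 6.0
```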
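The divergence in Problem 4(c) can likewise be checked against the closed form $E[(X_n - X)^2] = (e^{2n} + 1)/n$. The sketch assumes, as the solution implicitly does, that the perturbation event $\{X_n = e^n\}$ is drawn independently of $X$.

```python
import math
import random

def mse_mc(n, trials=200_000, seed=3):
    """Monte Carlo estimate of E[(X_n - X)^2] for Problem 4, where
    X_n = X with prob 1 - 1/n and X_n = e^n with prob 1/n,
    with X = +/-1 equally likely, independent of the perturbation."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = 1 if rng.random() < 0.5 else -1
        xn = math.exp(n) if rng.random() < 1.0 / n else x
        total += (xn - x) ** 2
    return total / trials

for n in (1, 2, 4):
    # closed form (e^{2n} + 1)/n grows without bound, matching the solution
    print(n, mse_mc(n), (math.exp(2 * n) + 1) / n)
```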
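Finally, the numbers in Problem 5(c) can be cross-checked both in closed form and by simulation. The sketch assumes the convention $R_{XY}(s-t) = E[X(s)Y(t)]$ used in the solution, so that $\mathrm{Cov}(X(1), Y(2)) = R_{XY}(-1) = \frac12 e^{-4}$, and draws the correlated Gaussian pair directly.

```python
import math
import random

# Cov(X(1), Y(2)) = R_XY(1 - 2) = 0.5 * exp(-|-1 - 3|) = 0.5 * e^{-4}
rho = 0.5 * math.exp(-4.0)
sigma2 = 1.0 - 6.0 * rho + 9.0   # Var(X(1) - 3Y(2)) = 10 - 3e^{-4}
sigma = math.sqrt(sigma2)

def Phi(x):
    """Standard normal CDF expressed through the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

exact = Phi(-1.0 / sigma)        # P(X(1) < 3Y(2) - 1)

# Monte Carlo cross-check: generate the pair with the right covariance.
rng = random.Random(2)
trials, hits = 200_000, 0
for _ in range(trials):
    g1, g2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    x = g1                                          # X(1), unit variance
    y = rho * g1 + math.sqrt(1.0 - rho * rho) * g2  # Y(2), Corr = rho
    hits += x < 3.0 * y - 1.0

print(sigma2, exact, hits / trials)
```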
