Contents
1. Differential entropy
2. Jointly Gaussian distribution
3. Upper bound on error exponents
4. Your question

MASSACHUSETTS INSTITUTE OF TECHNOLOGY
Department of Electrical Engineering and Computer Science
6.441 Transmission of Information, Spring 2006
Homework 4 Solution
April 28, 2006

1. Differential entropy. For this question, the units of the information measures are nats unless otherwise specified (1 bit = ln 2 nats).

(a) Give an example where 0 < h(X, Y) < h(X + Y) < h(X) < I(X; Y).

    Solution: Your Solution Here.

(b) Compute the differential entropy of the following pdfs in nats.

    i. Exponential. p(x) = λ e^{-λx}, x ≥ 0.

       Solution: Your Solution Here.

    ii. Laplace. p(x) = (λ/2) e^{-λ|x|}.

       Solution: Your Solution Here.

(c) Scalar entropy power inequality. If X and Y are two real scalar independent random variables, we have the following entropy power inequality:

        (1/(2πe)) e^{2h(X)} + (1/(2πe)) e^{2h(Y)} ≤ (1/(2πe)) e^{2h(X+Y)} ≤ Var[X] + Var[Y],

    or equivalently

        e^{h(X)} + e^{h(Y)} ≤ e^{h(X+Y)} ≤ sqrt(2πe (Var[X] + Var[Y]))          (1.1)

    [See p. 496 of Cover for a stronger result on random vectors.]

    i. Prove the second part of the inequality in (1.1), i.e. (1/(2πe)) e^{2h(X+Y)} ≤ Var[X] + Var[Y].

       Solution: Your Solution Here.

    ii. Show that if we use entropy instead of differential entropy, the above inequality does not hold. Instead, prove the following inequalities [caveat: entropy is in nats]:

           e^{(1/2)(H(X+Y) + ln 2)} ≤ e^{H(X)} + e^{H(Y)} ≤ |𝒳| + |𝒴|          (1.2)

           max{e^{H(X)}, e^{H(Y)}} ≤ e^{H(X+Y)} ≤ |𝒳| |𝒴|          (1.3)

       Solution: Your Solution Here.

(d) Uniformly random noise. Optional and open. Find the distribution p(x) of X taking values from the continuous interval [-1/2, 1/2] so that h(X + Z) is maximized, where Z ~ U[-a/2, a/2] (i.e. uniformly distributed over the continuous interval [-a/2, a/2]). Notice that 1/a may not be an integer.

    Solution: Your Solution Here.
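As a quick numerical sanity check for Part 1(b), not part of the required solution, the standard closed forms h = 1 - ln λ for the exponential density and h = 1 + ln(2/λ) for this Laplace parameterization can be verified by a crude midpoint-rule integration of -∫ p ln p. The rate λ = 2 and the integration limits below are arbitrary choices for illustration.

```python
import math

def diff_entropy(pdf, lo, hi, n=200000):
    """Approximate h(X) = -integral of p(x) ln p(x) dx via a midpoint rule."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p = pdf(x)
        if p > 0:
            total -= p * math.log(p) * dx
    return total

lam = 2.0  # arbitrary rate parameter for the check

exp_pdf = lambda x: lam * math.exp(-lam * x)             # support [0, inf)
lap_pdf = lambda x: 0.5 * lam * math.exp(-lam * abs(x))  # support (-inf, inf)

h_exp = diff_entropy(exp_pdf, 0.0, 30.0)    # tail beyond 30 is negligible
h_lap = diff_entropy(lap_pdf, -30.0, 30.0)

print(h_exp, 1 - math.log(lam))        # both close to 0.3069 nats
print(h_lap, 1 + math.log(2 / lam))    # both close to 1.0 nats
```

Note that both answers are in nats because the integrand uses the natural log, matching the unit convention stated at the top of Question 1.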
2. Jointly Gaussian distribution. Optional.

Let z := [z1 z2]′ be a zero-mean Gaussian random vector with the bivariate normal distribution

    p(z) = (1/(2π |Λz|^{1/2})) e^{-(1/2) z′ Λz^{-1} z},

where z′ denotes the conjugate transpose of the sample vector z, and

    Λz := [σ1², σ1σ2ρ; σ2σ1ρ, σ2²]

is the covariance matrix E[zz′].

(a) Innovation form. Give the deterministic vector g = [g1 g2]′ in terms of σ1, σ2 and ρ such that

        z2 = g′ [z1/σ1, v]′ = g1 z1/σ1 + g2 v          (2.1)

    where v ~ N(0, 1) is independent of z1. [Hint: substitute (2.1) into E[z2²] and E[z1 z2] before taking the expectation.]

    Hence, show that if z1 is uncorrelated with z2 (i.e. ρ = 0), then z1 is independent of z2. In general, uncorrelated jointly Gaussian random variables are independent, and vice versa.

    Solution: Your Solution Here.

(b) Whitening. Assuming Λz is invertible, give all possible matrices G such that (∃ v := [v1 v2]′ ~ N(0, I2)) (z = Gv), where I2 denotes the 2-by-2 identity matrix.

    [Hint: use Part (a) and the fact that Uv is a white Gaussian random vector for any unitary matrix U.]

    Solution: Your Solution Here.

(c) Differential entropy after linear scaling. Prove/disprove

        h(z) = h(v) + ln ‖G‖          (2.2)

    where ‖G‖ denotes the absolute value of the determinant of G. Comment on the case when G is singular. Optional. Generalize the result to arbitrary distributions.

    [Hint: use the results in Parts (a) and (b), and the fact that ‖AU‖ = ‖A‖ for any unitary matrix U.]

    Solution: Your Solution Here.

(d) Differential entropy after linear filtering. In this part of the problem, we would like to relate z and v through a linear time-invariant filter as follows:

        z[n] = Σ_{m=0}^{1} g[m] v[n - m]          (2.3)

    where z[n] = z_{1+(n mod 2)} and v[n] = v_{1+(n mod 2)}.

    i. Suppose σ1 = σ2 = 1; give the choice of g[m] so that (2.3) holds.

       [Hint: restrict G to be a circulant matrix (i.e. a matrix of the form [a b; b a]) and assign g[0] = a and g[1] = b.]

       Solution: Your Solution Here.
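A numerical sketch of the innovation form in Part 2(a). Carrying out the hinted substitution gives one consistent choice, g1 = ρσ2 and g2 = σ2 sqrt(1 - ρ²), which matches E[z2²] = σ2² and E[z1 z2] = ρσ1σ2; treat this as an assumed answer to be verified, not the official solution. The Monte Carlo check below confirms that samples built via (2.1) reproduce the covariance matrix Λz (the parameter values are arbitrary).

```python
import math
import random

random.seed(0)
sigma1, sigma2, rho = 1.5, 0.8, 0.6

# Assumed innovation coefficients, derived from the hinted substitution:
g1 = rho * sigma2
g2 = sigma2 * math.sqrt(1 - rho ** 2)

N = 200_000
c11 = c12 = c22 = 0.0
for _ in range(N):
    z1 = random.gauss(0.0, sigma1)        # z1 ~ N(0, sigma1^2)
    v = random.gauss(0.0, 1.0)            # v ~ N(0, 1), independent of z1
    z2 = g1 * (z1 / sigma1) + g2 * v      # innovation form (2.1)
    c11 += z1 * z1
    c12 += z1 * z2
    c22 += z2 * z2

print(c11 / N, sigma1 ** 2)               # approx 2.25
print(c12 / N, rho * sigma1 * sigma2)     # approx 0.72
print(c22 / N, sigma2 ** 2)               # approx 0.64
```

Setting ρ = 0 in this construction makes z2 = σ2 v a function of v alone, which is the independence claim the problem asks you to show.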
    ii. Explain why no choice of g[m] exists for (2.3) to hold iff σ1 ≠ σ2.

        [Hint: a linear time-invariant (LTI) system preserves stationarity.]

        Solution: Your Solution Here.

    iii. Prove/disprove

            h(z) = h(v) + ln |G[0] G[1]|          (2.4)

         where G[k] := Σ_{m=0}^{1} g[m] e^{-jπmk} is the 2-point discrete Fourier transform (DFT) of g.

         Solution: Your Solution Here.

3. Upper bound on error exponents. The aim of this question is to prove a decent upper bound on the error exponent of the binary symmetric channel.

Let p denote the crossover probability of the channel, consider any encoding scheme E : {0,1}^{Rn} → {0,1}^n of rate R, and let D be any decoding scheme. Denote by m the message being encoded, by X the encoding E(m), and by Y the sequence received by the receiver. Denote by P_err the error probability P_err := Pr_{m, Y, X=E(m)}[D(Y) ≠ m]. Our goal is to compute some function U(R) (which doesn't depend on E or D) such that lim sup_{n→∞} {-(1/n) log P_err} ≤ U(R).

(a) Fix τ ∈ [0, 1/2]. Prove that there exists an integer k with τn ≤ k ≤ n/2 such that for every x ∈ {0,1}^n,

        -(1/n) log Pr[Δ(X, Y) = k | X = x] → D[τ ‖ p].

    Solution: Your Solution Here.

(b) Prove that the probability of successful decoding subject to the occurrence of k errors (i.e., subject to the condition Δ(X, Y) = k) is at most

        2^n / (2^{H(τ′)n} · 2^{Rn}),

    where τ′ = k/n.

    [Hint: note that the probability of correct decoding is exactly equal to the probability that a message m is transmitted and an error η occurs, where η and m satisfy the condition D(E(m) + η) = m. In turn, this can be expressed as the condition that the received sequence Y and the error η satisfy η = Y - E(D(Y)). The above is an upper bound to this probability subject to the condition that η has exactly k non-zero coordinates.]

    Solution: Your Solution Here.

(c) Putting the above parts together, conclude that the probability of decoding error is at least 2^{-U(R)·n} for U(R) = D[H^{-1}(1 - R) ‖ p].
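The bound U(R) = D[H^{-1}(1 - R) ‖ p] from Part 3(c) is easy to evaluate numerically. The sketch below assumes rate and entropy are measured in bits; the bisection-based inverse of the binary entropy function and the parameter choices (p = 0.1) are illustrative, not part of the required proof.

```python
import math

def H2(x):
    """Binary entropy in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def H2_inv(y):
    """Inverse of H2 restricted to [0, 1/2], computed by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if H2(mid) < y:        # H2 is increasing on [0, 1/2]
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def D(tau, p):
    """Binary KL divergence D[tau || p] in bits, for tau, p in (0, 1)."""
    return (tau * math.log2(tau / p)
            + (1 - tau) * math.log2((1 - tau) / (1 - p)))

def U(R, p):
    """Upper bound on the BSC error exponent at rate R."""
    return D(H2_inv(1 - R), p)

p = 0.1
for R in (0.2, 0.4, 0.53):     # capacity is 1 - H2(0.1), about 0.531 bits
    print(R, U(R, p))
```

As expected, U(R) decreases in R and vanishes as R approaches the capacity 1 - H2(p), since there H^{-1}(1 - R) = p and the divergence is zero.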
4. Your question. Make an interesting question (with solutions if possible) on the recent material covered in class. A good question, for example, may be one that points out key concepts that are often overlooked, or highlights the relationships among seemingly unrelated concepts. Open problems that cannot be easily solved are encouraged.

Suggested areas: joint source-channel coding, error exponents, Gaussian channels, ...

[N.B. Please submit your LaTeX source if possible. Your questions will be compiled together and help your classmates and subsequent students learn the material.]

