MIT OpenCourseWare
http://ocw.mit.edu

6.453 Quantum Optical Communication
Spring 2009

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.

Massachusetts Institute of Technology
Department of Electrical Engineering and Computer Science

6.453 Quantum Optical Communication

Problem Set 1
Fall 2008

Issued: Thursday, September 4, 2008
Due: Thursday, September 11, 2008

Reading: For probability review: Chapter 3 of J. H. Shapiro, Optical Propagation, Detection, and Communication. For linear algebra review: Section 2.1 of M. A. Nielsen and I. L. Chuang, Quantum Computation and Quantum Information.

Problem 1.1
Here we shall verify the elementary properties of the 1-D Gaussian probability density function (pdf),

    p_x(X) = \frac{e^{-(X-m)^2/2\sigma^2}}{\sqrt{2\pi\sigma^2}}.

(a) By converting from rectangular to polar coordinates, using X − m = R cos(Φ) and Y − m = R sin(Φ), show that

    \left[ \int_{-\infty}^{\infty} dX\, e^{-(X-m)^2/2\sigma^2} \right]^2
    = \int_{-\infty}^{\infty} dX \int_{-\infty}^{\infty} dY\, e^{-(X-m)^2/2\sigma^2 - (Y-m)^2/2\sigma^2}
    = 2\pi\sigma^2,

thus verifying the normalization constant for the Gaussian pdf.

(b) By completing the square in the exponent within the integrand,

    \int_{-\infty}^{\infty} dX\, \frac{e^{jvX - (X-m)^2/2\sigma^2}}{\sqrt{2\pi\sigma^2}},

verify that

    M_x(jv) = e^{jvm - v^2\sigma^2/2}

is the characteristic function associated with the Gaussian pdf.

(c) Differentiate M_x(jv) to verify that E(x) = m; differentiate once more to verify that var(x) = σ².

Problem 1.2
Here we shall verify the elementary properties of the Poisson probability mass function (pmf),

    P_x(n) = \frac{m^n}{n!}\, e^{-m}, \quad \text{for } n = 0, 1, 2, \ldots, \text{ and } m \ge 0.

(a) Use the power series e^z = \sum_{n=0}^{\infty} z^n/n! to verify that the Poisson pmf is properly normalized.

(b) Use the power series for e^z to verify that M_x(jv) = \exp[m(e^{jv} - 1)] is the characteristic function associated with the Poisson pmf.

(c) Differentiate M_x(jv) to verify that E(x) = m; differentiate once more to verify that var(x) = m.
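The "verify that" statements in Problems 1.1 and 1.2 can also be spot-checked numerically before doing the algebra. The sketch below is not part of the original problem set; it assumes NumPy and SciPy are available, uses arbitrary test values for m, σ, and v, and simply compares the integrals and sums against the closed forms quoted in the problem statements.

# Numerical sanity check for Problems 1.1 and 1.2 (illustration only).
import numpy as np
from scipy.integrate import quad
from scipy.special import factorial

m, sigma, v = 1.3, 0.7, 0.9   # arbitrary test values

# Problem 1.1(a): the squared Gaussian integral should equal 2*pi*sigma^2.
g = lambda X: np.exp(-(X - m)**2 / (2 * sigma**2))
I, _ = quad(g, -np.inf, np.inf)
print(I**2, 2 * np.pi * sigma**2)

# Problem 1.1(b): characteristic function M_x(jv) = exp(jvm - v^2 sigma^2 / 2).
pdf = lambda X: g(X) / np.sqrt(2 * np.pi * sigma**2)
re_part, _ = quad(lambda X: np.cos(v * X) * pdf(X), -np.inf, np.inf)
im_part, _ = quad(lambda X: np.sin(v * X) * pdf(X), -np.inf, np.inf)
print(re_part + 1j * im_part, np.exp(1j * v * m - v**2 * sigma**2 / 2))

# Problem 1.2(a)-(b): Poisson normalization and characteristic function,
# truncating the sum at n = 100 (the tail is negligible for this m).
n = np.arange(101)
pmf = m**n * np.exp(-m) / factorial(n)
print(pmf.sum())   # should be very close to 1
print(np.sum(np.exp(1j * v * n) * pmf), np.exp(m * (np.exp(1j * v) - 1)))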
Problem 1.3
Let x be a Rayleigh random variable, i.e., x has pdf

    p_x(X) = \begin{cases} \frac{X}{\sigma^2}\, e^{-X^2/2\sigma^2}, & \text{for } X \ge 0, \\ 0, & \text{otherwise}, \end{cases}

and let y = x².

(a) Find p_y(Y), the pdf of y.

(b) Find m_y and σ_y², the mean and variance of the random variable y.

Problem 1.4
Let x and y be statistically independent, identically distributed, zero-mean, variance-σ² Gaussian random variables, i.e., the joint pdf for x and y is

    p_{x,y}(X, Y) = \frac{e^{-X^2/2\sigma^2 - Y^2/2\sigma^2}}{2\pi\sigma^2}.

Suppose we regard (x, y) as the Cartesian coordinates of a point in the plane, and let (r, φ) be the polar-coordinate representation of this point, viz., x = r cos(φ) and y = r sin(φ) for r ≥ 0 and 0 ≤ φ < 2π.

(a) Find p_{r,φ}(R, Φ), the joint pdf of r and φ.

(b) Find the marginal pdfs, p_r(R) and p_φ(Φ), of these random variables, and prove that r and φ are statistically independent random variables.

Problem 1.5
Let N, x be joint random variables. Suppose that x is exponentially distributed with mean m, i.e.,

    p_x(X) = \begin{cases} \frac{e^{-X/m}}{m}, & \text{for } X \ge 0, \\ 0, & \text{otherwise}, \end{cases}

is the pdf of x. Also suppose that, given x = X, N is Poisson distributed with mean value X, i.e., the conditional pmf of N is

    P_{N|x}(n \mid x = X) = \frac{X^n}{n!}\, e^{-X}, \quad \text{for } n = 0, 1, 2, \ldots

(a) Use the integral formula

    \int_0^{\infty} dZ\, Z^n e^{-Z} = n!, \quad \text{for } n = 0, 1, 2, \ldots

(where 0! = 1) to find P_N(n), the unconditional pmf of N.

(b) Find M_N(jv), the characteristic function associated with your unconditional pmf from (a).

(c) Find E(N) and var(N), the unconditional mean and variance of N, by differentiating your characteristic function from (b).

Problem 1.6
Let x, y be jointly Gaussian random variables with zero means m_x = m_y = 0, identical variances σ_x² = σ_y² = σ², and nonzero correlation coefficient ρ. Let w, z be two new random variables obtained from x, y by the following transformation,

    w = x\cos(\theta) + y\sin(\theta)
    z = -x\sin(\theta) + y\cos(\theta),

for θ a deterministic angle satisfying 0 < θ < π/2.

(a) Show that this transformation is a rotation in the plane, i.e., (w, z) are obtained from (x, y) by rotation through angle θ.

(b) Find p_{w,z}(W, Z), the joint pdf of w and z.

(c) Find a θ value such that w and z are statistically independent.

Problem 1.7
Here we shall examine some of the eigenvalue/eigenvector properties of an Hermitian matrix. Let x be an N-D column vector of complex numbers whose nth element is x_n, let A be an N × N matrix of complex numbers whose ijth element is a_{ij}, and let † denote conjugate transpose, so that x^† = [x_1^* x_2^* ⋯ x_N^*] and A^† is an N × N matrix whose ijth element is a_{ji}^*.

(a) Find the adjoint of A, i.e., the matrix B which satisfies (By)^†x = y^†(Ax) for all x, y ∈ C^N, where C^N is the space of N-D vectors with complex-valued elements. If B = A, for a particular matrix A, we say that A is self-adjoint, or Hermitian. Assume that A is Hermitian for parts (b)-(d).

(b) Let A have eigenvalues {µ_n : 1 ≤ n ≤ N} and normalized eigenvectors {φ_n : 1 ≤ n ≤ N} obeying

    A\phi_n = \mu_n\phi_n, \quad \text{for } 1 \le n \le N,
    \phi_n^\dagger\phi_n = 1, \quad \text{for } 1 \le n \le N.

Show that µ_n is real valued for 1 ≤ n ≤ N.

(c) Show that if µ_n ≠ µ_m then φ_n^†φ_m = 0, i.e., eigenvectors associated with distinct eigenvalues are orthogonal.

(d) Suppose there are two linearly independent eigenvectors, φ and φ′, which have the same eigenvalue, µ. Show that two orthogonal vectors, θ and θ′, can be constructed satisfying

    A\theta = \mu\theta, \quad A\theta' = \mu\theta', \quad \theta^\dagger\theta' = 0.

(e) Because of the results of parts (c) and (d), we can assume that {φ_n : 1 ≤ n ≤ N} is a complete orthonormal (CON) set of vectors on C^N, i.e.,

    \phi_n^\dagger\phi_m = \begin{cases} 1, & \text{for } n = m, \\ 0, & \text{for } n \ne m. \end{cases}

Let I_N be the identity matrix on this space. Show that

    I_N = \sum_{n=1}^{N} \phi_n\phi_n^\dagger.

Show that

    A = \sum_{n=1}^{N} \mu_n\phi_n\phi_n^\dagger.

Problem 1.8
Here we introduce the notion of overcompleteness. Consider 2-D real Euclidean space, R², i.e., the space of 2-D column vectors x where x^T = [x_1 x_2], with x_1 and x_2 being real numbers. Define three vectors as follows:

    x_1 = \begin{bmatrix} \sqrt{3}/2 \\ -1/2 \end{bmatrix}, \quad
    x_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \quad
    x_3 = \begin{bmatrix} -\sqrt{3}/2 \\ -1/2 \end{bmatrix}.

(a) Make a labeled sketch of these three vectors on an (x_1, x_2) plane, and find x_n^T x_m for 1 ≤ n, m ≤ 3. Are these three vectors normalized (unit length)? Are they orthogonal?

(b) Show that any two of {x_1, x_2, x_3} form a basis for the space R², i.e., any y ∈ R² can be expressed …
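Returning to Problem 1.7, the completeness and spectral-decomposition identities stated in part (e) can be illustrated numerically before they are proved. The sketch below is not part of the original problem set; it assumes NumPy is available and uses an arbitrary, randomly generated Hermitian matrix A of size N = 4.

# Numerical illustration of Problem 1.7(b) and (e) for a random Hermitian matrix.
import numpy as np

rng = np.random.default_rng(0)
N = 4
B = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
A = B + B.conj().T                      # Hermitian by construction: A^dagger = A

mu, Phi = np.linalg.eigh(A)             # eigenvalues mu_n; eigenvectors are the columns of Phi

# Part (b): eigenvalues of a Hermitian matrix are real.
print(np.allclose(mu.imag, 0))

# Part (e), completeness: sum_n phi_n phi_n^dagger = I_N.
I_N = sum(np.outer(Phi[:, n], Phi[:, n].conj()) for n in range(N))
print(np.allclose(I_N, np.eye(N)))

# Part (e), spectral decomposition: sum_n mu_n phi_n phi_n^dagger = A.
A_rebuilt = sum(mu[n] * np.outer(Phi[:, n], Phi[:, n].conj()) for n in range(N))
print(np.allclose(A_rebuilt, A))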

