RANDOM SIGNALS
EE 524, Fall 2004, #7

Random Variables

A random variable x(ξ) is a mapping that assigns a real number x to every outcome ξ from an abstract probability space. The mapping should satisfy the following two conditions:

• the set \{x(\xi) \le x\} is an event in the abstract probability space for every x;
• \Pr[x(\xi) < \infty] = 1 and \Pr[x(\xi) = -\infty] = 0.

Cumulative distribution function (cdf) of a random variable x(ξ):

    F_x(x) = \Pr\{x(\xi) \le x\}.

Probability density function (pdf):

    f_x(x) = \frac{d F_x(x)}{dx}.

Then

    F_x(x) = \int_{-\infty}^{x} f_x(v)\, dv.

Since F_x(\infty) = 1, we have the normalization condition

    \int_{-\infty}^{\infty} f_x(x)\, dx = 1.

Several important properties:

    0 \le F_x(x) \le 1, \quad F_x(-\infty) = 0, \quad F_x(\infty) = 1,
    f_x(x) \ge 0, \quad \int_{-\infty}^{\infty} f_x(x)\, dx = 1.

Simple interpretation:

    f_x(x) = \lim_{\Delta \to 0} \frac{\Pr\{x - \Delta/2 \le x(\xi) \le x + \Delta/2\}}{\Delta}.

Expectation of an arbitrary function g(x(ξ)):

    E\{g(x(\xi))\} = \int_{-\infty}^{\infty} g(x) f_x(x)\, dx.

Mean:

    \mu_x = E\{x(\xi)\} = \int_{-\infty}^{\infty} x f_x(x)\, dx.

Variance of a real random variable x(ξ):

    \mathrm{var}\{x\} = \sigma_x^2 = E\{(x - E\{x\})^2\}
                      = E\{x^2 - 2 x E\{x\} + (E\{x\})^2\}
                      = E\{x^2\} - (E\{x\})^2 = E\{x^2\} - \mu_x^2.

Complex random variables: a complex random variable is

    x(\xi) = x_R(\xi) + j\, x_I(\xi).

Although the definition of the mean remains unchanged, the definition of variance changes for complex x(ξ):

    \mathrm{var}\{x\} = \sigma_x^2 = E\{|x - E\{x\}|^2\}
                      = E\{|x|^2 - x E\{x\}^* - x^* E\{x\} + |E\{x\}|^2\}
                      = E\{|x|^2\} - |E\{x\}|^2 = E\{|x|^2\} - |\mu_x|^2.
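As a quick numerical check of the two variance identities, the following NumPy sketch estimates both sides from Monte Carlo samples; the particular distributions, parameters, and sample size are arbitrary choices made only for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000                                        # number of Monte Carlo samples (arbitrary)

    # Real case: var{x} = E{(x - E{x})^2} = E{x^2} - mu_x^2
    x = rng.normal(loc=2.0, scale=3.0, size=n)         # example: real Gaussian, mu = 2, sigma = 3
    print(np.mean((x - x.mean())**2),                  # E{(x - E{x})^2}
          np.mean(x**2) - x.mean()**2)                 # E{x^2} - mu_x^2  (both close to 9)

    # Complex case: var{x} = E{|x - E{x}|^2} = E{|x|^2} - |mu_x|^2
    z = rng.normal(1.0, 1.0, n) + 1j*rng.normal(-0.5, 2.0, n)
    print(np.mean(np.abs(z - z.mean())**2),            # E{|x - E{x}|^2}
          np.mean(np.abs(z)**2) - np.abs(z.mean())**2) # E{|x|^2} - |mu_x|^2  (both close to 5)

Both printed pairs agree up to O(1/\sqrt{n}) estimation error.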
Random Vectors

A real-valued vector containing N random variables,

    x(\xi) = [x_1(\xi), x_2(\xi), \ldots, x_N(\xi)]^T,

is called a random N-vector, or simply a random vector when the dimensionality is unimportant. A real-valued random vector is thus a mapping from an abstract probability space to the vector-valued real space R^N.

A random vector is completely characterized by its joint cumulative distribution function, defined by

    F_x(x_1, x_2, \ldots, x_N) \triangleq P[\{x_1(\xi) \le x_1\} \cap \cdots \cap \{x_N(\xi) \le x_N\}]

and often written as

    F_x(x) = P[x(\xi) \le x].

A random vector can also be characterized by its joint probability density function (pdf), defined as

    f_x(x) = \lim_{\Delta x_1, \ldots, \Delta x_N \to 0}
             \frac{P[\{x_1 < x_1(\xi) \le x_1 + \Delta x_1\} \cap \cdots \cap \{x_N < x_N(\xi) \le x_N + \Delta x_N\}]}
                  {\Delta x_1 \cdots \Delta x_N}
           = \frac{\partial}{\partial x_1} \cdots \frac{\partial}{\partial x_N} F_x(x).

The function

    f_{x_i}(x_i) = \underbrace{\int \cdots \int}_{N-1} f_x(x)\, dx_1 \cdots dx_{i-1}\, dx_{i+1} \cdots dx_N

is known as the marginal pdf and describes the individual random variables. The cdf of x can be computed from the joint pdf as

    F_x(x) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_N} f_x(v)\, dv_1\, dv_2 \cdots dv_N \triangleq \int_{-\infty}^{x} f_x(v)\, dv.

Complex random vectors:

    x(\xi) = x_R(\xi) + j\, x_I(\xi)
           = [x_{R,1}(\xi), x_{R,2}(\xi), \ldots, x_{R,N}(\xi)]^T + j\, [x_{I,1}(\xi), x_{I,2}(\xi), \ldots, x_{I,N}(\xi)]^T.

A complex random vector is thus a mapping from an abstract probability space to the vector-valued complex space C^N. The cdf of a complex-valued random vector x(ξ) is defined as

    F_x(x) \triangleq P[x(\xi) \le x] \triangleq P[\{x_R(\xi) \le x_R\} \cap \{x_I(\xi) \le x_I\}]

and its joint pdf is defined as

    f_x(x) = \lim_{\Delta x_{R,1}, \Delta x_{I,1}, \ldots, \Delta x_{R,N}, \Delta x_{I,N} \to 0}
             \frac{P[\{x_R < x_R(\xi) \le x_R + \Delta x_R\} \cap \{x_I < x_I(\xi) \le x_I + \Delta x_I\}]}
                  {\Delta x_{R,1}\, \Delta x_{I,1} \cdots \Delta x_{R,N}\, \Delta x_{I,N}}
           = \frac{\partial}{\partial x_{R,1}} \frac{\partial}{\partial x_{I,1}} \cdots \frac{\partial}{\partial x_{R,N}} \frac{\partial}{\partial x_{I,N}} F_x(x).

The cdf of x can be computed from the joint pdf as

    F_x(x) = \int_{-\infty}^{x_{R,1}} \cdots \int_{-\infty}^{x_{I,N}} f_x(v)\, dv_{R,1}\, dv_{I,1} \cdots dv_{R,N}\, dv_{I,N} \triangleq \int_{-\infty}^{x} f_x(v)\, dv,

where the single integral in the last expression is compact notation for a multidimensional integral and should not be confused with a complex contour integral. Note that

    F_x(x) = \int_{-\infty}^{\tilde{x}} f_{\tilde{x}}(\tilde{v})\, d\tilde{v} = \int_{-\infty}^{x} f_x(v)\, dv,
    where \tilde{x} = [x_R^T, x_I^T]^T.

For two random variables, x = [x, y]^T, we write f_x(x) = f_{x,y}(x, y). The variables x and y are independent if

    f_{x,y}(x, y) = f_x(x)\, f_y(y), which implies E\{xy\} = E\{x\} E\{y\}.

Expectation of a function g(x(ξ)):

    E\{g(x)\} = \int_{-\infty}^{\infty} g(x) f_x(x)\, dx.

For two random variables, x = [x, y]^T:

    E\{g(x, y)\} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f_{x,y}(x, y)\, dx\, dy.

Correlation

Real correlation:

    r_{x,y} = E\{x y\} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y\, f_{x,y}(x, y)\, dx\, dy.

Real covariance:

    r_{x,y} = E\{(x - \mu_x)(y - \mu_y)\} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_x)(y - \mu_y)\, f_{x,y}(x, y)\, dx\, dy.

Complex correlation:

    r_{x,y} = E\{x y^*\} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y^*\, f_{x,y}(x, y)\, dx\, dy.

Complex covariance:

    r_{x,y} = E\{(x - \mu_x)(y - \mu_y)^*\} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_x)(y - \mu_y)^*\, f_{x,y}(x, y)\, dx\, dy.

Covariance Matrix

Mean vector:

    \mu_x = E\{x\}.

Real covariance matrix:

    R_x = E\{(x - E\{x\})(x - E\{x\})^T\} = E\{x x^T\} - E\{x\} E\{x\}^T,
    R_x = E\{x x^T\}  if E\{x\} = 0.

Complex covariance matrix:

    R_x = E\{(x - E\{x\})(x - E\{x\})^H\} = E\{x x^H\} - E\{x\} E\{x\}^H,
    R_x = E\{x x^H\}  if E\{x\} = 0.

Observe the following property of the complex correlation:

    r_{i,k} = E\{x_i x_k^*\} = (E\{x_k x_i^*\})^* = r_{k,i}^*.

Then, for E\{x\} = 0,

    R_x = E\{x x^H\}
        = \begin{bmatrix} r_{1,1} & r_{1,2} & \cdots & r_{1,N} \\
                          r_{2,1} & r_{2,2} & \cdots & r_{2,N} \\
                          \vdots  & \vdots  & \ddots & \vdots  \\
                          r_{N,1} & r_{N,2} & \cdots & r_{N,N} \end{bmatrix}
        = \begin{bmatrix} r_{1,1}   & r_{1,2}   & \cdots & r_{1,N} \\
                          r_{1,2}^* & r_{2,2}   & \cdots & r_{2,N} \\
                          \vdots    & \vdots    & \ddots & \vdots  \\
                          r_{1,N}^* & r_{2,N}^* & \cdots & r_{N,N} \end{bmatrix}.

The covariance matrix is Hermitian. It is positive semidefinite because, with z = x - E\{x\},

    b^H R_x b = b^H E\{(x - E\{x\})(x - E\{x\})^H\} b = b^H E\{z z^H\} b = E\{|b^H z|^2\} \ge 0.

Linear Transformation of Random Vectors

Linear transformation:

    y = g(x) = A x.

Mean vector:

    \mu_y = E\{A x\} = A \mu_x.

Covariance matrix:

    R_y = E\{y y^H\} - \mu_y \mu_y^H
        = E\{A x x^H A^H\} - A \mu_x \mu_x^H A^H
        = A \big( E\{x x^H\} - \mu_x \mu_x^H \big) A^H
        = A R_x A^H.

Gaussian Random Vectors

Gaussian random variables:

    f_x(x) = \frac{1}{\sigma_x \sqrt{2\pi}} \exp\Big\{ -\frac{(x - \mu_x)^2}{2 \sigma_x^2} \Big\}   for real x,

    f_x(x) = \frac{1}{\pi \sigma_x^2} \exp\Big\{ -\frac{|x - \mu_x|^2}{\sigma_x^2} \Big\}   for complex x.

Real Gaussian random vectors:

    f_x(x) = \frac{1}{(2\pi)^{N/2} |R_x|^{1/2}} \exp\Big\{ -\tfrac{1}{2} (x - \mu_x)^T R_x^{-1} (x - \mu_x) \Big\}.

Complex Gaussian random vectors:

    f_x(x) = \frac{1}{\pi^N |R_x|} \exp\Big\{ -(x - \mu_x)^H R_x^{-1} (x - \mu_x) \Big\}.

Symbolic notation for real and complex Gaussian random vectors:

    x \sim N_r(\mu_x, R_x)  (real),    x \sim N_c(\mu_x, R_x)  (complex).

A linear transformation of a Gaussian vector is also Gaussian, i.e. if y = A x, then

    y \sim N_r(A \mu_x, A R_x A^T)  (real),    y \sim N_c(A \mu_x, A R_x A^H)  (complex).

Complex Gaussian Distribution

Consider the joint pdf of the real and imaginary parts of a complex vector x = u + j v, and let z = [u^T, v^T]^T. The 2n-variate Gaussian pdf of the (real!) vector z is

    f_z(z) = \frac{1}{\sqrt{(2\pi)^{2n} |R_z|}} \exp\Big\{ -\tfrac{1}{2} (z - \mu_z)^T R_z^{-1} (z - \mu_z) \Big\},

where

    \mu_z = \begin{bmatrix} \mu_u \\ \mu_v \end{bmatrix}, \quad
    R_z = \begin{bmatrix} R_{uu} & R_{uv} \\ R_{vu} & R_{vv} \end{bmatrix}.

That is,

    P[z \in \Omega] = \int_{z \in \Omega} f_z(z)\, dz.

Complex Gaussian Distribution (cont.)

Suppose R_z happens to have the special structure

    R_{uu} = R_{vv}   and   R_{uv} = -R_{vu}.

(Note that R_{uv} = R_{vu}^T by construction.) Then, we can define a
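This block structure can be illustrated numerically. If a zero-mean complex Gaussian vector x = u + jv is generated so that E\{x x^T\} = 0 (often called a "circular" or "proper" vector; for zero mean this condition is equivalent to the block structure above), then sample estimates of the blocks of R_z satisfy R_uu ≈ R_vv and R_uv ≈ −R_vu. A minimal NumPy sketch, in which the covariance used to generate the data is an arbitrary choice:

    import numpy as np

    rng = np.random.default_rng(1)
    N, M = 3, 200_000                       # vector dimension and number of realizations (arbitrary)

    # Zero-mean complex Gaussian samples x = L w with E{w w^H} = I and E{w w^T} = 0,
    # so that E{x x^H} = L L^H and E{x x^T} = 0 (circular / proper).
    B = rng.normal(size=(N, N)) + 1j*rng.normal(size=(N, N))
    L = np.linalg.cholesky(B @ B.conj().T)  # factor of an arbitrary Hermitian positive-definite covariance
    w = (rng.normal(size=(N, M)) + 1j*rng.normal(size=(N, M))) / np.sqrt(2)
    x = L @ w                               # columns are realizations of x(xi)

    u, v = x.real, x.imag                   # x = u + j v,  z = [u^T, v^T]^T
    Ruu = (u @ u.T) / M
    Rvv = (v @ v.T) / M
    Ruv = (u @ v.T) / M
    Rvu = (v @ u.T) / M

    print(np.max(np.abs(Ruu - Rvv)))        # small (estimation error only): R_uu = R_vv
    print(np.max(np.abs(Ruv + Rvu)))        # small: R_uv = -R_vu
    print(np.max(np.abs(Ruv - Rvu.T)))      # ~machine zero: R_uv = R_vu^T holds by construction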


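Similarly, the covariance-matrix and linear-transformation results from the earlier sections (R_x is Hermitian and positive semidefinite, and y = A x gives R_y = A R_x A^H) can be verified on the same kind of simulated data. Another minimal sketch, with R_x and A again chosen arbitrarily for illustration:

    import numpy as np

    rng = np.random.default_rng(2)
    N, M = 3, 200_000                       # dimension and number of realizations (arbitrary)

    # Zero-mean complex data with (arbitrary) true covariance Rx = B B^H
    B = rng.normal(size=(N, N)) + 1j*rng.normal(size=(N, N))
    Rx = B @ B.conj().T
    w = (rng.normal(size=(N, M)) + 1j*rng.normal(size=(N, M))) / np.sqrt(2)
    x = np.linalg.cholesky(Rx) @ w          # E{x} = 0, E{x x^H} = Rx

    # Sample covariance Rx_hat = (1/M) sum_m x_m x_m^H is Hermitian and positive semidefinite
    Rx_hat = (x @ x.conj().T) / M
    print(np.allclose(Rx_hat, Rx_hat.conj().T))          # True: Hermitian
    print(np.linalg.eigvalsh(Rx_hat).min())              # > 0: positive (semi)definite

    # Linear transformation y = A x  =>  R_y = A R_x A^H
    A = rng.normal(size=(2, N)) + 1j*rng.normal(size=(2, N))
    y = A @ x
    Ry_hat = (y @ y.conj().T) / M
    print(np.max(np.abs(Ry_hat - A @ Rx @ A.conj().T)))  # small (estimation error only)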