October 17, 19, 2013, J. A. O'Sullivan, ESE 523 Lecture 15-16

ESE 523 Information Theory, Lectures 15-16
Joseph A. O'Sullivan
Samuel C. Sachs Professor, Electrical and Systems Engineering
Washington University
211 Urbauer Hall / 2120E Green Hall
314-935-4173
[email protected]

Outline: Gaussian Channel Coding Theorem (Chapter 9)
- Theorem statement (notation, rates, etc.)
- Forward part (achievability)
- Converse
- Extensions: feedback capacity, parallel channels, colored noise, continuous-time channel (LTI plus stationary noise)
- Problems

Gaussian Channel
The output equals the input plus white Gaussian noise of variance $N$; the input is power limited to $P$ (units are implicit):
\[ Y_i = X_i + Z_i, \qquad Z_i \sim \mathcal{N}(0, N), \qquad \frac{1}{n} \sum_{i=1}^{n} x_i^2 \le P. \]
Block diagram: $W \to$ Encoder $\to X^n \to$ Channel $p(y|x) \to Y^n \to$ Decoder $\to \hat{W}$.

A Channel Code (same as for the discrete case)
An $(M, n)$ code consists of:
- an index set $\{1, 2, \ldots, M\}$;
- an encoding function $X^n : \{1, 2, \ldots, M\} \to \mathbb{R}^n$ such that the power constraint is satisfied; and
- a decoding function $g : \mathbb{R}^n \to \{1, 2, \ldots, M\}$.
The probability of error for message $w$ is
\[ \lambda_w = P\left( g(Y^n) \ne w \mid X^n = X^n(w) \right), \]
the maximal probability of error is $\lambda^{(n)} = \max_{w \in \{1, \ldots, M\}} \lambda_w$, and, for equally likely codewords, the average probability of error is
\[ P_e^{(n)} = \frac{1}{M} \sum_{w=1}^{M} \lambda_w = P\left( g(Y^n) \ne W \right). \]

Achievability and Theorem
The rate of an $(M, n)$ code is $R = (\log M)/n$ bits per transmission. The rate $R$ is achievable if there exists a sequence of $(2^{nR}, n)$ codes such that $\lambda^{(n)} \to 0$ as $n \to \infty$. The capacity is the supremum of achievable rates.

Theorem: The capacity of a Gaussian channel with power constraint $P$ and noise variance $N$ is
\[ C = \frac{1}{2} \log\left( 1 + \frac{P}{N} \right) \]
bits per transmission.
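The capacity expression is straightforward to evaluate numerically. A minimal sketch (the helper name `gaussian_capacity` is ours, not from the lecture):

```python
import math

def gaussian_capacity(P, N):
    """Capacity of the power-constrained Gaussian channel, in bits per transmission."""
    # C = (1/2) log2(1 + P/N): the supremum of achievable rates.
    return 0.5 * math.log2(1.0 + P / N)

# At SNR P/N = 1 the capacity is half a bit per transmission.
print(gaussian_capacity(1.0, 1.0))   # 0.5
# At high SNR, capacity grows only logarithmically with power.
print(gaussian_capacity(15.0, 1.0))  # 2.0
```

Note the logarithmic law: multiplying the power by 16 here adds only 2 bits per transmission.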
Proof strategy
- Random coding: Gaussian codes with i.i.d. elements and independent codewords; the variance of the elements is $P - \epsilon$.
- Reveal the codebook to the encoder and decoder.
- Encoding: to send message $w$, transmit the codeword $X^n(w)$.
- Decoding: typical-set decoding. Declare $\hat{W} = w$ if $(X^n(w), Y^n)$ is jointly typical, i.e. lies in $A_\epsilon^{(n)}(X, Y)$, and no other codeword is jointly typical with $Y^n$.
- Performance analysis: define the events $E_i = \{ (X^n(i), Y^n) \in A_\epsilon^{(n)}(X, Y) \}$, use the symmetry of the codebook construction, note that the error event is a union of events, and apply the union bound.

Gaussian Typical Set (Gaussian AEP, Part 1)
The typical set is defined through the differential entropy:
\[ A_\epsilon^{(n)}(X) = \left\{ x^n : \left| -\frac{1}{n} \log f(x^n) - h(X) \right| < \epsilon \right\}. \]
If $X \sim \mathcal{N}(0, P)$, then
\[ f(x^n) = (2\pi P)^{-n/2} \exp\left( -\frac{1}{2P} \sum_{i=1}^{n} x_i^2 \right), \qquad h(X) = \frac{1}{2} \log 2\pi e P, \]
so the typical set is approximately $\left\{ x^n : \left| \frac{1}{n} \sum_{i=1}^{n} x_i^2 - P \right| < \epsilon' \right\}$. When $X^n(w)$ has i.i.d. entries drawn from the code distribution $\mathcal{N}(0, P - \epsilon)$, then for all $n$ large enough
\[ P\left( X^n(w) \in A_\epsilon^{(n)}(X) \right) \ge 1 - \epsilon \qquad \text{and} \qquad P\left( \frac{1}{n} \sum_{i=1}^{n} X_i(w)^2 > P \right) \le \epsilon. \]

Proof (forward part)
Based on random coding:
- Generate a random $(2^{nR}, n)$ code with i.i.d. entries, Gaussian, zero mean, variance $P - \epsilon$:
\[ \begin{pmatrix} x_1(1) & x_2(1) & \cdots & x_n(1) \\ x_1(2) & x_2(2) & \cdots & x_n(2) \\ \vdots & & & \vdots \\ x_1(2^{nR}) & x_2(2^{nR}) & \cdots & x_n(2^{nR}) \end{pmatrix} \]
- Reveal the codebook to the encoder and decoder.
- Generate a random message $W = w$ and transmit $X^n(w)$.
- The decoder receives $Y^n$, distributed according to the memoryless channel law
\[ p(y^n \mid x^n(w)) = \prod_{i=1}^{n} p(y_i \mid x_i(w)), \]
and uses typical-set decoding.

Proof of Forward Part: Typical-Set Decoding and Probability of Error
Typical-set decoding: select $X^n(w)$ if
- it is jointly typical with $Y^n$;
- it satisfies the power constraint; and
- no other codeword is jointly typical with $Y^n$.
Compute the probability of error, averaged over all codes. By symmetry of the codebook construction with respect to the code index, condition on $W = 1$:
\[ \bar{P}_e^{(n)} = P(E) = P(E \mid W = 1). \]
Define the error events
\[ E_0 = \left\{ \frac{1}{n} \sum_{i=1}^{n} X_i(1)^2 > P \right\} \ \text{(power constraint violated)}, \qquad E_i = \left\{ (X^n(i), Y^n) \in A_\epsilon^{(n)}(X, Y) \right\}, \]
so that, given $W = 1$, the error event is
\[ E = E_0 \cup E_1^c \cup E_2 \cup \cdots \cup E_{2^{nR}}. \]
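The role of the backed-off variance $P - \epsilon$ can be illustrated by simulation: the empirical power $\frac{1}{n}\sum_i X_i^2$ concentrates near $P - \epsilon$, so the probability of violating the power constraint $P$ (the event $E_0$) vanishes with $n$. A minimal Monte Carlo sketch (function name and parameter choices are ours):

```python
import math
import random

def empirical_power_violation(P=1.0, eps=0.1, n=2000, trials=400, seed=1):
    """Estimate P((1/n) * sum X_i^2 > P) for X_i i.i.d. N(0, P - eps)."""
    rng = random.Random(seed)
    sigma = math.sqrt(P - eps)  # code distribution backs off the variance by eps
    violations = 0
    for _ in range(trials):
        power = sum(rng.gauss(0.0, sigma) ** 2 for _ in range(n)) / n
        if power > P:
            violations += 1
    return violations / trials

# The law of large numbers drives this probability to 0 as n grows.
print(empirical_power_violation())
```

Had the codewords been drawn with variance exactly $P$, roughly half of them would violate the constraint; the $\epsilon$ back-off is what makes $P(E_0)$ small.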
Proof (forward part, continued)
Use the union bound:
\[ P(E \mid W = 1) \le P(E_0) + P(E_1^c) + \sum_{i=2}^{2^{nR}} P(E_i) \le \epsilon + \epsilon + 2^{nR} \, 2^{-n(I(X;Y) - 3\epsilon)} \le 3\epsilon, \]
if $R < I(X;Y) - 3\epsilon$ and $n$ is large enough. Since the probability of error averaged over all codes is small, at least one code has that average error probability. Delete the half of its codewords with the highest probability of error; the remaining code satisfies $\lambda^{(n)} \le 6\epsilon$. Since $\epsilon$ is arbitrary, there exists a sequence of codes whose probability of error goes to zero as $n$ goes to infinity.

Proof of Converse
Assume a sequence of $(2^{nR}, n)$ codes exists such that $\lambda^{(n)} \to 0$ as $n \to \infty$. Then $P_e^{(n)} \to 0$ also. By Fano's inequality,
\[ H(W \mid Y^n) \le 1 + nR P_e^{(n)}, \]
so
\[ nR = H(W) = I(W; Y^n) + H(W \mid Y^n) \le I(W; Y^n) + 1 + nR P_e^{(n)}. \]
Using the independence of the noise from the input, and then the independence bound on differential entropy,
\[ I(W; Y^n) \le I(X^n; Y^n) = h(Y^n) - h(Y^n \mid X^n) = h(Y^n) - h(Z^n) \le \sum_{i=1}^{n} \left[ h(Y_i) - h(Z_i) \right] \le \sum_{i=1}^{n} \frac{1}{2} \log\left( 1 + \frac{P_i}{N} \right), \]
where $P_i$ is the average power of the $i$th component of the codewords. For any codebook satisfying the power constraint, $\frac{1}{n} \sum_i P_i \le P$, so by concavity of the logarithm
\[ I(X^n; Y^n) \le \frac{n}{2} \log\left( 1 + \frac{P}{N} \right). \]

Converse of the Gaussian Channel Coding Theorem
Combining the bounds,
\[ nR \le I(W; Y^n) + 1 + nR P_e^{(n)} \le \frac{n}{2} \log\left( 1 + \frac{P}{N} \right) + 1 + nR P_e^{(n)}, \]
so
\[ R \le \frac{1}{2} \log\left( 1 + \frac{P}{N} \right) + \frac{1}{n} + R P_e^{(n)} \longrightarrow \frac{1}{2} \log\left( 1 + \frac{P}{N} \right) \quad \text{as } n \to \infty. \]
Thus, if $R$ is achievable, $R \le C$.
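Both halves of the theorem can be seen qualitatively in simulation. The sketch below (ours, not from the lecture) draws random Gaussian codebooks and decodes by minimum Euclidean distance, which for Gaussian noise is the maximum-likelihood rule and here stands in for joint-typicality decoding. At these short blocklengths the error rate at a rate below capacity is small but not vanishing, while above capacity it is near one.

```python
import math
import random

def awgn_random_code_error(n, M, P=1.0, N=1.0, trials=150, seed=0):
    """Monte Carlo error rate of random Gaussian codes of size M and blocklength n
    over the AWGN channel, with minimum-distance (ML) decoding."""
    rng = random.Random(seed)
    sigma_x = math.sqrt(P)
    sigma_z = math.sqrt(N)
    errors = 0
    for _ in range(trials):
        # Fresh random codebook each trial (random-coding average).
        code = [[rng.gauss(0.0, sigma_x) for _ in range(n)] for _ in range(M)]
        x = code[0]  # transmit the first codeword
        y = [xi + rng.gauss(0.0, sigma_z) for xi in x]
        # Decode to the nearest codeword in Euclidean distance.
        best = min(range(M),
                   key=lambda i: sum((y[j] - code[i][j]) ** 2 for j in range(n)))
        if best != 0:
            errors += 1
    return errors / trials

C = 0.5 * math.log2(1.0 + 1.0)               # capacity at P = N = 1 is 0.5 bit
below = awgn_random_code_error(n=40, M=256)  # rate log2(256)/40 = 0.2 < C
above = awgn_random_code_error(n=10, M=256)  # rate log2(256)/10 = 0.8 > C
print(C, below, above)  # error rate is small below capacity, near 1 above it
```

Growing $n$ at fixed rate drives the below-capacity error rate toward zero (achievability) and the above-capacity error rate toward one (converse, in its strong form).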