WUSTL ESE 523 - ESE523Lec102013

ESE 523 Information Theory - Lecture 12

Contents: Information Theory and the News; Outline; Channel Coding Theorem; Concept of Perfect Communication; Channels; "Information" Capacity; Binary Symmetric Channel; Z-Channel; Properties of C; A Channel Code; Achievability and Theorem; Achievability Proof Outline; Proof (Forward Part); Proof of Forward Part: Typical Set Decoding and Probability of Error; Jointly Typical Sequences: Asymptotic Equipartition Property; Joint AEP; Bound Probability of Error: Union Bound, Joint AEP; There Exists a Code with Low Average Error Probability; Average Probability of Error to Maximum Probability of Error; Converse; Fano's Inequality; Second Lemma; Equality in Converse; Feedback Capacity; Feedback Can Simplify Encoding: Binary Erasure Channel Example; Problems in Class.

Slide 1: Title
ESE 523 Information Theory, Lecture 12
Joseph A. O'Sullivan, Samuel C. Sachs Professor
Electrical and Systems Engineering, Washington University
2120E Green Hall, 211 Urbauer Hall
[email protected]
10/08/13

Slide 2: Information Theory and the News
http://www.insidescience.org/blog/2013/09/13/information-theory-counts-size-animal-languages
"Information Theory Counts Size Of Animal Languages." The frequency and repetition of symbols yield a measure of the information content of human languages. The symbols in English are the 26 letters of the alphabet plus a space character. For animal modes of communication, however, figuring out the symbols can be a bit trickier, and researchers don't have the benefit of huge animal language libraries that they can mine. ... he uses the term "N-gram." As an independent investigator affiliated with the Citizen Scientists League, Smith has previously used statistical methods to probe complex linguistic systems like Meroitic, an extinct and undeciphered East African language. The dolphin vocabulary has approximately 36 "words," the figure for whales is about 23, and the starling song repertoire is estimated at 119 to 202 songs.

Slide 3: Outline
- Channel coding theorem
  - Concept of perfect communication
  - "Information" capacity
  - Examples, properties
  - Definition of achievability
  - Jointly typical pairs of sequences
  - Set up the channel coding theorem
- Channel coding theorem
- Proof of the converse
  - Fano's inequality

Slide 4: Concept of Perfect Communication
Perfect communication is possible over any channel with positive achievable information transfer, $I(X_{\text{input}}; Y_{\text{output}}) > 0$:
$$\forall \varepsilon > 0,\ \exists \text{ a code } \mathcal{C} \text{ of length } n_{\mathcal{C}} \text{ such that } \max_w \Pr(\text{error} \mid \mathcal{C}) < \varepsilon.$$
Block diagram: $W \to$ Encoder $\to X^n \to$ Channel $p(y|x) \to Y^n \to$ Decoder $\to \hat{W}$.

Slide 5: Channels
- A discrete memoryless channel (DMC) is a triple $(\mathcal{X}, p(y|x), \mathcal{Y})$.
- The $n$th extension of a DMC is the channel $(\mathcal{X}^n, p(y^n|x^n), \mathcal{Y}^n)$ where
$$p(y_k \mid x^k, y^{k-1}) = p(y_k \mid x_k), \qquad k = 1, 2, \ldots, n.$$
- Without feedback,
$$p(y^n \mid x^n) = \prod_{k=1}^{n} p(y_k \mid x_k).$$

Slide 6: "Information" Capacity
Definition: the "information" capacity of a DMC is
$$C = \max_{p(x)} I(X; Y).$$
Remark: later we will show that all rates $R$ less than $C$ are achievable, where
$$R = \frac{1}{n} \log\left[\text{codebook cardinality}\right].$$

Slide 7: Binary Symmetric Channel
Inputs 0 and 1 pass through unchanged with probability $1-q$ and are flipped with probability $q$ (the crossover probability). Then
$$I(X;Y) = H(Y) - H(Y|X) = H(Y) - \sum_x p(x)\, H(Y \mid X = x) = H(Y) - h(q) \le 1 - h(q),$$
where $h$ is the binary entropy function. Equality is achieved when $P(X=0) = 0.5$, so $C = 1 - h(q)$.
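To make the BSC slide concrete, here is a short numerical sketch (my own construction, not from the lecture): it sweeps the input distribution $P(X=1)$ and evaluates $I(X;Y) = H(Y) - H(Y|X)$ directly, confirming that the maximum $1 - h(q)$ is attained at the uniform input. The crossover value q = 0.11 is an arbitrary choice for illustration.

```python
import numpy as np

def h(t):
    """Binary entropy in bits, clipped away from 0 and 1 for numerical safety."""
    t = np.clip(t, 1e-12, 1 - 1e-12)
    return -t * np.log2(t) - (1 - t) * np.log2(1 - t)

q = 0.11                             # crossover probability (arbitrary pick)
p = np.linspace(0.001, 0.999, 999)   # candidate input distributions P(X=1)
p_y1 = p * (1 - q) + (1 - p) * q     # induced output distribution P(Y=1)
I = h(p_y1) - h(q)                   # I(X;Y) = H(Y) - H(Y|X) for the BSC

print(f"argmax P(X=1) = {p[np.argmax(I)]:.3f}")   # -> 0.500
print(f"max I(X;Y)    = {I.max():.4f} bits")      # matches 1 - h(q)
print(f"1 - h(q)      = {1 - h(q):.4f} bits")
```

The same sweep works for any binary-input channel; only the line computing $P(Y=1)$ and the conditional entropy term change.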
Slide 8: Z-Channel
Transition probabilities: input 0 is received as 0 with probability 1; input 1 is received as 1 with probability 1/2 and as 0 with probability 1/2. Hence $H(Y \mid X=0) = 0$ and $H(Y \mid X=1) = 1$. With $P(X=1) = q$ (logarithms base 2),
$$I(X;Y) = H(Y) - H(Y|X) = H(Y) - p(0) \cdot 0 - p(1) \cdot 1 = -(1 - q/2)\log(1 - q/2) - (q/2)\log(q/2) - q,$$
$$C = \max_q \left[ -(1 - q/2)\log(1 - q/2) - (q/2)\log(q/2) - q \right],$$
attained at $q^* = 2/5$, giving $C = \log 5 - 2 \approx 0.322$ bits.

Slide 9: Properties of C
With $C = \max_{p(x)} I(X;Y)$:
- $C$ is nonnegative, since $I(X;Y)$ is nonnegative.
- $C \le \log|\mathcal{X}|$ and $C \le \log|\mathcal{Y}|$.
- $I(X;Y)$ is a continuous function of $p(x)$.
- $I(X;Y)$ is a concave function of $p(x)$; this is important for optimization strategies such as Blahut-Arimoto.
All of these were proved in an earlier lecture.

Slide 10: A Channel Code
An $(M, n)$ code for the channel $(\mathcal{X}, p(y|x), \mathcal{Y})$ consists of:
- an index set $\{1, 2, \ldots, M\}$;
- an encoding function $X^n : \{1, 2, \ldots, M\} \to \mathcal{X}^n$;
- a decoding function $g : \mathcal{Y}^n \to \{1, 2, \ldots, M\}$.
The probability of error for message $w$ is
$$\lambda_w = P\left(g(Y^n) \ne w \mid X^n = X^n(w)\right).$$
The maximal probability of error is $\lambda^{(n)} = \max_{w \in \{1, \ldots, M\}} \lambda_w$, and the average probability of error for equally likely codewords is
$$P_e^{(n)} = \frac{1}{M} \sum_{w=1}^{M} \lambda_w = P\left(g(Y^n) \ne W\right).$$
Block diagram: $W \to$ Encoder $X^n \to$ Channel $p(y|x) \to Y^n \to$ Decoder $g \to \hat{W}$.

Slide 11: Achievability and Theorem
- The rate of an $(M, n)$ code is $R = (\log M)/n$ bits per transmission.
- A rate $R$ is achievable if there exists a sequence of $(2^{nR}, n)$ codes such that $\lambda^{(n)} \to 0$ as $n \to \infty$.
- The capacity of a discrete memoryless channel is the supremum of achievable rates.
- Theorem (channel coding theorem): all rates below the capacity $C$ are achievable. Conversely, any sequence of $(2^{nR}, n)$ codes with $\lambda^{(n)} \to 0$ as $n \to \infty$ must have $R \le C$, where
$$C = \max_{p(x)} I(X;Y), \qquad p(x) \ge 0, \quad \sum_{x \in \mathcal{X}} p(x) = 1.$$

Slide 12: Achievability Proof Outline
- Use a random code (i.i.d. elements of codewords).
- Find the average probability of error, averaged over all codewords and over all codes.
- If that average is low, at least one code has low average probability of error (averaged over its codewords).
- Throwing away the half of the codewords with the highest probability of error gives a code with low maximum probability of error and the same asymptotic rate. (If the average error probability is at most $\varepsilon$, at least half of the $2^{nR}$ codewords satisfy $\lambda_w \le 2\varepsilon$; keeping only those reduces the rate from $R$ to $R - 1/n$, which is negligible for large $n$.)

Slide 13: Proof (Forward Part)
Based on random coding. Fix $p(x)$ and generate a random $(2^{nR}, n)$ code with i.i.d. entries drawn from $p(x)$:
$$\mathcal{C} = \begin{bmatrix} x_1(1) & x_2(1) & \cdots & x_n(1) \\ x_1(2) & x_2(2) & \cdots & x_n(2) \\ \vdots & \vdots & & \vdots \\ x_1(2^{nR}) & x_2(2^{nR}) & \cdots & x_n(2^{nR}) \end{bmatrix}, \qquad P(\mathcal{C}) = \prod_{w=1}^{2^{nR}} \prod_{i=1}^{n} p(x_i(w)).$$
- Reveal the codebook to the encoder and the decoder.
- Generate a random message $W$ with $P(W = w) = 2^{-nR}$.
- Transmit $X^n(w)$; the decoder receives $Y^n$ with
$$P\left(y^n \mid X^n(w)\right) = \prod_{i=1}^{n} p\left(y_i \mid x_i(w)\right).$$
- Use typical set decoding.

Slide 14: Proof of Forward Part: Typical Set Decoding and Probability of Error
Typical set decoding: select $X^n(w)$ if it is jointly typical with $y^n$ and no other codeword is jointly typical with $y^n$. Compute the probability of error, ...
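The random-coding experiment of slides 12-14 is easy to simulate. The sketch below is my own construction, not from the lecture: the parameters (q, n, M, eps) are arbitrary choices, and the decoder uses a reduction I derived, namely that for equiprobable inputs on a BSC, joint typicality of $(x^n, y^n)$ holds exactly when $-\frac{1}{n}\log_2 p(y^n \mid x^n)$ is within $\varepsilon$ of $H(Y|X) = h(q)$, a statistic that depends only on the Hamming distance between $x^n$ and $y^n$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (my choices): BSC crossover q, block length n,
# M codewords (so R = log2(M)/n, well below C = 1 - h(q)), typicality slack eps.
q, n, M, eps, trials = 0.1, 400, 16, 0.15, 200

def h(t):
    return -t * np.log2(t) - (1 - t) * np.log2(1 - t)

def jointly_typical(d):
    # With equiprobable inputs, joint typicality of (x^n, y^n) reduces to
    # -(1/n) log2 p(y^n|x^n) lying within eps of H(Y|X) = h(q); this
    # statistic is a function of the Hamming distance d alone.
    stat = (d / n) * np.log2(1 / q) + (1 - d / n) * np.log2(1 / (1 - q))
    return abs(stat - h(q)) < eps

errors = 0
for _ in range(trials):
    code = rng.integers(0, 2, size=(M, n))   # random i.i.d. codebook
    w = rng.integers(M)                      # uniform message W
    y = code[w] ^ (rng.random(n) < q)        # send X^n(w) through the BSC
    d = (code != y).sum(axis=1)              # Hamming distance of each codeword to y^n
    decoded = [v for v in range(M) if jointly_typical(d[v])]
    errors += decoded != [w]                 # error unless w is the unique typical codeword

print(f"R = {np.log2(M)/n:.3f}, C = {1 - h(q):.3f}, "
      f"estimated P(error) = {errors / trials:.3f}")
```

Decoding succeeds only when the transmitted codeword is the unique jointly typical one, exactly as the slide prescribes; with R far below C the estimated error probability comes out near zero, consistent with the theorem.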

