Claude Shannon: In Memoriam
Jean Walrand, U.C. Berkeley
www.eecs.berkeley.edu/~wlr

Outline
- Claude Shannon
- Entropy
- Source Coding
- Channel Coding
- Separation Theorem

Claude Shannon (4/30/1916 - 2/24/2001)
- 1937: Boolean algebra <-> logical circuits
- 1948: A Mathematical Theory of Communication

Entropy
How much information is required to convey the value of a random variable?
Key insight: the quantity of information is related to the uncertainty of that value.
- Example 1: one coin flip: 1 bit of information.
- Example 2: two coin flips: 2 bits.
- Example 3: N equally likely values: log2(N) bits.

Source Coding
How do we encode the values so as to convey them with the minimum number of bits?
Key idea: look at a sequence of outcomes X_1, X_2, ..., X_n, where each X_m is in {1, 2, ..., K}.
For n large, there are only about 2^(nH) equally likely sequences, where H is at most log2(K).
In fact, H = -sum_i p_i log2(p_i).

Source Coding (ctd.)
Example: P(X = 1) = p = 1 - P(X = 0), so
H(p) = -p log2(p) - (1 - p) log2(1 - p).
[Figure: binary entropy H(p) vs. p; H(0) = H(1) = 0, maximum H = 1 at p = 0.5.]

Source Coding (ctd.)
Thus, for large n, the 2^n possible n-bit words contain only about 2^(nH) equally likely n-bit words.

Channel Capacity
Question: how fast can one transmit bits reliably through a noisy channel?
Naive answer: no reliable transmission is possible.
Shannon's formulation: what rate is achievable in the long term if one wants the bit error rate to be arbitrarily small?
Shannon's answer: the channel capacity.

Channel Capacity (ctd.)
Example 1: the binary symmetric channel. A sent bit is received correctly with probability 1 - p and flipped with probability p:
- Sent 0 -> Received 0 (prob. 1 - p) or Received 1 (prob. p)
- Sent 1 -> Received 1 (prob. 1 - p) or Received 0 (prob. p)
Capacity: C = 1 - H(p).
[Figure: capacity C(p) = 1 - H(p) vs. p; C = 1 at p = 0 and p = 1, C = 0 at p = 0.5.]

Channel Capacity (ctd.)
Justification:
- Choose 2^(nC) n-bit codewords at random (fair coin flips).
- For one codeword sent, there are about 2^(nH) equally likely received n-bit words.
- In all, there are 2^n equally likely n-bit words.
- Hence 2^n / 2^(nH) = 2^(n(1 - H)) = 2^(nC) distinguishable codewords.

Separation Theorem
- Source with entropy H bits per symbol.
- Channel with capacity C bits per second.
- One can send C/H symbols per second.
First, source coding: n symbols -> nH bits.
Then, channel coding: send the bits with suitable codewords.
Hence: source coding and channel coding can be done separately.
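The three entropy examples can be checked numerically. Below is a minimal sketch (not part of the original slides) of the formula H = -sum_i p_i log2(p_i); the helper name `entropy` is an illustrative choice.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example 1: one fair coin flip carries 1 bit.
print(entropy([0.5, 0.5]))                    # 1.0
# Example 2: two coin flips -> 4 equally likely outcomes -> 2 bits.
print(entropy([0.25] * 4))                    # 2.0
# Example 3: N equally likely values -> log2(N) bits.
N = 8
print(entropy([1 / N] * N), math.log2(N))     # 3.0 3.0
```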
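The binary entropy formula from the Source Coding example lends itself to a quick check. The following sketch, with an assumed helper name `binary_entropy`, reproduces the shape of the plotted curve: zero at p = 0 and p = 1, maximum 1 at p = 0.5.

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p): entropy of a coin with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# H vanishes at p = 0 and p = 1 (no uncertainty) and peaks at p = 0.5.
for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:.1f}  H(p) = {binary_entropy(p):.3f}")
```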
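To make the "2^n words vs. 2^(nH) equally likely words" claim concrete, here is a hedged sketch with assumed example values n = 100 and p = 0.1 (neither appears in the slides):

```python
import math

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n, p = 100, 0.1
H = binary_entropy(p)  # ~0.469 bits per flip
# All 2^n n-bit words are possible, but only ~2^(nH) are (roughly equally) likely.
print(f"total words:   2^{n}")
print(f"typical words: 2^{n * H:.1f}")
# The likely set is a vanishing fraction of all words: 2^(-n(1 - H)).
print(f"fraction:      2^{-n * (1 - H):.1f}")
```

This is why source coding works: for large n, one only needs enough bits to index the roughly 2^(nH) likely sequences.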
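The capacity justification is a counting argument, and it can be traced numerically. The sketch below reuses the illustrative n = 100 and p = 0.1; it only checks the identity 2^n / 2^(nH) = 2^(n(1 - H)) = 2^(nC), not an actual coding scheme.

```python
import math

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n, p = 100, 0.1
H = binary_entropy(p)
C = 1 - H                       # capacity of the binary symmetric channel
# Each sent codeword smears into ~2^(nH) likely received words (a "noise ball"),
# so at most 2^n / 2^(nH) = 2^(n(1-H)) = 2^(nC) balls fit without overlapping.
print(f"C = {C:.3f} bits per channel use")
print(f"distinguishable codewords: 2^{n * (1 - H):.1f} = 2^(nC) = 2^{n * C:.1f}")
```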
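Finally, the separation theorem's rate C/H can be evaluated for a concrete case. The source distribution and the capacity figure below are hypothetical, chosen only to illustrate the arithmetic:

```python
import math

def entropy(probs):
    return -sum(q * math.log2(q) for q in probs if q > 0)

# Hypothetical source: 4 symbols with probabilities 1/2, 1/4, 1/8, 1/8.
H = entropy([0.5, 0.25, 0.125, 0.125])   # 1.75 bits per symbol
C = 1e6                                  # assumed channel capacity, bits per second
# Source coding turns n symbols into ~nH bits; channel coding delivers C bits/s,
# so the end-to-end symbol rate is C/H.
print(f"{C / H:,.0f} symbols per second")  # ~571,429
```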