
EE595A Introduction to Information Theory                    University of Washington
Winter 2004                                                  Dept. of Electrical Engineering
Handout 5: Problem Set 3 (due Fri. Feb 27th by 4:30pm)
Prof: Jeff A. Bilmes <[email protected]>                  Lecture 10, Feb 18, 2004

Book Problems: Do problems 4.8, 5.6, 5.13, 5.23, 5.24, 8.2, 8.4, and 8.9 in C&T.

Problem 1:
(a) Define the geometric mean.
(b) Why is the geometric mean called the geometric mean?
(c) What is similar about the geometric mean and entropy?

Problem 2:
(a) Encode the string 001010010101010010101010101001010101 using the basic Lempel-Ziv algorithm we described in class.
(b) Encode the string 1100011011111000000010010111100110000111 using the basic Lempel-Ziv algorithm we described in class.
(c) Can you characterize the difference between these two strings? Did you notice this difference immediately when you first glanced at them? Why or why not?

Problem 3:
You've discovered a random number generator that generates random strings of bits M_{1:n} (M_i ∈ {0, 1}) for any n. Each such bit string corresponds to sampled music recordings (one sample might be Miles Davis or John Coltrane, another might be Beethoven or Brahms, a third might be Eminem or the Sex Pistols, etc.). Since this random source corresponds to natural signals, we know it cannot have maximum entropy, meaning that either H(M_i) < 1, or H(M_{i:j}) < Σ_{k=i}^{j} H(M_k), or (more likely) both. Why is this the case?

You give this bit string to your (best) friend who, rather than listening to infinite free music (obviously not a Kazaa subscriber), does the following. She produces a new random bit string generator B_{1:n} where B_i = M_i xor R_i, and where R_i is a coin flip with entropy H = 1 (i.e., R_{1:n} is an i.i.d. maximum-entropy binary random process).
Here, xor is the function that produces the value 1 only if the two operands are not equal, and otherwise produces 0.

Q1: Is B_{1:n} a maximum-entropy binary random source in the same sense that R_{1:n} is?

If your answer is "no", meaning that B_{1:n} is not a maximum-entropy source, prove that this is true (which means that you need to prove the H(B_i) < 1 and H(B_{i:j}) < Σ_{k=i}^{j} H(B_k) conditions for all i and j).

If your answer is "yes", meaning that B is maximum entropy, then what is the entropy of the sequence N_{1:n} constructed as N_i = B_i xor R_i, where we assume that N_{1:n} is constructed using the same instance of the random vector R_{1:n} that B_{1:n} used when B_{1:n} was constructed? What is the sequence N? What happened when we combined these two sources R and B, which by themselves are entirely random, to get N, and why does this happen? Which of R or B contains the information? Explain.

Problem 4: Let X be a random variable with values in the set A = {1, ..., M}, and let P_X(m) = p_m for m ∈ A. Assume also that p_1 ≥ p_2 ≥ ... ≥ p_M.
(a) Prove for any binary Huffman code that if p_1 > 2/5, then the corresponding symbol is assigned a codeword of length 1 bit.
(b) Prove for any binary Huffman code that if p_1 < 1/3, then the corresponding symbol is assigned a codeword of length ≥ 2 bits.

Problem 5: Using the Laplace model that we outlined in class, describe the arithmetic coding algorithm for encoding a random bit string of length n with k ones, given n and k. For the case of the string of length n = 4, show in complete detail the intervals corresponding to all source substrings of length 1 −
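A note on Problem 1(c): a standard identity connects the two quantities, since 2^{-H(p)} equals the p-weighted geometric mean of the probabilities p_i themselves. A minimal Python sketch (the distribution is an arbitrary illustrative choice, not from the handout):

```python
import math

# Hypothetical dyadic distribution, chosen only for illustration.
p = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy in bits.
H = -sum(pi * math.log2(pi) for pi in p)

# p-weighted geometric mean of the probabilities:
# prod_i p_i^{p_i} = 2^{sum_i p_i log2 p_i} = 2^{-H}.
geo_mean = math.prod(pi ** pi for pi in p)

assert abs(geo_mean - 2 ** (-H)) < 1e-12
```

In words: the entropy is (minus the log of) a geometric mean, which is one way the two notions are "similar."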
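For Problem 2, one common "basic Lempel-Ziv" variant is the LZ78 incremental parsing from C&T, which splits the string into phrases (each phrase = a previously seen phrase plus one new bit) and emits (dictionary index, new bit) pairs; the version described in class may differ, so treat this as a sketch under that assumption:

```python
def lz78_parse(s: str):
    """Parse a bit string into LZ78 phrases.

    Each phrase is the longest previously seen phrase extended by one
    new symbol.  Returns a list of (phrase_index, new_symbol) pairs,
    where index 0 denotes the empty phrase.
    """
    dictionary = {"": 0}
    pairs, phrase = [], ""
    for bit in s:
        if phrase + bit in dictionary:
            phrase += bit  # still matching a known phrase; keep extending
        else:
            pairs.append((dictionary[phrase], bit))
            dictionary[phrase + bit] = len(dictionary)
            phrase = ""
    if phrase:  # leftover final phrase that was already in the dictionary
        pairs.append((dictionary[phrase[:-1]], phrase[-1]))
    return pairs

# Worked on the textbook example string, which parses into the
# phrases 1, 0, 11, 01, 010, 00, 10:
# lz78_parse("1011010100010")
# → [(0,'1'), (0,'0'), (1,'1'), (2,'1'), (4,'0'), (2,'0'), (1,'0')]
```

Running this on the two strings in parts (a) and (b) makes part (c)'s question concrete: the more structured string produces longer phrases sooner, hence fewer pairs.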
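The construction in Problem 3 can be simulated directly; the key property behind the N_i = B_i xor R_i step is that xor-ing with the same R twice is the identity. A small sketch (the biased source M is a hypothetical stand-in for the low-entropy "music" generator; the seed and length are arbitrary):

```python
import random

random.seed(0)
n = 32

# M: stand-in for a natural (low-entropy) binary source: biased coin flips.
M = [1 if random.random() < 0.9 else 0 for _ in range(n)]

# R: i.i.d. fair coin flips, a maximum-entropy binary process.
R = [random.randint(0, 1) for _ in range(n)]

# B_i = M_i xor R_i: a one-time-pad-style masking of M.
B = [m ^ r for m, r in zip(M, R)]

# N_i = B_i xor R_i, reusing the SAME instance of R.
N = [b ^ r for b, r in zip(B, R)]

# xor with the same R twice is the identity, so N recovers M exactly.
assert N == M
```

This is exactly the one-time-pad situation: B alone looks maximally random, yet combined with the particular R that produced it, the original M reappears.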

