# WUSTL ESE 523 - ee553ex105 (7 pages)


## ee553ex105

- Pages: 7
- School: Washington University in St. Louis
- Course: ESE 523 - Information Theory

Text preview:

**ESE 523 Exam 1 (EDITED)**

Professor Joseph A. O'Sullivan, October 26, 2005

This is a closed-book, closed-notes exam. It is scheduled to last for 1.5 hours. All students must do their own work and are bound by the SEAS honor code. If you do not understand a question, clearly state the assumptions used in trying to find a solution.

NAME: ________________

| Problem | 1 | 2 | 3 | 4 | 5 | 6 | Total |
|---------|---|---|---|---|---|---|-------|
| Score   |   |   |   |   |   |   |       |

**Problem 1 (15 points).** Let $X_1, X_2, \ldots, X_n$ be arbitrary discrete random variables with known joint probability distribution $p(x_1, x_2, \ldots, x_n)$. In this question you will explore the information value of additional measurements.

(a) *Future measurements.* Let $h_1(k) = H(X_1 \mid X_2, X_3, \ldots, X_k)$; that is, the entropy of $X_1$ given measurements up to time $k$. Show that $h_1(k) \ge h_1(k+1)$. Denote the gain in information from the next measurement by $i_1(k+1)$; that is, $i_1(k+1) = h_1(k) - h_1(k+1)$.

(b) *Past measurements.* Let $h_n(n-k) = H(X_n \mid X_{n-1}, X_{n-2}, \ldots, X_{n-k})$; that is, the entropy of $X_n$ given measurements in the past back to time $n-k$. Show that $h_n(n-k) \ge h_n(n-k-1)$. Denote the gain in information from the next measurement by $i_n(n-k-1)$; that is, $i_n(n-k-1) = h_n(n-k) - h_n(n-k-1)$.

(c) Show that $i_1(n) = i_n(1)$. That is, the information about the past equals the information about the future.

**Problem 2 (20 points).** A code designer claims to have designed a binary code for an alphabet of size fourteen with codelengths 2, 3, 4, 4, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6.

(a) Can these codeword lengths correspond to a prefix code? Justify your answer.

(b) Can these codeword lengths be the result of a Huffman encoding scheme? Justify your answer.

(c) Suppose that a source has fourteen symbols that are equally likely. What are the optimal binary codelengths?

(d) For the source with equally likely symbols, what is the cost, in average number of bits per symbol, of using the code provided by the designer relative to the optimal code?

**Problem 3 (15 points).** Suppose that there are eight horses in a set of i.i.d. horse races. The odds $o_i$, $i = 1, 2, \ldots, 8$, are 4, 8, 8, 8, 16, 16, 16, 16, respectively. Suppose that all horses are equally likely to win.

(a) Assuming that all money must be bet every iteration, what is the optimum doubling rate?

(b) Suppose that you do not know that the true distribution is uniform. Suppose that you bet a Dutch book; that is, you bet $b_i = 1/o_i$ on each horse and retain the fraction $1 - \sum_{i=1}^{8} b_i$ as cash. What is the doubling rate? What is the loss in doubling rate relative to the optimal strategy?

**Problem 4 (20 points).** Consider a channel whose input and output alphabets have cardinality four. The transition probabilities $Q(y_j \mid x_i) = Q_{ij}$, with $i$ indexing rows and $j$ indexing columns, are

$$Q = \begin{bmatrix} 0.75 & 0.25 & 0 & 0 \\ 0 & 0.75 & 0.25 & 0 \\ 0 & 0 & 0.75 & 0.25 \\ 0.25 & 0 & 0 & 0.75 \end{bmatrix} \tag{1}$$

(a) What is the capacity of this channel? What distribution achieves the maximum?

(b) Suppose that only the first and the third inputs are used. What is the probability of error? What is the capacity of this system?

**Problem 5 (15 points).** Consider a stationary Markov chain with four states. The transition probabilities do not depend on time and are given by $P(X_k = j \mid X_{k-1} = i) = Q_{ij}$, with $i$ indexing rows and $j$ indexing columns, where

$$Q = \begin{bmatrix} 0.75 & 0.25 & 0 & 0 \\ 0 & 0.75 & 0.25 & 0 \\ 0 & 0 & 0.75 & 0.25 \\ 0.25 & 0 & 0 & 0.75 \end{bmatrix} \tag{2}$$

Find the entropy rate of this Markov chain.

**Problem 6 (15 points).** Let $A$, $B$, $X$, and $Y$ be arbitrary discrete-valued random variables. Prove that

$$I(A;B) \ge I(X;A) + I(Y;B) - I(X,Y;A,B).$$

For extra credit, interpret the condition for equality in terms of independence of random variables.
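The Kraft-inequality and Huffman questions in Problem 2 can be sanity-checked numerically. The following Python sketch is not part of the exam and the function names are mine; it computes the Kraft sum of the designer's lengths and builds a Huffman code for fourteen equiprobable symbols:

```python
import heapq
from fractions import Fraction

def kraft_sum(lengths):
    """Sum of 2^-l over the claimed codeword lengths (Kraft inequality)."""
    return sum(Fraction(1, 2**l) for l in lengths)

def huffman_lengths(probs):
    """Codeword lengths produced by binary Huffman coding."""
    # Heap items: (probability, tie-breaker, symbol indices in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:   # each merge adds one bit to every symbol inside
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

designer = [2, 3, 4, 4, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6]
print("Kraft sum:", kraft_sum(designer))   # <= 1, so a prefix code exists

uniform = [Fraction(1, 14)] * 14
opt = sorted(huffman_lengths(uniform))
print("optimal lengths:", opt)             # two of length 3, twelve of length 4
print("extra bits/symbol:", sum(designer) / 14 - sum(opt) / 14)
```

The Kraft sum of the designer's lengths comes out well below 1, while the Huffman lengths for the uniform source are two codewords of length 3 and twelve of length 4.
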

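Problem 3 can be checked against the standard Kelly-gambling formulation, in which the doubling rate of a betting vector $b$ is $W(b) = \sum_i p_i \log_2(b_i o_i)$ and proportional betting $b_i = p_i$ is optimal when all money must be bet. This is a sketch under that textbook formulation, not the official solution:

```python
from math import log2

odds = [4, 8, 8, 8, 16, 16, 16, 16]
p = [1 / 8] * 8   # all horses equally likely

# (a) With all money bet, proportional betting b_i = p_i maximizes
# W = sum_i p_i * log2(b_i * o_i).
W_opt = sum(pi * log2(pi * oi) for pi, oi in zip(p, odds))

# (b) Dutch book: bet b_i = 1/o_i and keep 1 - sum(b_i) as cash.  Whichever
# horse wins, wealth becomes b_i*o_i + cash = 1 + cash, so the doubling
# rate is deterministic.
cash = 1 - sum(1 / oi for oi in odds)
W_dutch = log2(1 + cash)

print(f"W_opt   = {W_opt:.4f} bits/race")   # 3/8
print(f"W_dutch = {W_dutch:.4f} bits/race") # log2(9/8)
print(f"loss    = {W_opt - W_dutch:.4f}")
```
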
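For Problem 4, every row and every column of $Q$ is a permutation of $(0.75, 0.25, 0, 0)$, so the channel is symmetric in the usual sense and the uniform input is capacity-achieving. The sketch below (helper names are mine) evaluates $I(X;Y)$ at the uniform input and checks the disjoint-support observation relevant to part (b):

```python
from math import log2

# Transition matrix Q from Problem 4: rows index inputs, columns outputs.
Q = [
    [0.75, 0.25, 0.0,  0.0 ],
    [0.0,  0.75, 0.25, 0.0 ],
    [0.0,  0.0,  0.75, 0.25],
    [0.25, 0.0,  0.0,  0.75],
]

def mutual_information(p, Q):
    """I(X;Y) in bits for input distribution p and channel matrix Q."""
    n, m = len(Q), len(Q[0])
    qy = [sum(p[i] * Q[i][j] for i in range(n)) for j in range(m)]
    return sum(p[i] * Q[i][j] * log2(Q[i][j] / qy[j])
               for i in range(n) for j in range(m) if Q[i][j] > 0)

# Symmetric channel: capacity is attained by the uniform input,
# C = log2(4) - H(0.75, 0.25).
C = mutual_information([0.25] * 4, Q)
print(f"C = {C:.4f} bits")  # about 1.1887

# (b) Inputs 1 and 3 only: their rows have disjoint output supports
# ({y1, y2} vs {y3, y4}), so an ML decoder never confuses them.
print("row overlap:", [j for j in range(4) if Q[0][j] > 0 and Q[2][j] > 0])
```
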

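Problem 5 reuses the same matrix as Problem 4, now as a Markov transition matrix. For a stationary chain the entropy rate is $\sum_i \mu_i H(Q_{i\cdot})$, where $\mu$ is the stationary distribution. A small numeric sketch (helper names are mine):

```python
from math import log2

# Transition matrix from Problem 5 (same circulant matrix as Problem 4).
Q = [
    [0.75, 0.25, 0.0,  0.0 ],
    [0.0,  0.75, 0.25, 0.0 ],
    [0.0,  0.0,  0.75, 0.25],
    [0.25, 0.0,  0.0,  0.75],
]

def stationary(Q, iters=500):
    """Stationary distribution by power iteration: mu <- mu Q."""
    n = len(Q)
    mu = [1.0 / n] * n
    for _ in range(iters):
        mu = [sum(mu[i] * Q[i][j] for i in range(n)) for j in range(n)]
    return mu

def entropy_rate(Q):
    """Entropy rate sum_i mu_i * H(row_i) of a stationary Markov chain."""
    mu = stationary(Q)
    return sum(mu[i] * sum(-q * log2(q) for q in Q[i] if q > 0)
               for i in range(len(Q)))

print(f"entropy rate = {entropy_rate(Q):.4f} bits")  # H(0.75, 0.25), about 0.8113
```

Because this $Q$ is doubly stochastic, the stationary distribution is uniform and the entropy rate reduces to the common per-row entropy $H(0.75, 0.25) \approx 0.811$ bits.
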