WUSTL ESE 523 - exam198
EE 553: Exam 1
Professor Joseph A. O'Sullivan
March 27, 1998

This is a closed book, closed notes exam. It is scheduled to last for 1.5 hours. All students must do their own work and are bound by the SEAS honor code. If you do not understand a question, clearly state the assumptions used in trying to find a solution.

NAME:

Potentially useful stuff:

\sum_{k=0}^{\infty} \alpha^k = \frac{1}{1-\alpha}, \quad |\alpha| < 1    (1)

\sum_{k=0}^{\infty} k \alpha^{k-1} = \frac{1}{(1-\alpha)^2}, \quad |\alpha| < 1    (2)

1 (20 points)

Suppose that a channel has binary inputs X and binary outputs Y. Suppose that the transition probabilities are:

P(Y=1|X=1) = 0.8, \quad P(Y=0|X=1) = 0.2    (3)
P(Y=1|X=0) = 0.1, \quad P(Y=0|X=0) = 0.9    (4)

Find the capacity of this channel.

2 (20 points)

Suppose that X is a geometric random variable, so the probability that X = k is given by

P_X(k) = (1-\alpha)\alpha^k, \quad k \ge 0    (5)

where 0 < \alpha < 1.

a. Find the entropy of X.
b. Describe the typical set. Make sure that you convince me that you understand what the typical set looks like for this problem.
c. On average, how many bits does it take to represent n i.i.d. random variables X_1, X_2, \ldots, X_n?
d. Would it be a good idea to use a Huffman code for data compression for a random variable X? Explain your answer.

3 (15 points)

Let \mathcal{X} be a finite set and suppose that three probability distributions, p(x), q(x), and r(x), are defined on \mathcal{X}. Let 0 \le \lambda \le 1. We will look at a few relative entropies and entropies:

D_1 = D(p \| r)    (6)
D_2 = D(q \| r)    (7)
D_3 = D(\lambda p + (1-\lambda) q \| r)    (8)
D_4 = D(\lambda p + (1-\lambda) r \| r)    (9)
D_5 = D(\lambda q + (1-\lambda) r \| r)    (10)
H_1 = H(p)    (11)
H_2 = H(q)    (12)
H_3 = H(\lambda p + (1-\lambda) q)    (13)

Determine whether each of the following statements is true, false, or whether there is insufficient information (undetermined) to decide the truth of the statement. Do not guess, as wrong answers will be penalized.

Statement                                   | True | False | Undetermined
D_3 \le \lambda D_1 + (1-\lambda) D_2       |      |       |
D_4 \le \lambda D_1                         |      |       |
D_5 \le \lambda D_1                         |      |       |
D_4 - D_5 \le D_1 - D_2                     |      |       |
H_3 \le \lambda H_1 + (1-\lambda) H_2       |      |       |
H_1 \ge -D_1                                |      |       |

4 (15 points)

Suppose that \mathcal{X} = \{0, 1, 2\}, and that p(0) = 1 - 1/16 - 1/32, p(1) = 1/16, and p(2) = 1/32.

a. Describe a Huffman code for the random variable X.
b. Find a Shannon code for the random variable X.
c. Compare the two codes from parts a and b.
d.
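The capacity in Problem 1 can be checked numerically. The sketch below (not part of the exam; the function names are my own) computes I(X;Y) = H(Y) - H(Y|X) for a binary-input, binary-output channel and grid-searches over the input distribution P(X=1):

```python
import math

def mutual_information(p1, a, b):
    """I(X;Y) in bits for input P(X=1)=p1 on a binary channel with
    P(Y=1|X=1)=a and P(Y=1|X=0)=b."""
    def h(p):  # binary entropy in bits
        return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    py1 = p1 * a + (1 - p1) * b                   # P(Y=1)
    return h(py1) - p1 * h(a) - (1 - p1) * h(b)   # H(Y) - H(Y|X)

# Grid search over the input distribution for the channel of Problem 1.
capacity, best_p1 = max(
    (mutual_information(p / 10000, 0.8, 0.1), p / 10000) for p in range(10001)
)
print(round(capacity, 4), round(best_p1, 2))
```

For this channel the maximizing P(X=1) is close to, but not exactly, 1/2, since the channel is asymmetric; the grid search gives a capacity of roughly 0.40 bits per use.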
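For Problem 2a, the entropy of the geometric distribution has the closed form H(X) = h_b(\alpha)/(1-\alpha), where h_b is the binary entropy function; this follows from the two "potentially useful" sums above. A small cross-check (my own sketch, not the exam's solution):

```python
import math

def geometric_entropy(alpha):
    """H(X) in bits for P_X(k) = (1 - alpha) * alpha**k, k >= 0.
    Closed form: H(X) = h_b(alpha) / (1 - alpha), h_b the binary entropy."""
    h_b = -alpha * math.log2(alpha) - (1 - alpha) * math.log2(1 - alpha)
    return h_b / (1 - alpha)

def geometric_entropy_direct(alpha, terms=5000):
    """Truncated direct sum of -P(k) log2 P(k), as a numerical cross-check."""
    total = 0.0
    for k in range(terms):
        p = (1 - alpha) * alpha ** k
        if p == 0.0:          # underflow guard for large k
            break
        total -= p * math.log2(p)
    return total

print(geometric_entropy(0.5))   # 2.0 bits: h_b(1/2) / (1/2)
```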
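The statements in Problem 3 turn on convexity of relative entropy in its first argument. A numerical spot-check of the first statement, D(\lambda p + (1-\lambda)q \| r) \le \lambda D(p\|r) + (1-\lambda) D(q\|r), on random distributions (a sketch with my own helper names; a spot-check is evidence, not a proof):

```python
import math
import random

def kl(p, r):
    """Relative entropy D(p||r) in bits; assumes r[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / ri) for pi, ri in zip(p, r) if pi > 0)

def rand_dist(n, rng):
    """A random probability vector of length n."""
    w = [rng.random() + 1e-9 for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

rng = random.Random(0)
ok = True
for _ in range(1000):
    p, q, r = (rand_dist(4, rng) for _ in range(3))
    lam = rng.random()
    mix = [lam * pi + (1 - lam) * qi for pi, qi in zip(p, q)]
    # Convexity of D(.||r) in its first argument:
    ok &= kl(mix, r) <= lam * kl(p, r) + (1 - lam) * kl(q, r) + 1e-9
print(ok)
```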
Now suppose you were asked to find a data compression code for 20 i.i.d. copies of X. That is, the code should be defined for the random vector

X^{20} = [X_1 X_2 \ldots X_{20}].    (14)

What coding scheme would you use and why? Analyze the complexity of defining a Huffman code for X^{20}. Approximately, what would be the expected length of the code for your coding scheme?

5 (15 points)

a. What is universal data compression and how does it relate to Kolmogorov complexity?

b. Determine a good upper bound on the Kolmogorov complexity of the following data model. X^n = [X_1 X_2 \ldots X_n], and each X_i is binary. Every vector X^n can be partitioned into subsets of strings. For example,

X^n = 111001111100001000    (15)

consists of six strings, these being

111 \; 00 \; 11111    (16)
0000 \; 1 \; 000.    (17)

The model for the data consists of specifying whether the first bit is 1 or 0, specifying the number of strings, and specifying the lengths of the strings.

For what kind of data is this a good model?
For what kind of data is this a bad model?
Does this work well as a model for algorithmically random sequences?

Note that this model is essentially that used in fax
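For Problem 4, the Huffman and Shannon codeword lengths for the given three-symbol distribution can be computed directly. The sketch below builds a binary Huffman code with a heap (codeword lengths are what matter; the actual bit assignments are unique only up to relabeling) and uses the Shannon lengths \lceil \log_2(1/p) \rceil:

```python
import heapq
import math

probs = {0: 29/32, 1: 1/16, 2: 1/32}   # the distribution from Problem 4

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given distribution."""
    heap = [(p, [sym]) for sym, p in probs.items()]
    lengths = {sym: 0 for sym in probs}
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, syms1 = heapq.heappop(heap)
        p2, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1            # each merge adds one bit to the merged symbols
        heapq.heappush(heap, (p1 + p2, syms1 + syms2))
    return lengths

hl = huffman_lengths(probs)
sl = {s: math.ceil(math.log2(1 / p)) for s, p in probs.items()}   # Shannon lengths
avg_huffman = sum(probs[s] * hl[s] for s in probs)
avg_shannon = sum(probs[s] * sl[s] for s in probs)
print(hl, avg_huffman)   # lengths {0: 1, 1: 2, 2: 2}, average 35/32 bits
print(sl, avg_shannon)   # lengths {0: 1, 1: 4, 2: 5}, average 42/32 bits
```

The comparison asked for in part c falls out of the numbers: the Shannon code spends long codewords on the two rare symbols, while Huffman caps them at 2 bits.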
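The run decomposition in Problem 5 is easy to mechanize. The sketch below recovers the six strings of the example and tallies one possible description cost: the first bit, plus the number of runs, plus each run length at \lceil \log_2(L+1) \rceil bits. The costing rule is my own simplification for illustration; a real code (fax coding included) must also make the lengths self-delimiting, which the exam's "good upper bound" would have to account for.

```python
import math
from itertools import groupby

def run_length_model(bits):
    """Decompose a binary string into maximal runs and bound the description cost:
    first bit (1 bit) + number of runs + the run lengths, each length costed
    at ceil(log2(L+1)) bits (an illustrative simplification, not a real code)."""
    runs = [len(list(g)) for _, g in groupby(bits)]
    cost = 1 + math.ceil(math.log2(len(runs) + 1))
    cost += sum(math.ceil(math.log2(L + 1)) for L in runs)
    return runs, cost

runs, cost = run_length_model("111001111100001000")   # the string from Problem 5
print(runs)   # [3, 2, 5, 4, 1, 3] -- the six runs 111, 00, 11111, 0000, 1, 000
```

For data with long runs the cost grows like (number of runs) x log(run length), far below n; for data that alternates rapidly (algorithmically random sequences have runs of expected length about 2) the run count approaches n and the model saves nothing.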