MIT 6.896 - Problem Set 2
Essential Coding Theory (6.896)
Madhu Sudan
Extended Due Date: Wednesday, October 2, 2002

Problem Set 2

Instructions: See PS 1.

Problems

1. Prove the noiseless coding theorem, and its converse. (But don't turn in.)

2. Consider a Markovian source of bits, where the source consists of a 6-cycle with three successive vertices outputting 0 and three successive vertices outputting 1, with the probability of going either left or right from any vertex being exactly 1/2. Compute the rate of this source. (I expect an ab initio argument. Hopefully this will motivate you to look up Shannon's general method for computing the rate of a Markovian source.)

3. Consider a binary channel whose input/output alphabet is {0, 1}, where a 0 is transmitted faithfully as a 0 (with probability 1), but a 1 is transmitted as a 0 with probability 1/2 and as a 1 with probability 1/2. Compute the capacity of this channel. (You should prove this from scratch using only simple probabilistic facts already stated/used in class, not by referring to tools gleaned from other courses in information theory. For partial credit, you may just prove a lower bound on the capacity. The higher your bound, the more the credit.)

4. If there is a constructive solution to Shannon's noisy coding theorem with E being a linear map, then show that there is a constructive solution to Shannon's noiseless coding theorem in the case where the source produces a sequence of independent bits of bias p.
   Clarifications:
   (a) The encoding and decoding functions used in the noiseless theorem should be polynomial-time computable if the corresponding functions are polynomial-time computable in the noisy theorem.
   (b) The compression rate in the noiseless coding theorem should be arbitrarily close to H(p), assuming the rate of the encoding function in the noisy coding theorem can be made arbitrarily close to 1 − H(p).
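As a numerical sanity check for problem 3 (not the from-scratch proof the problem asks for): with P(X = 1) = q, the output is 1 only when a 1 survives, so P(Y = 1) = q/2 and H(Y | X) = q · h(1/2) = q, giving I(X; Y) = h(q/2) − q. The sketch below (function names are my own) grid-searches this expression over q.

```python
import math

def h(x):
    """Binary entropy in bits; h(0) = h(1) = 0."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def mutual_information(q):
    """I(X;Y) for this channel with P(X=1) = q.
    P(Y=1) = q/2 (a 1 survives with prob. 1/2) and H(Y|X) = q * h(1/2) = q."""
    return h(q / 2) - q

# Grid-search the capacity-achieving input distribution.
best_q = max((i / 10**5 for i in range(10**5 + 1)), key=mutual_information)
capacity = mutual_information(best_q)
print(best_q, capacity)  # maximizer near q = 2/5, value near 0.3219
```

The numerical maximum agrees with log2(5/4) ≈ 0.3219, the known capacity of this channel (a Z-channel with crossover probability 1/2); deriving that value with only elementary probabilistic facts is the actual exercise.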
5. Given codes C_1 and C_2 with encoding functions E_1 : {0, 1}^{k_1} → {0, 1}^{n_1} and E_2 : {0, 1}^{k_2} → {0, 1}^{n_2}, let E_1 ⊗ E_2 : {0, 1}^{k_1 × k_2} → {0, 1}^{n_1 × n_2} be the encoding function obtained as follows: View a message m as a k_1 × k_2 matrix. Encode the columns of m individually using the function E_1 to get an n_1 × k_2 matrix m'. Now encode the rows of m' individually using E_2 to get an n_1 × n_2 matrix that is the final encoding of m under E_1 ⊗ E_2. Let C_1 ⊗ C_2 be the code associated with E_1 ⊗ E_2.
   For i ≥ 3, let H_i denote the [2^i − 1, 2^i − i − 1, 3]_2 Hamming code. Let C_i = H_i ⊗ C_{i−1} with C_3 = H_3 be a new family of codes.
   (a) Give a lower bound on the relative minimum distance of C_i. Does it go to zero as i → ∞?
   (b) Give a lower bound on the rate of C_i. Does it go to zero as i → ∞?
   (c) Consider the following simple decoding algorithm for C_i: Decode the rows of the received vector recursively using the decoding algorithm for C_{i−1}. Then decode each column according to the Hamming decoding algorithm. Let p_i denote the probability of decoding error of this algorithm on the Binary Symmetric Channel with parameter p. Show that there exists a p > 0 such that p_i → 0 as i → ∞. (Hint: First show that p_i ≤ … [the hint is cut off here in the preview].)
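The E_1 ⊗ E_2 construction in problem 5 can be prototyped directly. A minimal sketch, with my own helper names and a standard (assumed) systematic generator matrix for the [7, 4, 3] Hamming code as E_1 and the [3, 1, 3] repetition code as E_2; it also checks the classical fact that minimum distances multiply under ⊗ (here 3 · 3 = 9):

```python
from itertools import product

# Systematic generator matrix for the [7,4,3] Hamming code (one standard
# choice of parity columns; any generator of the code would do).
G_HAMMING = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(G, message):
    """Multiply a length-k message by the k x n generator G over GF(2)."""
    n = len(G[0])
    return [sum(b * G[i][j] for i, b in enumerate(message)) % 2 for j in range(n)]

def e1(col):   # E_1: the [7,4,3] Hamming code
    return encode(G_HAMMING, col)

def e2(row):   # E_2: the [3,1,3] repetition code, applied bit by bit
    return row * 3

def tensor_encode(m):
    """E_1 (x) E_2: view m as a k1 x k2 matrix, encode its columns with E_1,
    then encode the rows of the intermediate n1 x k2 matrix with E_2."""
    k2 = len(m[0])
    cols = [e1([m[i][j] for i in range(len(m))]) for j in range(k2)]
    inter = [[cols[j][i] for j in range(k2)] for i in range(len(cols[0]))]  # n1 x k2
    return [e2(row) for row in inter]  # n1 x n2

# Minimum weight over all nonzero 4x1 messages: distances multiply, 3 * 3 = 9.
weights = [
    sum(sum(row) for row in tensor_encode([[b0], [b1], [b2], [b3]]))
    for b0, b1, b2, b3 in product([0, 1], repeat=4)
    if (b0, b1, b2, b3) != (0, 0, 0, 0)
]
print(min(weights))  # 9
```

Brute-forcing the minimum weight is only feasible for toy parameters like these; parts (a) and (b) ask for the bound in general, where the multiplicativity of distance and rate does the work.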

