LECTURE 15

Last time:
- Feedback channel: setting up the problem
- Perfect feedback
- Feedback capacity

Lecture outline:
- Data compression
- Joint source and channel coding theorem
- Converse
- Robustness
- Brain teaser

Reading: Section 8.13.

Data compression

Consider coding several symbols together: C : \mathcal{X}^n \to D^*.

The expected codeword length is L^* = \sum_{x^n \in \mathcal{X}^n} P_{X^n}(x^n) \, l(x^n).

The optimum satisfies H_D(X^n) \le L^* \le H_D(X^n) + 1.

The per-symbol codeword length therefore satisfies
\frac{H_D(X^n)}{n} \le \frac{L^*}{n} \le \frac{H_D(X^n)}{n} + \frac{1}{n}.

Thus the rate R is lower bounded by the entropy.

Channel coding

Coding theorem: for any source of messages, any rate R below C is feasible, i.e., achievable with arbitrarily small probability of error:
E_{\text{codebooks, messages}}[P_e] \le 2^{-N E_r(R, P_X)}.
For the P_X that yields capacity, E_r(R, P_X) is positive for R < C. The maximum probability of error (not just the average) is also upper bounded.

Converse: for any source of IID messages and any transmission at rate R above C, the probability of error is bounded away from 0:
R \le \frac{1}{n} + P_e R + C.

Joint source and channel coding

Is H \le C enough? Should coding be done jointly or not?

Joint source channel coding

Consider a sequence of input symbols V^n = (V_1, \ldots, V_n) and assume that V^n satisfies the AEP.

We map the sequence onto a codeword X(V^n). The receiver receives Y (a vector) and creates an estimate \hat{V}^n = g(Y) of V^n.

We consider the end-to-end probability of error: P_e^n = P(V^n \ne \hat{V}^n).

Joint source channel coding

A_\epsilon^{(n)} is a typical set if it is the set of sequences v^n \in \mathcal{V}^n with probability
2^{-n(H(V)+\epsilon)} \le P_{V^n}(v^n) \le 2^{-n(H(V)-\epsilon)}.

Since we assume that our V sequence satisfies the AEP, for every \epsilon there exists a typical set with probability Pr(A_\epsilon^{(n)}) \ge 1 - \epsilon and cardinality upper bounded by 2^{n(H(V)+\epsilon)}.

We encode only the elements in the typical set; the ones outside contribute at most \epsilon to the probability of error.

Joint source channel coding

We require at most n(H(V) + \epsilon) bits to describe the elements of the typical set. Let us create a set of messages from these sequences.

The maximum probability of any message being decoded in error is arbitrarily close to 0 as long as R = H(V) + \epsilon < C, for all n LARGE ENOUGH.

Note: the length of the channel codeword X may not be n, but it grows with n.

The receiver decodes to attempt to recover an element of the typical set. Because we have assumed that the AEP is satisfied, \epsilon can be chosen to be as small as we want.

Joint source channel coding

The probability of error is upper bounded by the probability that we are not in the typical set, plus the probability that we are in the typical set but make a decoding error.

Thus, for any upper bound \epsilon on the probability of error, there exists a large enough n such that the probability of error is upper bounded by \epsilon.

Note: the probability of error from not being in the typical set may be higher or lower than that from incorrect decoding.
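To make the typical-set encoding step concrete, here is a minimal Python sketch (not from the lecture) for an i.i.d. Bernoulli(p) source. It builds the \epsilon-typical set directly from the AEP bounds above and reports its size against the 2^{n(H(V)+\epsilon)} bound, together with its total probability, which approaches 1 only as n grows. The function names and the values of p, n, and eps are illustrative choices, not part of the lecture.

import itertools
import math

def h2(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def typical_set(n, p, eps):
    """Eps-typical set of length-n Bernoulli(p) sequences:
    all v^n with 2^{-n(H+eps)} <= P(v^n) <= 2^{-n(H-eps)}."""
    H = h2(p)
    lo, hi = 2 ** (-n * (H + eps)), 2 ** (-n * (H - eps))
    A = []
    for v in itertools.product([0, 1], repeat=n):
        k = sum(v)                              # number of ones in v^n
        prob = p ** k * (1 - p) ** (n - k)
        if lo <= prob <= hi:
            A.append(v)
    return A

if __name__ == "__main__":
    p, eps = 0.2, 0.1
    for n in (8, 12, 16):
        A = typical_set(n, p, eps)
        mass = sum(p ** sum(v) * (1 - p) ** (n - sum(v)) for v in A)
        size_bound = 2 ** (n * (h2(p) + eps))
        # Describing only A takes at most n(H+eps) bits, since |A| <= 2^{n(H+eps)}.
        print(f"n={n:2d}  |A|={len(A):6d}  2^(n(H+eps))={size_bound:10.1f}  Pr(A)={mass:.3f}")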
Thus, we have shown the forward part of the theorem: for any V_1, \ldots, V_n that satisfies the AEP, there exists a source code and a channel code such that P_e^n \to 0 as n \to \infty.

Moreover, the source coding and the channel coding may be done independently.

Converse to the joint source channel coding theorem

We need to show that a probability of error that converges to 0 as n \to \infty implies that H(V) \le C.

Let us use Fano's inequality:
H(V^n \mid \hat{V}^n) \le 1 + n P_e^n \log |\mathcal{V}|.

Let us now use the definition of entropy rate: the entropy rate of a stochastic process V is \lim_{n \to \infty} \frac{1}{n} H(V^n), if it exists.

Equivalently, since H(V^n) = H(V^n \mid \hat{V}^n) + I(V^n; \hat{V}^n), the entropy rate is the limit of \frac{1}{n} H(V^n \mid \hat{V}^n) + \frac{1}{n} I(V^n; \hat{V}^n).

Converse to the joint source channel coding theorem

Hence, we have that
H(V) \le \frac{1}{n}\left(1 + n P_e^n \log |\mathcal{V}|\right) + \frac{1}{n} I(V^n; \hat{V}^n)
       = \frac{1}{n} + P_e^n \log |\mathcal{V}| + \frac{1}{n} I(V^n; \hat{V}^n)
       \le \frac{1}{n} + P_e^n \log |\mathcal{V}| + \frac{1}{n} I(X; Y)   (from the data processing theorem)
       \le \frac{1}{n} + P_e^n \log |\mathcal{V}| + C.

Since P_e^n \to 0, for the above to hold for all n it must be that H(V) \le C.
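To make Fano's inequality, the key step in the converse above, concrete, here is a small Python sketch that is not from the lecture. For randomly drawn joint distributions of a symbol V and its estimate \hat{V} over an alphabet of size |V|, it computes H(V \mid \hat{V}) and the error probability P_e = P(V \ne \hat{V}), and checks the loosened form used above, H(V \mid \hat{V}) \le 1 + P_e \log |\mathcal{V}| (the n = 1 case). The alphabet size, the number of trials, and the function names are illustrative choices.

import math
import random

def cond_entropy(p_joint):
    """H(V | Vhat) in bits, for a joint pmf given as a dict {(v, vhat): prob}."""
    p_vhat = {}                                  # marginal of the estimate
    for (v, vh), pr in p_joint.items():
        p_vhat[vh] = p_vhat.get(vh, 0.0) + pr
    h = 0.0
    for (v, vh), pr in p_joint.items():
        if pr > 0:
            h -= pr * math.log2(pr / p_vhat[vh])
    return h

def random_joint(alphabet_size):
    """A random joint pmf over pairs (v, vhat)."""
    weights = [random.random() for _ in range(alphabet_size ** 2)]
    total = sum(weights)
    pairs = [(v, vh) for v in range(alphabet_size) for vh in range(alphabet_size)]
    return {pair: w / total for pair, w in zip(pairs, weights)}

if __name__ == "__main__":
    random.seed(0)
    K = 4                                        # alphabet size |V|
    for _ in range(5):
        pj = random_joint(K)
        pe = sum(pr for (v, vh), pr in pj.items() if v != vh)
        lhs = cond_entropy(pj)
        rhs = 1 + pe * math.log2(K)              # Fano bound in the form used above (n = 1)
        print(f"H(V|Vhat) = {lhs:.3f} <= {rhs:.3f} = 1 + Pe*log|V|   (Pe = {pe:.3f})")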


Robustness of the joint source channel coding theorem and its converse

The joint source channel coding theorem and its converse hold under very general conditions:
- memory in the input
- memory in the channel state
- multiple access
- but not broadcast conditions

A brain teaser

MIT OpenCourseWare, http://ocw.mit.edu
6.441 Information Theory, Spring 2010