LECTURE 12

Last time:
• Strong coding theorem
• Revisiting channel and codes
• Bound on probability of error
• Error exponent

Lecture outline:
• Error exponent behavior
• Expunging bad codewords
• Error exponent positivity
• Strong coding theorem

Last time

    E_{\text{codebooks}}[P_{e,m}] \le 2^{-N(E_0(\rho, P_X) - \rho R)}

for

    E_0(\rho, P_X) = -\log \sum_{y} \Bigl[ \sum_{x} P_X(x) P_{Y|X}(y|x)^{1/(1+\rho)} \Bigr]^{1+\rho}

We need to:
• get rid of the expectation over codes, by throwing out the worst half of the codes
• show that the bound behaves well (the exponent is -N\alpha for some \alpha > 0)
• relate the bound to capacity

Error exponent

Define

    E_r(R) = \max_{0 \le \rho \le 1} \max_{P_X} \bigl( E_0(\rho, P_X) - \rho R \bigr)

Then

    E_{\text{codebooks}}[P_{e,m}] \le 2^{-N E_r(R)}
    E_{\text{codebooks, messages}}[P_e] \le 2^{-N E_r(R)}

For a BSC: [figure: E_r(R) for a binary symmetric channel]

Expunge codewords

Pick a code with 2M codewords (rate \log(2M)/N) whose average error probability meets the ensemble bound, and expunge the worst half of its codewords; each of the M surviving codewords then has error probability at most twice the original average. The new P_{e,m} is bounded as follows:

    P_{e,m} \le 2 \cdot 2^{-N \max_{0 \le \rho \le 1} \max_{P_X} \left( E_0(\rho, P_X) - \rho \frac{\log(2M)}{N} \right)}
            = 2 \cdot 2^{-N \max_{0 \le \rho \le 1} \max_{P_X} \left( E_0(\rho, P_X) - \frac{\rho}{N} - \rho \frac{\log M}{N} \right)}
            \le 4 \cdot 2^{-N \max_{0 \le \rho \le 1} \max_{P_X} \left( E_0(\rho, P_X) - \rho \frac{\log M}{N} \right)}
            = 4 \cdot 2^{-N E_r(R)}

(with R = \log(M)/N; the last inequality uses 2^{\rho} \le 2 since \rho \le 1).

Now we must consider positivity. Let P_X(x) be such that I(X;Y) > 0; we will show that E_r behaves as follows: [figure: behavior of E_r(R), positive for 0 < R < C]

Error exponent positivity

We have that:
1. E_0(\rho, P_X) > 0, \forall \rho > 0
2. I(X;Y) \ge \partial E_0(\rho, P_X)/\partial\rho > 0, \forall \rho > 0
3. \partial^2 E_0(\rho, P_X)/\partial\rho^2 \le 0, \forall \rho > 0

We can check that I(X;Y) = \partial E_0(\rho, P_X)/\partial\rho |_{\rho=0}; then showing 3 will establish the LHS of 2, and showing the RHS of 2 will establish 1 (since E_0(0, P_X) = 0). These properties are also illustrated numerically in the sketch after this proof. Let us show 3.

Error exponent positivity

To show concavity, we need to show that \forall \rho_1, \rho_2 and \forall \theta \in [0, 1]

    E_0(\rho_3, P_X) \ge \theta E_0(\rho_1, P_X) + (1-\theta) E_0(\rho_2, P_X)

for \rho_3 = \theta\rho_1 + (1-\theta)\rho_2.

We shall use the fact that

    \frac{\theta(1+\rho_1)}{1+\rho_3} + \frac{(1-\theta)(1+\rho_2)}{1+\rho_3} = 1

and Hölder's inequality:

    \sum_j c_j d_j \le \Bigl( \sum_j c_j^{1/x} \Bigr)^{x} \Bigl( \sum_j d_j^{1/(1-x)} \Bigr)^{1-x}

Let us pick

    c_j = P_X(j)^{\theta(1+\rho_1)/(1+\rho_3)} \, P_{Y|X}(k|j)^{\theta/(1+\rho_3)}
    d_j = P_X(j)^{(1-\theta)(1+\rho_2)/(1+\rho_3)} \, P_{Y|X}(k|j)^{(1-\theta)/(1+\rho_3)}
    x = \theta(1+\rho_1)/(1+\rho_3)

so that c_j d_j = P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_3)}.

Error exponent positivity (proof continued)

    \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_3)}
      \le \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_1)} \Bigr)^{\theta(1+\rho_1)/(1+\rho_3)}
          \times \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_2)} \Bigr)^{(1-\theta)(1+\rho_2)/(1+\rho_3)}

Raising both sides to the power 1+\rho_3:

    \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_3)} \Bigr)^{1+\rho_3}
      \le \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_1)} \Bigr)^{\theta(1+\rho_1)}
          \times \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_2)} \Bigr)^{(1-\theta)(1+\rho_2)}

Error exponent positivity (proof continued)

Summing over k \in \mathcal{Y} and applying Hölder's inequality once more (with exponents 1/\theta and 1/(1-\theta)):

    \sum_{k \in \mathcal{Y}} \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_3)} \Bigr)^{1+\rho_3}
      \le \sum_{k \in \mathcal{Y}} \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_1)} \Bigr)^{\theta(1+\rho_1)}
          \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_2)} \Bigr)^{(1-\theta)(1+\rho_2)}
      \le \Bigl[ \sum_{k \in \mathcal{Y}} \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_1)} \Bigr)^{1+\rho_1} \Bigr]^{\theta}
          \Bigl[ \sum_{k \in \mathcal{Y}} \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_2)} \Bigr)^{1+\rho_2} \Bigr]^{1-\theta}

Error exponent positivity (proof continued)

Taking -\log of both sides:

    -\log \sum_{k \in \mathcal{Y}} \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_3)} \Bigr)^{1+\rho_3}
      \ge -\theta \log \sum_{k \in \mathcal{Y}} \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_1)} \Bigr)^{1+\rho_1}
          - (1-\theta) \log \sum_{k \in \mathcal{Y}} \Bigl( \sum_{j \in \mathcal{X}} P_X(j) P_{Y|X}(k|j)^{1/(1+\rho_2)} \Bigr)^{1+\rho_2}

\Rightarrow E_0(\rho_3, P_X) \ge \theta E_0(\rho_1, P_X) + (1-\theta) E_0(\rho_2, P_X), so E_0 is concave!

Error exponent positivity (proof continued)

Hence the extremum of E_0(\rho, P_X) - \rho R over \rho, if it exists, occurs at \partial E_0(\rho, P_X)/\partial\rho = R, which implies that

    \frac{\partial E_0(\rho, P_X)}{\partial \rho}\Big|_{\rho=1} \le R \le \frac{\partial E_0(\rho, P_X)}{\partial \rho}\Big|_{\rho=0} = I(X;Y)

is necessary for the maximum of E_r(R, P_X) = \max_{0 \le \rho \le 1} [E_0(\rho, P_X) - \rho R] to be achieved at an interior stationary point.

We have now placed mutual information somewhere in the expression.

The critical rate is defined as

    R_{CR} = \frac{\partial E_0(\rho, P_X)}{\partial \rho}\Big|_{\rho=1}

Error exponent positivity (proof continued)

From \partial E_0(\rho, P_X)/\partial\rho = R we obtain

    \frac{\partial R}{\partial \rho} = \frac{\partial^2 E_0(\rho, P_X)}{\partial \rho^2}

Hence \partial R/\partial\rho < 0: R decreases monotonically from C to R_{CR} as \rho runs from 0 to 1.

We can write

    E_r(R, P_X) = E_0(\rho, P_X) - \rho \frac{\partial E_0(\rho, P_X)}{\partial \rho}

for E_r(R, P_X) = E_0(\rho, P_X) - \rho R (\rho gives a parametric relation); then

    \frac{\partial E_r(R, P_X)}{\partial \rho} = -\rho \frac{\partial^2 E_0(\rho, P_X)}{\partial \rho^2} > 0

Error exponent positivity (proof continued)

Taking the ratio of the two derivatives with respect to \rho,

    \frac{\partial E_r(R, P_X)}{\partial R} = -\rho

Since E_r(C, P_X) = 0 (at \rho = 0) and this slope is negative, E_r(R, P_X) is positive for R < C. Moreover

    \frac{\partial^2 E_r(R, P_X)}{\partial R^2} = -\Bigl[ \frac{\partial^2 E_0(\rho, P_X)}{\partial \rho^2} \Bigr]^{-1} > 0

thus E_r(R, P_X) is convex and decreasing in R over R_{CR} < R < C.
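As a numerical sanity check, here is a minimal Python sketch (not part of the original notes): it evaluates E_0(\rho, P_X) for a binary symmetric channel with uniform input, where the crossover probability p = 0.1 and the finite-difference step h are arbitrary illustrative choices, and verifies the three properties above: E_0 > 0 for \rho > 0, \partial E_0/\partial\rho|_{\rho=0} = I(X;Y), and concavity in \rho.

    import math

    def e0(rho, p=0.1):
        # Gallager's E_0 for a BSC with crossover probability p and uniform
        # input distribution; p = 0.1 is an arbitrary illustrative choice.
        # E_0(rho) = -log2 sum_y [ sum_x P_X(x) P_{Y|X}(y|x)^(1/(1+rho)) ]^(1+rho)
        #          = rho - (1+rho) log2( (1-p)^(1/(1+rho)) + p^(1/(1+rho)) )
        s = 1.0 / (1.0 + rho)
        return rho - (1 + rho) * math.log2((1 - p) ** s + p ** s)

    p, h = 0.1, 1e-4

    # dE_0/drho at rho = 0 should equal I(X;Y), which for the uniform-input
    # BSC is the capacity C = 1 - H(p).
    slope0 = (e0(h) - e0(0.0)) / h
    capacity = 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)
    print(slope0, capacity)              # both approximately 0.531

    # Property 1: E_0 > 0 for rho > 0; property 3: concavity in rho
    # (second difference non-positive).
    for k in range(1, 11):
        rho = k / 10
        second = (e0(rho + h) - 2 * e0(rho) + e0(rho - h)) / h ** 2
        assert e0(rho) > 0 and second < 0

The closed form in e0 comes from the uniform input making the inner sum identical for both output symbols; for any other channel one would keep the double sum over x and y.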
Error exponent positivity (proof continued)

Taking E_r(R) = \max_{P_X} E_r(R, P_X): this is the maximum of functions that are convex and decreasing in R, and so is also convex.

For the P_X that yields capacity, E_r(R, P_X) is positive for R < C. So we have positivity of the error exponent for 0 < R < C, and capacity has been introduced. This completes the coding theorem.

Note: there are degenerate cases in which \partial E_r(R, P_X)/\partial\rho is constant and \partial^2 E_r(R, P_X)/\partial\rho^2 = 0. When would that happen?
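The parametric relation can be checked numerically as well. The sketch below (again illustrative, not from the notes, reusing the same hypothetical BSC with p = 0.1 and numerical derivatives) sweeps \rho over [0, 1], sets R(\rho) = \partial E_0/\partial\rho and E_r = E_0 - \rho R(\rho), and confirms that R falls from C to R_{CR}, that E_r > 0 for R < C, and that the slope dE_r/dR tracks -\rho.

    import math

    def e0(rho, p=0.1):
        # E_0 for a BSC with crossover p and uniform input (same hypothetical
        # channel as the previous sketch).
        s = 1.0 / (1.0 + rho)
        return rho - (1 + rho) * math.log2((1 - p) ** s + p ** s)

    def de0(rho, h=1e-6):
        # central-difference estimate of dE_0/drho
        return (e0(rho + h) - e0(rho - h)) / (2 * h)

    # Parametric curve: R(rho) = dE_0/drho, E_r(rho) = E_0(rho) - rho R(rho).
    rhos = [k / 100 for k in range(101)]
    curve = [(de0(r), e0(r) - r * de0(r)) for r in rhos]

    C, R_cr = curve[0][0], curve[-1][0]  # endpoints: capacity and critical rate
    print(C, R_cr)                       # roughly 0.531 and 0.19 for p = 0.1

    # R decreases monotonically from C to R_CR, E_r > 0 for R < C, and the
    # chord slope dE_r/dR matches -rho along the curve.
    for k in range(100):
        (R1, E1), (R2, E2) = curve[k], curve[k + 1]
        assert R2 < R1 and E2 > 0
        slope = (E2 - E1) / (R2 - R1)
        assert abs(slope + (rhos[k] + rhos[k + 1]) / 2) < 1e-2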


MIT OpenCourseWare
http://ocw.mit.edu

6.441 Information Theory
Spring 2010

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms