# WUSTL ESE 523 - ESE523Lec122013 (13 pages)

## ESE523Lec122013


- Pages: 13
- School: Washington University in St. Louis
- Course: ESE 523 - Information Theory

**Unformatted text preview:**

**ESE 523 Information Theory, Lecture 14**
Joseph A. O'Sullivan, Samuel C. Sachs Professor, Electrical and Systems Engineering, Washington University
211 Urbauer Hall / 2120E Green Hall, jao@wustl.edu
10/15/13

### Chapter 8: Differential Entropy

Let $X$ be a real-valued random variable. $X$ is continuous if $F(x) = P(X \le x)$ is continuous. Let $f(x)$ be the probability density function (pdf), if it exists:

$$F(x) = \int_{-\infty}^{x} f(\tau)\,d\tau.$$

The differential entropy is

$$h(X) = -\int f(x) \log f(x)\,dx = -\int f \log f.$$

### Examples

Let $X$ be uniform on $[b, a+b]$. Then

$$h(X) = -\int_{b}^{a+b} \frac{1}{a} \log\frac{1}{a}\,dx = \log a.$$

Note that $\log a$ is negative for $a < 1$.

Let $X$ be $N(m, \sigma^2)$. Then

$$h(X) = E\left[\frac{(X-m)^2}{2\sigma^2}\log e + \frac{1}{2}\log 2\pi\sigma^2\right] = \frac{1}{2}\log e + \frac{1}{2}\log 2\pi\sigma^2 = \frac{1}{2}\log 2\pi e\sigma^2.$$

### First Part of the AEP

**Theorem.** Let the $X_i$ be i.i.d. with pdf $f(x)$. Then, by the weak law of large numbers,

$$-\frac{1}{n}\log f(X_1, X_2, \ldots, X_n) \to E[-\log f(X)] = h(X)$$

in probability, provided $E[-\log f(X)]$ is finite. Indeed,

$$-\frac{1}{n}\log f(X_1, \ldots, X_n) = -\frac{1}{n}\sum_{i=1}^{n}\log f(X_i) \to E[-\log f(X)] = h(X).$$

### Relative Entropy

The relative entropy between two densities $f$ and $g$ is

$$D(f\|g) = \int f \log\frac{f}{g},$$

where the integral is over the support of $f$. The mutual information $I(X;Y)$ is

$$I(X;Y) = \iint f(x,y)\log\frac{f(x,y)}{f(x)f(y)}\,dx\,dy = h(X) - h(X|Y) = h(Y) - h(Y|X) = D\big(f(x,y)\,\big\|\,f(x)f(y)\big).$$

**Theorem.** $D(f\|g) \ge 0$, with equality iff $f = g$. Proof:

$$-D(f\|g) = E_f\left[\log\frac{g(X)}{f(X)}\right] \le \log E_f\left[\frac{g(X)}{f(X)}\right] \le 0$$

by Jensen's inequality (the two densities are taken to have the same support).

### Clarification

$$E_f\left[\frac{g(X)}{f(X)}\right] = \int_{\{x:\,f(x)>0\}} \frac{g(x)}{f(x)}\, f(x)\,dx = \int_{\{x:\,f(x)>0\}} g(x)\,dx \le 1,$$

so

$$\log E_f\left[\frac{g(X)}{f(X)}\right] \le \log \int_{\{x:\,f(x)>0\}} g(x)\,dx \le 0.$$

### Implications of Nonnegativity of Relative Entropy

**Corollary.** $I(X;Y) \ge 0$, with equality if and only if $X$ and $Y$ are independent.

**Corollary.** Conditioning reduces entropy: $h(X|Y) \le h(X)$, with equality if and only if $X$ and $Y$ are independent.

Comment: the chain rules hold.

### Typical Sets

The typical set with respect to $f(x)$ is

$$A_\epsilon^{(n)} = \left\{(x_1, x_2, \ldots, x_n) \in \mathbb{R}^n : \left|-\frac{1}{n}\log f(x_1, x_2, \ldots, x_n) - h(X)\right| \le \epsilon\right\}, \qquad f(x_1, \ldots, x_n) = \prod_{i=1}^{n} f(x_i).$$
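The closed-form Gaussian entropy $h(X) = \frac{1}{2}\log 2\pi e\sigma^2$ can be sanity-checked numerically. A minimal sketch (function names are illustrative, not from the lecture) that estimates $h(X) = E[-\log_2 f(X)]$ by Monte Carlo and compares it with the closed form:

```python
import math
import random

def gaussian_pdf(x, m, sigma):
    """Density of N(m, sigma^2) at x."""
    return math.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def mc_differential_entropy(m, sigma, n=200_000, seed=0):
    """Monte Carlo estimate of h(X) = E[-log2 f(X)] in bits."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(m, sigma)
        total += -math.log2(gaussian_pdf(x, m, sigma))
    return total / n

sigma = 2.0
closed_form = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)
estimate = mc_differential_entropy(0.0, sigma)
print(f"closed form: {closed_form:.3f} bits, Monte Carlo: {estimate:.3f} bits")
```

With 200,000 samples the estimator's standard error is a few thousandths of a bit, so the two printed values agree to roughly two decimal places; the same check works for any density whose pdf you can evaluate.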
The volume of a set $A$ is

$$\operatorname{vol}(A) = \int_A dx_1\,dx_2\cdots dx_n.$$

### AEP

$$\Pr\left(A_\epsilon^{(n)}\right) > 1 - \epsilon \quad \text{for all } n \text{ sufficiently large},$$

$$\operatorname{vol}\left(A_\epsilon^{(n)}\right) \le 2^{n(h(X)+\epsilon)} \quad \text{for all } n,$$

$$\operatorname{vol}\left(A_\epsilon^{(n)}\right) \ge (1-\epsilon)\,2^{n(h(X)-\epsilon)} \quad \text{for all } n \text{ sufficiently large}.$$

### Gaussian Typical Set

$$A_\epsilon^{(n)}(X) = \left\{x \in \mathbb{R}^n : \left|-\frac{1}{n}\sum_{i=1}^{n}\log f(x_i) - h(X)\right| \le \epsilon\right\}.$$

If $f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-x^2/2\sigma^2}$, then $h(X) = \frac{1}{2}\log 2\pi e\sigma^2$ and

$$A_\epsilon^{(n)}(X) = \left\{x \in \mathbb{R}^n : \left|\frac{1}{n}\sum_{i=1}^{n} x_i^2 - \sigma^2\right| \le \epsilon_1\right\}, \qquad \epsilon_1 = \frac{2\sigma^2\epsilon}{\log e}.$$

**Gaussian AEP, part 1.** If $X_i \sim f(x)$ i.i.d., then

$$P\left(\left|\frac{1}{n}\sum_{i=1}^{n} X_i^2 - \sigma^2\right| \le \epsilon_1\right) > 1 - \epsilon \quad \text{for all } n \text{ large enough},$$

that is, $P\left(X^n \in A_\epsilon^{(n)}(X)\right) > 1 - \epsilon$ for all $n$ large enough.

The typical set is a spherical shell: take a sphere of squared radius $n\sigma^2$ and expand about its surface. The outer bound has squared radius $n(\sigma^2 + \epsilon_1)$; the inner bound has squared radius $n(\sigma^2 - \epsilon_1)$.

### Diversion on Shapes of Typical Sets

$$A_\epsilon^{(n)}(X) = \left\{x \in \mathbb{R}^n : \left|-\frac{1}{n}\sum_{i=1}^{n}\log f(x_i) - h(X)\right| \le \epsilon\right\}.$$

If $f(x) = \frac{1}{\lambda}e^{-x/\lambda}$ for $x \ge 0$ (exponential), then $h(X) = \log \lambda e$ and

$$A_\epsilon^{(n)}(X) = \left\{x \in \mathbb{R}_+^n : \left|\frac{1}{n}\sum_{i=1}^{n} x_i - \lambda\right| \le \epsilon_1\right\}, \qquad \epsilon_1 = \frac{\lambda\epsilon}{\log e}.$$

This is an $\epsilon_1$-expansion of a probability simplex scaled by $n\lambda$.

If $f(x) = \frac{1}{2\lambda}e^{-|x|/\lambda}$ (Laplacian), then $h(X) = \log 2\lambda e$ and

$$A_\epsilon^{(n)}(X) = \left\{x \in \mathbb{R}^n : \left|\frac{1}{n}\sum_{i=1}^{n} |x_i| - \lambda\right| \le \epsilon_1\right\}, \qquad \epsilon_1 = \frac{\lambda\epsilon}{\log e}.$$

In each orthant there is a probability simplex scaled by $n\lambda$ and expanded by $\epsilon_1$.

### Differential Entropy Table

See http://en.wikipedia.org/wiki/Differential_entropy
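The spherical-shell picture can be illustrated with a small simulation (a sketch under my own naming, not code from the lecture): for i.i.d. $N(0,\sigma^2)$ coordinates, the fraction of length-$n$ vectors whose average squared coordinate falls within $\epsilon_1$ of $\sigma^2$, i.e. that land in the typical shell, approaches 1 as $n$ grows:

```python
import math
import random

def shell_probability(sigma, eps1, n, trials=2000, seed=1):
    """Estimate P(|(1/n) sum X_i^2 - sigma^2| <= eps1) for i.i.d.
    N(0, sigma^2) coordinates, i.e. the chance a random vector lands
    in the typical spherical shell of half-width eps1 (squared radius)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        avg_sq = sum(rng.gauss(0.0, sigma) ** 2 for _ in range(n)) / n
        if abs(avg_sq - sigma ** 2) <= eps1:
            hits += 1
    return hits / trials

sigma = 1.0
eps1 = 0.1  # shell half-width in squared radius
for n in (10, 100, 1000):
    print(n, shell_probability(sigma, eps1, n))
```

By the weak law of large numbers the printed probability climbs toward 1 with $n$; for $n = 10$ most mass still lies outside the shell, while for $n = 1000$ almost all of it is inside, which is exactly the first part of the Gaussian AEP above.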
