# WUSTL ESE 523 - ESE523Lec122013 (13 pages)

## ESE523Lec122013


Pages: 13
School: Washington University in St. Louis
Course: ESE 523 - Information Theory

ESE 523 Information Theory, Lecture 14
Joseph A. O'Sullivan, Samuel C. Sachs Professor, Electrical and Systems Engineering, Washington University
211 Urbauer Hall / 2120E Green Hall, jao@wustl.edu
10/15/13

### Chapter 8: Differential Entropy

Let $X$ be a real-valued random variable. $X$ is continuous if $F(x) = P(X \le x)$ is continuous. Let $f(x)$ be the probability density function (pdf), if it exists:

$$F(x) = \int_{-\infty}^{x} f(\alpha)\,d\alpha.$$

The differential entropy is

$$h(X) = -\int f(x)\log f(x)\,dx = -\int f\log f.$$

### Examples

Let $X$ be $U(0, a)$. Then

$$h(X) = -\int_0^a \frac{1}{a}\log\frac{1}{a}\,dx = \log a,$$

and $\log a$ is negative for $a < 1$.

Let $X$ be $N(m, \sigma^2)$. Then

$$h(X) = E\left[\frac{(X-m)^2}{2\sigma^2}\log e + \frac{1}{2}\log 2\pi\sigma^2\right] = \frac{1}{2}\log e + \frac{1}{2}\log 2\pi\sigma^2 = \frac{1}{2}\log 2\pi e\sigma^2.$$

### First Part of AEP

**Theorem.** Let $X_i$ be i.i.d. with pdf $f(x)$. Then, by the weak law of large numbers,

$$-\frac{1}{n}\log f(X_1, X_2, \ldots, X_n) \to E[-\log f(X)] = h(X)$$

in probability, if $E[-\log f(X)]$ is finite.

Proof:

$$-\frac{1}{n}\log f(X_1, X_2, \ldots, X_n) = -\frac{1}{n}\sum_{i=1}^{n}\log f(X_i) \to E[-\log f(X)] = h(X).$$

### Relative Entropy

The relative entropy between two densities $f$ and $g$ is

$$D(f\|g) = \int f\log\frac{f}{g},$$

with the integral taken over the support of $f$. The mutual information $I(X;Y)$ is

$$I(X;Y) = \iint f(x,y)\log\frac{f(x,y)}{f(x)f(y)}\,dx\,dy = h(X) - h(X\mid Y) = h(Y) - h(Y\mid X) = D\bigl(f(x,y)\,\|\,f(x)f(y)\bigr).$$

**Theorem.** $D(f\|g) \ge 0$, with equality iff $f = g$.

Proof: by Jensen's inequality (for two densities with the same support),

$$-D(f\|g) = E_f\left[\log\frac{g(X)}{f(X)}\right] \le \log E_f\left[\frac{g(X)}{f(X)}\right] \le 0.$$

### Clarification

$$-D(f\|g) = E_f\left[\log\frac{g(X)}{f(X)}\right] \le \log E_f\left[\frac{g(X)}{f(X)}\right],$$

where

$$E_f\left[\frac{g(X)}{f(X)}\right] = \int_{\{x:\,f(x)>0\}} \frac{g(x)}{f(x)}\,f(x)\,dx = \int_{\{x:\,f(x)>0\}} g(x)\,dx \le 1,$$

so

$$\log E_f\left[\frac{g(X)}{f(X)}\right] \le \log\int_{\{x:\,f(x)>0\}} g(x)\,dx \le 0.$$

### Implications of Nonnegativity of Relative Entropy

**Corollary.** $I(X;Y) \ge 0$, with equality if and only if $X$ and $Y$ are independent.

**Corollary.** Conditioning reduces entropy. That is, $h(X\mid Y) \le h(X)$, with equality if and only if $X$ and $Y$ are independent.

Comment: chain rules.
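The two entropy examples can be checked numerically. The sketch below (parameter values are illustrative, not from the lecture) evaluates both closed forms in nats and confirms the Gaussian one by Monte Carlo, using $h(X) = E[-\log f(X)]$.

```python
import math
import random

random.seed(0)

# Uniform example: X ~ U(0, a) has h(X) = log a, negative for a < 1 --
# unlike discrete entropy, differential entropy can be negative.
a = 0.5
h_unif = math.log(a)  # about -0.693 nats

# Gaussian example: X ~ N(m, sigma^2) has h(X) = 0.5 * log(2*pi*e*sigma^2).
m, sigma = 1.0, 2.0
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# Monte Carlo check of h(X) = E[-log f(X)] for the Gaussian case.
def log_f(x):
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - m) ** 2 / (2 * sigma ** 2)

n = 200_000
h_mc = -sum(log_f(random.gauss(m, sigma)) for _ in range(n)) / n
```

The Monte Carlo average should land within a few hundredths of the closed form at this sample size.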
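The AEP statement lends itself to simulation. The sketch below (a standard normal source; the setup is illustrative, not from the lecture) shows the normalized log-likelihood $-\frac{1}{n}\log f(X_1,\ldots,X_n)$ approaching $h(X)$ as $n$ grows.

```python
import math
import random

random.seed(1)

# i.i.d. samples from f = N(0, 1); h(X) = 0.5 * log(2*pi*e) in nats.
h_exact = 0.5 * math.log(2 * math.pi * math.e)

def log_f(x):
    return -0.5 * math.log(2 * math.pi) - x ** 2 / 2

# -(1/n) log f(X_1,...,X_n) = -(1/n) sum_i log f(X_i) -> h(X) in probability.
estimates = {}
for n in (100, 10_000, 1_000_000):
    estimates[n] = -sum(log_f(random.gauss(0, 1)) for _ in range(n)) / n
    print(f"n={n:>9}: -(1/n) log f = {estimates[n]:.4f}   h(X) = {h_exact:.4f}")
```

By the weak law of large numbers the spread around $h(X)$ shrinks like $1/\sqrt{n}$, so the $n = 10^6$ estimate should sit within about $10^{-2}$ of the exact value.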
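Nonnegativity of relative entropy can be illustrated with two Gaussian densities. The sketch below (parameters chosen arbitrarily) estimates $D(f\|g) = E_f[\log(f/g)]$ by Monte Carlo and compares it against the standard Gaussian closed form, which is not on the slide.

```python
import math
import random

random.seed(2)

# f = N(m1, s1^2), g = N(m2, s2^2) -- illustrative parameter choices.
m1, s1 = 0.0, 1.0
m2, s2 = 1.0, 2.0

def log_pdf(x, m, s):
    return -0.5 * math.log(2 * math.pi * s ** 2) - (x - m) ** 2 / (2 * s ** 2)

# Monte Carlo: D(f||g) = E_f[log f(X) - log g(X)], sampling X ~ f.
n = 500_000
d_mc = sum(
    log_pdf(x, m1, s1) - log_pdf(x, m2, s2)
    for x in (random.gauss(m1, s1) for _ in range(n))
) / n

# Standard closed form for the relative entropy between two Gaussians:
d_exact = math.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5
```

With these parameters both values come out near 0.44 nats, strictly positive because $f \ne g$; the same estimator applied with $g = f$ gives exactly zero term by term.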
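The two corollaries can be made concrete for a jointly Gaussian pair. The sketch below (a unit-variance bivariate normal with correlation `rho`; an assumed setup for illustration, not from the lecture) uses the standard closed forms to show $h(X\mid Y) \le h(X)$ and $I(X;Y) \ge 0$, with equality exactly when $\rho = 0$, i.e. when $X$ and $Y$ are independent.

```python
import math

def cond_entropy_and_mi(rho):
    # X, Y jointly Gaussian, each N(0, 1), correlation rho.
    # Then X | Y = y ~ N(rho * y, 1 - rho^2), so:
    #   h(X)   = 0.5 * log(2*pi*e)
    #   h(X|Y) = 0.5 * log(2*pi*e*(1 - rho^2))
    #   I(X;Y) = h(X) - h(X|Y) = -0.5 * log(1 - rho^2)
    h_x = 0.5 * math.log(2 * math.pi * math.e)
    h_x_given_y = 0.5 * math.log(2 * math.pi * math.e * (1 - rho ** 2))
    return h_x, h_x_given_y, h_x - h_x_given_y

for rho in (0.0, 0.5, 0.9):
    h_x, h_cond, mi = cond_entropy_and_mi(rho)
    print(f"rho={rho}: h(X)={h_x:.4f}  h(X|Y)={h_cond:.4f}  I(X;Y)={mi:.4f}")
```

As $|\rho| \to 1$ the conditional entropy $h(X\mid Y) \to -\infty$ while $I(X;Y) \to \infty$: conditioning can reduce differential entropy without bound, yet mutual information stays nonnegative.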

