ESE523Lect2-6(1)
Course: ESE 523 - Information Theory
School: Washington University in St. Louis
Pages: 77
ESE 523 Information Theory
Joseph A. O'Sullivan
Samuel C. Sachs Professor, Electrical and Systems Engineering
Washington University, 211 Urbauer Hall / 2120E Green Hall
314-935-4173, jao@wustl.edu
09/3/13, J. A. O'Sullivan, ESE 523 Lecture 2-6

Outline

[Figure: plot of the binary entropy function H(p) = -p \log p - (1-p) \log(1-p); x-axis "Probability of One", y-axis "Entropy".]

Entropy: H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x)
Joint entropy: H(X,Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x,y) \log p(x,y)
Conditional entropy: H(X|Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x,y) \log p(x|y)
Relative entropy: D(p||q) = \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)}
Mutual information: I(X;Y) = \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x,y) \log \frac{p(x,y)}{p(x)p(y)}

Notation

X: random variable (R.V.)
Alphabet (discrete): \mathcal{X} = {x_1, x_2, ..., x_n}
Probability mass function: P(X = x_i) = p_i = p(x_i), with p_i \ge 0 and \sum_{i=1}^{n} p_i = 1
log means log_2 throughout.
Biased coin flip: X \in {h, t}, p(x) \in {p, 1-p}
Two dice: X \in {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}, p(x) \in {1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1}/36
Powerball: 1/p(x) = \binom{59}{5} \times 39 = 195,249,054

Measure of Information: Entropy

The entropy of X, H(X), is H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x). Units are bits. Entropy is a measure of the uncertainty of a R.V.:
H(X) = -E[\log p(X)] = E[\log(1/p(X))],
"the eerily self-referential expectation" (Cover and Thomas, p. 14).

Entropy Example 1

Deterministic R.V.: p(x_i) = 1 and p(x_j) = 0 for j \ne i. Then H(X) = 0: no information is gained from observing the outcome. The term 1 \log 1 = 0, and by convention 0 \log 0 = \lim_{\beta \to 0} \beta \log(1/\beta) = 0. The proof uses l'Hopital's rule:
\lim_{\beta \to 0} \beta \log(1/\beta) = \lim_{\beta \to 0} \frac{\log(1/\beta)}{1/\beta} = \lim_{\beta \to 0} \frac{(1/\beta) \log e}{1/\beta^2} = \lim_{\beta \to 0} \beta \log e = 0.

Entropy Example 2

Flip a fair coin: X \in {h, t}, p(h) = p(t) = 1/2.
H(X) = -(1/2) \log(1/2) - (1/2) \log(1/2) = 1 bit.

Entropy Example 3

Flip a fair coin n times: X \in {hh...h, ..., tt...t}, with p(x_i) = 1/2^n for i = 1, ..., 2^n.
H(X) = -\sum_{i=1}^{2^n} \frac{1}{2^n} \log \frac{1}{2^n} = n bits.

Entropy Example 4

Powerball, or any other uniform distribution: X \in {x_1, x_2, ..., x_M}, p(x_i) = 1/M for all i.
H(X) = -\sum_{i=1}^{M} \frac{1}{M} \log \frac{1}{M} = \log M.
H(Powerball) = \log 195,249,054 \approx 27.5407 bits.
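The entropy formula and the examples above can be checked with a short script. The sketch below is in Python rather than the course's Matlab, purely for illustration; the function name `entropy` and the probability values are taken from the slides.

```python
import math

def entropy(p):
    """H(X) = -sum_x p(x) log2 p(x), in bits.

    Terms with p(x) = 0 are skipped, using the convention
    0 * log 0 = 0 from Example 1.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([1.0]))        # Example 1: deterministic R.V.
print(entropy([0.5, 0.5]))   # Example 2: fair coin, 1 bit
# Two dice (Notation slide): roughly 3.27 bits
print(entropy([k / 36 for k in [1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1]]))
# Example 4: a uniform distribution on M outcomes has H = log2 M,
# so Powerball's entropy can be computed without building the pmf.
print(math.log2(195_249_054))
```

Note that the two-dice distribution has noticeably less entropy than the uniform distribution on the same 11 outcomes (log2 11, about 3.46 bits), since the middle sums are much more likely.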
Entropy Example 5

Flip a fair coin 2 times and add the number of heads: X \in {0, 1, 2}, p(0) = p(2) = 1/4, p(1) = 1/2.
H(X) = -\frac{1}{4} \log \frac{1}{4} - \frac{1}{2} \log \frac{1}{2} - \frac{1}{4} \log \frac{1}{4} = \frac{3}{2} bits.

Properties and Remarks

Entropy is the expected number of binary questions one needs to ask to determine the value of a R.V. In the last example, on average how many yes/no questions does it take to determine the outcome? Answer: 1.5 questions.
Entropy is nonnegative.
Base change gives other units: H_b(X) = -E[\log_b p(X)] = \log_b(a) \cdot (-E[\log_a p(X)]). Base e gives nats; 1 nat = \log_2 e bits \approx 1.44 bits.

Binary Entropy Function

X \in {0, 1} with probabilities p and 1-p:
H(X) = -p \log p - (1-p) \log(1-p) = H(p).
[Figure: plot of the binary entropy function; x-axis "Probability of One", y-axis "Entropy".]

Matlab Function entropy.m

    function ent = entropy(p)
    np = size(p);
    if length(np) > 1
        p = reshape(p, prod(np), 1);
    end
    ip = find(and(p > 0, p < 1));
    pp = p(ip) / sum(p(ip));
    hhp = -pp .* log2(pp);
    ent = sum(hhp);

Matlab Function plotbinentropy.m

    p = 0.0025:0.0025:(1 - 0.0025);
    onep = 1 - p;
    ent = -p .* log2(p) - onep .* log2(onep);
    ent = [0 ent 0];
    p = [0 p 1];
    figure1 = figure;
    axes1 = axes('FontSize', 16, 'Parent', figure1);
    title(axes1, 'Binary Entropy Function');
    xlabel(axes1, 'Probability of One');
    ylabel(axes1, 'Entropy');
    box(axes1, 'on');
    hold(axes1, 'all');
    plot(p, ent, 'LineWidth', 2);

Example 6: Entropy as Answer to a Combinatorics Question

From Lecture 1: assume |\mathcal{X}| = m and there are n trials. How many ways are there to get k_1, k_2, ..., k_m of the elements, with k_1 + k_2 + ... + k_m = n? This gives an operational role of entropy for a combinatorics question:
\binom{n}{k_1\, k_2\, \cdots\, k_m} = 2^{n \left[ -\frac{k_1}{n} \log \frac{k_1}{n} - \frac{k_2}{n} \log \frac{k_2}{n} - \cdots - \frac{k_m}{n} \log \frac{k_m}{n} + o(1) \right]} = 2^{n h\left(\frac{k_1}{n}, \frac{k_2}{n}, \ldots, \frac{k_m}{n}\right) + o(n)}.
Theorem: \lim_{n \to \infty} \frac{1}{n} \log \binom{n}{k_1\, k_2\, \cdots\, k_m} = h(p_1, p_2, \ldots, p_m) if \frac{k_i}{n} \to p_i for i = 1, ..., m.

Definitions

The joint entropy of R.V.s X and Y is
H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(x, y) = -E[\log p(X, Y)].
The conditional entropy of Y given X is
H(Y|X) = -\sum_{x \in \mathcal{X}} p(x) \sum_{y \in \mathcal{Y}} p(y|x) \log p(y|x) = -E[\log p(Y|X)].

Entropies Theorem

H(X, Y) = H(X) + H(Y|X).
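The theorem can be sanity-checked numerically using the two definitions just given. A Python sketch (the joint pmf values below are made up for illustration):

```python
import math

def H(probs):
    """Entropy in bits; zero-probability terms are skipped (0 log 0 = 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A made-up joint pmf p(x, y) on a 2 x 3 alphabet; rows index x, columns y.
pxy = [[1/8, 1/16, 1/16],
       [1/4, 1/8,  3/8]]

px = [sum(row) for row in pxy]             # marginal p(x)
HXY = H([p for row in pxy for p in row])   # H(X, Y) over all (x, y) pairs
# H(Y|X) = sum_x p(x) H(Y | X = x), matching the conditional-entropy definition
HY_X = sum(px[i] * H([p / px[i] for p in pxy[i]]) for i in range(len(pxy)))
print(HXY)             # agrees with the next line: H(X,Y) = H(X) + H(Y|X)
print(H(px) + HY_X)
```

Any valid joint pmf gives the same agreement, since the identity holds for all distributions.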
Proof. Since p(x, y) = p(x) p(y|x), we have \log p(x, y) = \log p(x) + \log p(y|x). Then
H(X, Y) = -\sum_{x} \sum_{y} p(x, y) \log p(x, y)
        = -\sum_{x} \sum_{y} p(x, y) \log p(x) - \sum_{x} \sum_{y} p(x, y) \log p(y|x)
        = -\sum_{x} p(x) \log p(x) - \sum_{x} \sum_{y} p(x, y) \log p(y|x)
        = H(X) + H(Y|X).

Definition: the relative entropy between probability distribution functions p(x) and …
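The relative-entropy definition is cut off in the preview, but the outline slide already gives the formula D(p||q) = \sum_x p(x) \log(p(x)/q(x)). A minimal Python sketch of that formula, with made-up distributions p and q:

```python
import math

def D(p, q):
    """Relative entropy D(p || q) = sum_x p(x) log2(p(x)/q(x)), in bits,
    with the convention 0 log 0 = 0 (formula from the outline slide)."""
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

p = [1/2, 1/4, 1/4]   # made-up distribution
q = [1/3, 1/3, 1/3]   # uniform reference distribution
print(D(p, q))        # positive, since p differs from q
print(D(p, p))        # zero: a distribution is "distance" zero from itself
```

Note that D(p||q) is not symmetric in p and q, so it is not a true distance, even though D(p||q) \ge 0 with equality only when p = q.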