UT CS 361 - Foundations of Computer Security, Lecture 33: Entropy II
Foundations of Computer Security
Lecture 33: Entropy II
Dr. Bill Young
Department of Computer Sciences
University of Texas at Austin

Example Revisited

Given: an unbalanced coin that is three times more likely to yield a head than a tail.

Solution: There are two possible outcomes:

Result  Prob
H       3/4
T       1/4

The entropy is computed as follows (logs are base 2):

h = -(3/4 × log 3/4 + 1/4 × log 1/4) ≈ 0.811

Upshot: It's theoretically impossible to encode this language using less than 0.811 bits per symbol (on average). But how would you ever do better than 1 bit / symbol?

Example Revisited

Instead of taking single flips as our "experiments," let's take pairs (call them "2flips") and code as follows:

Result  Prob.  Code
HH      9/16   0
HT      3/16   10
TH      3/16   110
TT      1/16   111

Suppose we flip the coin 32 times; that's 16 2flips. In an average run, we'd expect HH to appear 9 times, HT and TH to each appear 3 times, and TT to appear once. Why?

Example Revisited

Given 32 flips (16 2flips), we could expect:

Result  Count  Code  Bits
HH      9      0     9
HT      3      10    6
TH      3      110   9
TT      1      111   3
Total                27

For the naive encoding, using 1 bit / flip, we'd expect to use 32 bits. Our efficiency is 27/32 ≈ 0.844 bits per flip, which is not a bad approximation of the entropy (0.811).

Could we do better? Sure: just use 3flips, 4flips, etc. The entropy is the limit of this process.

Test Your Understanding

Suppose you have a six-sided die that is unbalanced such that 1 and 2 are equally likely; 3 and 4 are equally likely; and 5 and 6 are equally likely.
However, the die rolls 1 twice as often as 3, and rolls 3 three times as often as 5.

1. What is the "naive" encoding for this language?
2. What is the entropy of this language?
3. Find an encoding that is more efficient than the naive encoding.
4. Give a convincing argument that your encoding is more efficient than the naive encoding.

Hint: There's no need to encode sequences of rolls.

Lessons

Computing the entropy of a language provides a bound on the efficiency of any encoding. But finding an efficient encoding requires ingenuity.

Next lecture: Fundamental Theorems
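
As a quick check on the coin arithmetic above, here is a short Python sketch (the function name and variable names are my own, not from the lecture) that computes the biased coin's entropy and the expected cost per flip of the 2flips code:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: h = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs)

# Biased coin: heads three times as likely as tails.
h_coin = entropy([3/4, 1/4])
print(round(h_coin, 3))  # 0.811

# Expected bits per 2flip under the code {HH: 0, HT: 10, TH: 110, TT: 111}.
probs = {"HH": 9/16, "HT": 3/16, "TH": 3/16, "TT": 1/16}
code_lengths = {"HH": 1, "HT": 2, "TH": 3, "TT": 3}
bits_per_2flip = sum(probs[s] * code_lengths[s] for s in probs)

# Divide by 2 to get bits per single flip: 27/16 / 2 = 27/32.
print(bits_per_2flip / 2)  # 0.84375
```

Note that 27/32 per flip sits between the entropy bound (0.811) and the naive cost (1 bit), exactly as the slides claim; grouping into 3flips, 4flips, etc. pushes the expected cost closer to the bound.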