UT CS 361 - Entropy Odds and Ends

Foundations of Computer Security
Lecture 36: Entropy Odds and Ends
Dr. Bill Young
Department of Computer Sciences
University of Texas at Austin

Entropy Aside

Recall that the information content of a message depends on the state of knowledge of the receiver. Hence, entropy is relative to a particular observer.

Consider the entropy of the contents of an envelope marked "Best Picture" at the Academy Awards (assuming 5 nominees):

- If all were equally likely to win, the entropy would be log2 5 ≈ 2.322 bits.
- For anyone who knows that the odds aren't even, it's less, though hard to compute.
- For the auditors who stuffed the envelope, it's 0, since they have no uncertainty.

Often, prior probabilities are impossible to compute.

Entropy and Randomness

Note that entropy can be used to measure the amount of "redundancy" in an encoding. If the information content of a message is equal to the length of the encoded message, there is no redundancy.

Some sources define a random string as one that cannot be represented any more efficiently, i.e., one for which no compression is possible.

Finding a Coding

Huffman coding is guaranteed to find an efficient code for a given language, assuming you know the probabilities of the language units. In fact, its average code length always exceeds the entropy by less than one bit per symbol, which is extremely efficient.

Lempel-Ziv is an "adaptive coding" algorithm used in many commercial text compression utilities. It builds an encoding on the fly according to the strings it encounters.

Lempel-Ziv is asymptotically optimal.
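The slides don't spell out how Lempel-Ziv builds its dictionary; the following is a minimal sketch of one common variant (an LZ78-style parse), meant only to illustrate the "on the fly" dictionary construction described above:

```python
def lz78_parse(text):
    """LZ78-style parse: split text into phrases, each a previously
    seen phrase extended by one new character. Output is a list of
    (phrase_index, next_char) pairs; index 0 is the empty phrase.
    The dictionary is built adaptively, as the text is scanned."""
    dictionary = {"": 0}
    out = []
    phrase = ""
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch                      # keep extending a known phrase
        else:
            out.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:                                # flush a trailing known phrase
        out.append((dictionary[phrase[:-1]], phrase[-1]))
    return out

print(lz78_parse("ABABABA"))
# → [(0, 'A'), (0, 'B'), (1, 'B'), (3, 'A')]
```

Note how repeated structure pays off: as longer phrases enter the dictionary, each output pair covers more of the input, which is the intuition behind the asymptotic-optimality claim.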
Asymptotic optimality means that, as the text length tends to infinity, the compression achieved approaches the optimal (entropy) rate.

Lessons

- The information content of a message is relative to the state of knowledge of an observer.
- If an encoding's efficiency matches the entropy, there is no redundancy to compress out.
- Huffman coding and the Lempel-Ziv algorithm both give highly efficient codes.

Next lecture: Cryptography
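As a closing check, the numbers quoted in the entropy aside and the Huffman efficiency bound can be verified with a short sketch. (The skewed probabilities below are hypothetical, chosen only to illustrate uneven odds.)

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Code length in bits assigned to each symbol by Huffman's
    algorithm, given the symbol probabilities."""
    # Each heap entry: (probability, tiebreak id, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # each merge adds one bit to these codes
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

# Uniform "Best Picture" case: 5 equally likely nominees.
print(entropy([0.2] * 5))          # → log2(5) ≈ 2.3219

# Skewed (hypothetical) odds: the entropy is lower, and the average
# Huffman code length stays within one bit of it.
skewed = [0.5, 0.25, 0.15, 0.07, 0.03]
h = entropy(skewed)
avg = sum(p * l for p, l in zip(skewed, huffman_lengths(skewed)))
print(h, avg)
assert h <= avg < h + 1
```

The final assertion is exactly the efficiency claim from the "Finding a Coding" slide: entropy is a lower bound on the average code length, and Huffman stays within one bit of it.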

