SJSU CS 146 - Data Compression

Data Compression
Hae-sun Jung
CS 146, Dr. Sin-Min Lee, Spring 2004

Introduction
Compression is used to reduce the volume of information to be stored, or to reduce the communication bandwidth required to transmit it over a network.

Compression Principles
• Entropy encoding: run-length encoding
– Lossless, and independent of the type of source information
– Used when the source information comprises long substrings of the same character or binary digit; each run is replaced by a (string or bit pattern, number of occurrences) pair, as in FAX transmission
– e.g. 000000011111111110000011... → 0,7 1,10 0,5 1,2... → 7,10,5,2...
• Entropy encoding: statistical encoding
– Based on the probability of occurrence of a pattern
– The more probable the pattern, the shorter its codeword
– "Prefix property": a shorter codeword must not form the start of a longer codeword

Huffman Encoding
• Entropy H: the theoretical minimum average number of bits required to transmit a particular stream,
  H = -Σ_{i=1..n} P_i log2(P_i), where n is the number of symbols and P_i is the probability of symbol i
• Efficiency E = H / H', where H' is the average number of bits per codeword:
  H' = Σ_{i=1..n} N_i P_i, with N_i the number of bits in the codeword for symbol i
• e.g. symbols M(10), F(11), Y(010), N(011), 0(000), 1(001) with probabilities 0.25, 0.25, 0.125, 0.125, 0.125, 0.125:
  H' = Σ_{i=1..6} N_i P_i = 2(2 × 0.25) + 4(3 × 0.125) = 2.5 bits/codeword
  H = -Σ_{i=1..6} P_i log2(P_i) = -(2(0.25 log2 0.25) + 4(0.125 log2 0.125)) = 2.5
  E = H / H' = 100%; fixed-length codewords for six symbols would need 3 bits/codeword

Huffman Algorithm
• A method of construction for an encoding tree
• Full binary tree representation: each edge carries a value (0 to the left child, 1 to the right child)
• Data is at the leaves, not at internal nodes
• Result: an encoding tree, i.e. a "variable-length encoding"
• Outline:
1. Maintain a forest of trees
2. The weight of a tree is the sum of the frequencies of its leaves
3. Repeat N-1 times: select the two smallest-weight trees and form a new tree from them

Huffman Coding
• A variable-length code whose codeword lengths are inversely related to the characters' frequencies
• Must satisfy the prefix property (no codeword is a prefix of another) to be uniquely decodable
• A two-pass algorithm:
– The first pass accumulates the character frequencies and generates the codebook
– The second pass does the compression with the codebook
• Codes are created by constructing a binary tree; a sketch of these steps follows the list:
1. Consider all characters as free nodes
2. Assign the two free nodes with the lowest frequencies to a parent node whose weight equals the sum of their frequencies
3. Remove the two free nodes and add the newly created parent node to the list of free nodes
4. Repeat steps 2 and 3 until there is one free node left; it becomes the root of the tree
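The slides stop at the step list; as a sketch only, the four steps above can be realized in Python roughly as follows. The function name build_huffman_codes, the dict-based interface, and the bit-prepending shortcut (instead of an explicit tree) are my assumptions, not part of the original slides.

import heapq
import itertools

def build_huffman_codes(freqs):
    """Build {symbol: codeword} by repeatedly merging the two
    lowest-weight free nodes, prepending one bit per merge instead
    of keeping an explicit tree."""
    tie = itertools.count()  # consistent tie-breaking for equal weights
    # Step 1: every character starts as a free node (a one-leaf tree).
    heap = [(w, next(tie), [sym]) for sym, w in freqs.items()]
    heapq.heapify(heap)
    codes = {sym: "" for sym in freqs}
    while len(heap) > 1:
        # Step 2: pick the two free nodes with the lowest weights;
        # the parent's weight is the sum of theirs.
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        for sym in left:   # left branch contributes a 0 ...
            codes[sym] = "0" + codes[sym]
        for sym in right:  # ... right branch contributes a 1
            codes[sym] = "1" + codes[sym]
        # Step 3: the new parent rejoins the list of free nodes.
        heapq.heappush(heap, (w1 + w2, next(tie), left + right))
    # Step 4: the loop ends when one free node, the root, is left.
    return codes

# e.g. the 64-symbol color example below reproduces the code *lengths*
# in the table (R, K, G: 2 bits; B: 3; C: 4; M, Y: 5), though the exact
# bit patterns depend on the left/right and tie-breaking choices.
print(build_huffman_codes({"R": 19, "K": 17, "G": 14, "B": 7, "C": 4, "M": 2, "Y": 1}))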
• Branch labels: right child of the binary tree = 1, left child = 0
• Prefix violation, example: if e = "01" and b = "010", then "01" is a prefix of "010", so the bits "010" could be read either as b or as e followed by a stray 0; such codes are not uniquely decodable
• Symbols with the same frequency need a consistent tie-breaking rule (always the same left-or-right choice)

Example (64 data values)
R K K K K K K K
K K K R R K K K
K K R R R R G G
K K B C C C R R
G G G M C B R R
B B B M Y B B R
G G G G G G G R
G R R R R G R R

Color   Frequency   Huffman code
=================================
R       19          00
K       17          01
G       14          10
B        7          110
C        4          1110
M        2          11110
Y        1          11111

Static Huffman Coding
• Huffman (code) tree: a number of symbols (or characters) and their relative probabilities are given in advance
• The codes must hold the "prefix property"

Symbol   Occurrence   Code
A        4/8          1
B        2/8          01
C        1/8          001
D        1/8          000

4(1) + 2(2) + 1(3) + 1(3) = 14 bits are required to transmit "AAAABBCD".
[Tree diagram: root node of weight 8 with branch nodes of weights 4 and 2, edges labeled 0/1, and the symbols A, B, C, D at the leaf nodes; the prefix property follows from keeping symbols only at leaves.]
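To make the 14-bit figure concrete, here is a small encode/decode pair in Python; the helper names encode and decode are illustrative, not from the slides. Greedy decoding works precisely because the code has the prefix property.

codebook = {"A": "1", "B": "01", "C": "001", "D": "000"}

def encode(text, codes):
    return "".join(codes[ch] for ch in text)

def decode(bits, codes):
    # Prefix property: no codeword starts another, so we can match greedily.
    inverse = {v: k for k, v in codes.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:       # a complete codeword has been read
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

bits = encode("AAAABBCD", codebook)
print(len(bits))                 # 14, matching 4(1) + 2(2) + 1(3) + 1(3)
print(decode(bits, codebook))    # AAAABBCD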

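As a closing check, the entropy and efficiency figures from the six-symbol example earlier (M, F, Y, N, 0, 1) can be reproduced numerically. This Python sketch is mine, not part of the original slides.

from math import log2

probs   = [0.25, 0.25, 0.125, 0.125, 0.125, 0.125]   # P_i
lengths = [2,    2,    3,     3,     3,     3]       # N_i (codeword bits)

H       = -sum(p * log2(p) for p in probs)           # entropy: 2.5 bits/symbol
H_prime = sum(n * p for n, p in zip(lengths, probs)) # avg codeword length: 2.5
print(H, H_prime, H / H_prime)                       # 2.5 2.5 1.0, i.e. E = 100%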
