IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 8, AUGUST 2010

Rényi Information Dimension: Fundamental Limits of Almost Lossless Analog Compression

Yihong Wu, Student Member, IEEE, and Sergio Verdú, Fellow, IEEE

Manuscript received March 02, 2009; revised April 30, 2010. Date of current version July 14, 2010. This work was supported in part by the National Science Foundation under Grants CCF-0635154 and CCF-0728445. The material in this paper was presented in part at the IEEE International Symposium on Information Theory, Seoul, Korea, July 2009 [55]. The authors are with the Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 USA (e-mail: [email protected]; [email protected]). Communicated by H. Yamamoto, Associate Editor for Shannon Theory. Digital Object Identifier 10.1109/TIT.2010.2050803

Abstract—In Shannon theory, lossless source coding deals with the optimal compression of discrete sources. Compressed sensing is a lossless coding strategy for analog sources by means of multiplication by real-valued matrices. In this paper we study almost lossless analog compression for analog memoryless sources in an information-theoretic framework, in which the compressor or decompressor is constrained by various regularity conditions, in particular linearity of the compressor and Lipschitz continuity of the decompressor. The fundamental limit is shown to be the information dimension proposed by Rényi in 1959.

Index Terms—Analog compression, compressed sensing, information measures, Rényi information dimension, Shannon theory, source coding.

I. INTRODUCTION

A. Motivations From Compressed Sensing

The "bit" is the universal currency in lossless source coding theory [1], where Shannon entropy is the fundamental limit of compression rate for discrete memoryless sources (DMS). Sources are modeled by stochastic processes, and redundancy is exploited as probability concentrates on a set of exponentially small cardinality as the blocklength grows. Therefore, by encoding this subset, data compression is achieved if we tolerate a positive, though arbitrarily small, block error probability.

Compressed sensing [2], [3] has recently emerged as an approach to lossless encoding of analog sources by real numbers rather than bits. It deals with the efficient recovery of an unknown real vector from the information provided by linear measurements. The formulation of the problem is reminiscent of traditional lossless data compression in the following sense.

• Sources are sparse in the sense that each vector is supported on a set much smaller than the blocklength. This kind of redundancy, expressed as sparsity, is exploited to achieve effective compression by taking a smaller number of linear measurements.
• In contrast to lossy data compression, block error probability, instead of distortion, is the performance benchmark.
• The central problem is to determine how many compressed measurements are sufficient/necessary for recovery with vanishing block error probability as the blocklength tends to infinity [2]–[4].
• Random coding is employed to show the existence of "good" linear encoders. In particular, when the random projection matrices follow a certain distribution (e.g., standard Gaussian), the restricted isometry property (RIP) is satisfied with overwhelming probability and guarantees exact recovery (a minimal sketch of this encode/decode pipeline follows this list).
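To make the encoder/decoder roles concrete, here is a minimal sketch (my illustration, not code from the paper): a sparse vector is encoded by an i.i.d. Gaussian measurement matrix and decoded by orthogonal matching pursuit, one standard compressed-sensing recovery algorithm; the paper does not commit to any particular decoder. All dimensions and names below are arbitrary illustrative choices.

```python
# Compressed sensing in miniature: encode a k-sparse x in R^n with
# m < n random Gaussian measurements, decode with orthogonal matching
# pursuit (OMP). Parameters are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 256, 6, 64           # blocklength, sparsity, measurements

# k-sparse source realization: support much smaller than the blocklength
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

# linear encoder: i.i.d. Gaussian matrix (satisfies RIP w.h.p.)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x                      # m real-valued measurements encode x

# OMP decoder: greedily pick the column most correlated with the
# residual, then least-squares refit on the chosen support
residual, chosen = y.copy(), []
for _ in range(k):
    chosen.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, chosen], y, rcond=None)
    residual = y - A[:, chosen] @ coef

x_hat = np.zeros(n)
x_hat[chosen] = coef
print("exact recovery:", np.allclose(x, x_hat, atol=1e-8))
```

At these sizes recovery typically succeeds; the worst-case versus statistical distinction drawn below is about whether such success is demanded for every vector of sparsity k or only with high probability over source realizations.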
On the other hand, there are also significantly different ingredients in compressed sensing in comparison with information-theoretic setups.

• Sources are not modeled probabilistically, and the fundamental limits are stated on a worst-case basis rather than on average. Moreover, block error probability is with respect to the distribution of the encoding random matrices.
• Real-valued sparse vectors are encoded by real numbers instead of bits.
• The encoder is confined to be linear, while generally in information-theoretic problems such as lossless source coding we have the freedom to choose the best possible coding scheme.

Departing from the compressed sensing literature, we study fundamental limits of lossless source coding for real-valued memoryless sources within an information-theoretic setup.

• Sources are modeled by random processes. This model is more flexible in describing source redundancy, which encompasses, but is not limited to, sparsity. For example, a mixed discrete-continuous distribution is suitable for characterizing linearly sparse vectors [5], [6], i.e., those with a number of nonzero components proportional to the blocklength with high probability and whose nonzero components are drawn from a given continuous distribution (see the numerical sketch at the end of this section).
• Block error probability is evaluated by averaging with respect to the source.
• While linear compression plays an important role in our development, our treatment encompasses weaker regularity conditions.

Methodologically, the relationship between our approach and compressed sensing is analogous to the relationship between modern coding theory and classical coding theory: classical coding theory adopts a worst-case (Hamming) approach, whose goal is to obtain codes with a certain minimum distance, while modern coding theory adopts a statistical (Shannon) approach, whose goal is to obtain codes with a small probability of failure. Likewise, compressed sensing adopts a worst-case model in which compressors work provided that the number of nonzero components in the source does not exceed a certain threshold, while we adopt a statistical model in which compressors work for most source realizations. In this sense, almost lossless analog compression can be viewed as an information-theoretic framework for compressed sensing. Probabilistic modeling provides elegant results in terms of fundamental limits, and also sheds light on constructive schemes for individual sequences. For example, random coding is not only a proof technique in Shannon theory, but also a guiding principle in modern coding theory as well as in compressed sensing.

Recently there have been considerable new developments in using statistical signal models (e.g., mixed distributions) in compressed sensing (e.g., [5]–[8]), where reconstruction performance is evaluated by computing the asymptotic error probability in the large-blocklength limit. As discussed in Section IV-B, the
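For reference, the fundamental limit named in the abstract, Rényi's information dimension, has the following standard definition (this is Rényi's 1959 definition restated for convenience; the notation is mine):

```latex
% Quantize X to the grid of mesh 1/m, then normalize the entropy of the
% quantized variable by log m (Rényi, 1959):
\[
  \langle X \rangle_m \triangleq \frac{\lfloor m X \rfloor}{m},
  \qquad
  d(X) \triangleq \lim_{m \to \infty}
    \frac{H\!\left(\langle X \rangle_m\right)}{\log m}.
\]
% For a mixed distribution with a discrete part of weight (1 - \gamma) and
% an absolutely continuous part of weight \gamma (and H(\lfloor X \rfloor)
% finite), the limit exists and equals the continuous weight:
\[
  X \sim (1-\gamma)\, P_{\mathrm{d}} + \gamma\, P_{\mathrm{c}}
  \quad \Longrightarrow \quad
  d(X) = \gamma.
\]
```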
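The numerical sketch referenced in the bullet list above (again my own illustration, not the paper's code) ties the two together: sample a mixed discrete-continuous source, quantize at mesh 1/m, and watch the normalized entropy approach the continuous weight gamma. The parameters (gamma, sample size, quantization levels) are arbitrary.

```python
# Empirical Rényi information dimension of a mixed source:
# X = 0 with probability 1 - gamma, X ~ Uniform(0, 1) with probability
# gamma. Estimate H(<X>_m)/log2(m), which should approach gamma.
import numpy as np

rng = np.random.default_rng(1)
gamma, nsamp = 0.3, 1_000_000

is_cont = rng.random(nsamp) < gamma               # continuous-part indicator
x = np.where(is_cont, rng.random(nsamp), 0.0)     # mixed-source samples

for m in (2**6, 2**8, 2**10):
    _, counts = np.unique(np.floor(m * x), return_counts=True)
    p = counts / nsamp
    H = -(p * np.log2(p)).sum()                   # empirical entropy, bits
    print(f"m = {m:5d}   H/log2(m) = {H / np.log2(m):.3f}")

# The estimates decrease toward gamma = 0.3 only slowly: the gap is on
# the order of 1/log2(m), so expect roughly 0.44, 0.41, 0.39 here.
```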

