U.C. Berkeley Handout N17
CS294: Pseudorandomness and Combinatorial Constructions — October 27, 2005
Professor Luca Trevisan — Scribe: Sebastien Roch

Notes for Lecture 17

In the previous lecture, we introduced the notion of a randomness extractor: a procedure which extracts "uniform" randomness from a weak random source and a small number of truly random bits. In this lecture, we give general impossibility results for randomness extraction. The results are taken from [NZ96, RT00].

1  Shannon's Entropy

We seek to characterize the class of random sources for which extraction is possible. We start with a discussion of Shannon's entropy and explain, in particular, why this notion is not appropriate for our purposes.

Definition 1 (Shannon's Entropy)  Let X be a random variable. Then Shannon's entropy is defined as

    H(X) = Σ_{x : P[X=x] ≠ 0} P[X = x] · log(1/P[X = x]) = E_{x∼X}[ log(1/P[X = x]) ].

An interpretation of H(X) is as follows: it counts the number of truly random bits needed to sample from X. Indeed, if t bits are used to encode a particular value a, then we need

    P[X = a] ≥ 1/2^t,   and hence   t ≥ log(1/P[X = a]).

Therefore, if R_a denotes the number of random bits used for a, then we have

    E_{a∼X}[R_a] ≥ E_{a∼X}[ log(1/P[X = a]) ].

However, Shannon's entropy measures only the expected amount of randomness needed, not how that randomness is "distributed" over the possible values. To see why this is a problem, consider the following random source X over {0,1}^n. Suppose X is such that

    P[X = 0] = 1 − 2^{−100},

where 0 = (0, …, 0), and for all a ≠ 0,

    P[X = a] = 2^{−100} · 1/(2^n − 1) ∼ 2^{−(n+100)}.

Then the Shannon entropy is

    H(X) = (1 − 2^{−100}) · log( 1/(1 − 2^{−100}) ) + Σ_{a ≠ 0} [ 2^{−100}/(2^n − 1) ] · log( 2^{100} (2^n − 1) ) = Ω(n).

According to this measure, the source X has high entropy. However, suppose the number of truly random bits used by an extractor is t = O(log n). Denote by Ext : {0,1}^n × {0,1}^t → {0,1}^m the extractor (with m ≫ t), and by U_m the uniform distribution on {0,1}^m.
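The entropy computation above can be checked numerically. The sketch below (the function names and the scaled-down constant c are ours, chosen so the effect is visible in floating point; the notes take c = 100) computes H(X) for the skewed source: Shannon entropy grows linearly in n even though the source outputs the all-zeros string almost every time.

```python
import math

def shannon_entropy(probs):
    """H(X) = sum over outcomes with P[X=x] > 0 of P[X=x] * log2(1/P[X=x])."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def skewed_source(n, c):
    """The source from the notes, scaled down: P[X = all-zeros] = 1 - 2^-c,
    with the remaining 2^-c mass spread uniformly over the other 2^n - 1
    strings."""
    heavy = 1.0 - 2.0 ** (-c)
    light = 2.0 ** (-c) / (2 ** n - 1)
    return [heavy] + [light] * (2 ** n - 1)

for n in (8, 12, 16):
    probs = skewed_source(n, c=4)
    # H(X) grows roughly like (n + c) / 2^c, i.e. linearly in n,
    # even though max(probs) stays at 1 - 2^-c, close to 1.
    print(n, round(shannon_entropy(probs), 3), max(probs))
```

The last column is what Section 2 below isolates: the largest probability in the support, which is what actually limits extraction, stays bounded away from 0 no matter how large n (and hence H(X)) gets.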
Then, because t = O(log n), the distribution of Y = Ext(X, U_t) (where U_t denotes the uniform distribution on {0,1}^t) is concentrated on a polynomial number of strings z with

    P[Y = z] ≥ 1/poly(n).

It is easy to see that Y cannot be close to uniform. Indeed, suppose Y were (1/2)-close to uniform. Sort the outputs of Y in decreasing order of their probability under Y, and let S be the smallest set of "most likely" outputs such that P[Y ∈ S] ≥ 0.51. Then we must have P[U_m ∈ S] ≥ 0.01. But that implies |S| ≥ 0.01 · 2^m, i.e., S has to be exponentially large, whereas the concentration of Y forces S to have only polynomial size.

We have described a random source with high Shannon entropy for which randomness extraction is not possible. Therefore, Shannon's entropy is not the right notion of entropy for our purposes.

2  Min-Entropy

It follows from the previous discussion that for a distribution to be close to uniform, it must output (most of the time) values that have exponentially small probability. Because our seed is rather short, this kind of small probability has to come from the source. To capture this, we introduce the notion of min-entropy.

Definition 2  The min-entropy of X is

    H_∞(X) ≡ min_{x : P[X=x] > 0} log(1/P[X = x]).

If H_∞(X) = k, then for all a, P[X = a] ≤ 2^{−k}.

Let m be the size of the output. We will show that extraction is possible only if the min-entropy of the source is close to m. In fact, we will show that there is a single fixed extractor that works for all sources with min-entropy close to m. In this lecture, we focus on the former result. We first define the notion of an extractor.

Definition 3  The function Ext : {0,1}^n × {0,1}^d → {0,1}^m is a (k, ε)-extractor if, for every X with H_∞(X) ≥ k,

    ‖ Ext(X, U_d) − U_m ‖_SD ≤ ε.

Trade-off.  For n, k fixed (i.e., a fixed physical source), we would like d small (i.e., O(log n)), m large (i.e., ∼ k), and ε small (i.e., constant).

Existence.
It is known (non-constructively) that (k, ε)-extractors exist with

    m = k + d − 2 log(1/ε) − O(1)   and   d = log(n − k) + 2 log(1/ε) + O(1).

3  Lower Bound

In this section, we prove the following lower bound on extraction.

Proposition 4  For every (k, ε)-extractor with ε < 1/2 and m > d + 2, the following hold:

    m ≤ k + d − 2 log(1/ε) + O(1)   and   d ≥ log(n − k) + 2 log(1/ε) − O(1).

We actually prove a slightly weaker bound.

Proof: (Sketch)  Let Ext : {0,1}^n × {0,1}^d → {0,1}^m be a (k, ε)-extractor. Construct a bipartite multigraph with 2^n nodes on the left side (corresponding to strings in {0,1}^n) and 2^m nodes on the right side (corresponding to strings in {0,1}^m). There is an edge between x and y iff there exists z such that Ext(x, z) = y.

Let S ⊆ {0,1}^n with |S| ≥ 2^k, and let T ⊆ {0,1}^m. Pick v uniformly at random in S and w uniformly at random among the neighbours of v. Then, by the definition of a (k, ε)-extractor, it holds that

    | P[w ∈ T] − |T|/2^m | ≤ ε.

Now assume that, in fact, we pick T ⊆ {0,1}^m of size ε·2^m + 1 uniformly at random, and let S ⊆ {0,1}^n be the set of left neighbours of T.

Claim 5  It holds that |S| ≥ 2^n − 2^k.

Proof: (Sketch)  By contradiction, assume |S^c| > 2^k. Pick a uniformly random neighbour of S^c. The probability that it falls in T is 0, because no vertex of S^c has a neighbour in T. Since |T|/2^m > ε, this contradicts the definition of a (k, ε)-extractor.

Fix v ∈ {0,1}^n. We compute the probability that the neighbourhood of v hits T when T is chosen in the following way: each element of {0,1}^m is picked with probability ε independently. (Note that this choice of T is not quite what we want; this can be fixed, and the details are left out.) Since v has at most 2^d distinct neighbours, we have

    P_T[no neighbour of v is in T] ≥ (1 − ε)^{2^d}.

Therefore, since by Claim 5 at most 2^k left vertices can have no neighbour in T,

    2^k ≥ E_T[number of left vertices with no neighbour in T]
        = Σ_{v ∈ {0,1}^n} P_T[no neighbour of v is in T]
        ≥ 2^n (1 − ε)^{2^d} ∼ 2^n e^{−ε·2^d}.

This implies

    k ≥ n − O(ε·2^d),   or   d ≥ log(1/ε) + log(n − k) − O(1).

General proof.  To obtain the factor of 2 in front of log(1/ε), one has to consider the following: look at all left vertices v such that

    (number of edges from v to T)/2^d < |T|/2^m − ε.        (1)

There are at most 2^k such vertices (by contradiction).
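Before finishing the proof, the counting step of the weaker bound can be checked numerically. The sketch below (the parameters n, k, ε are arbitrary toy choices of ours) finds the smallest d for which the expected count 2^n (1 − ε)^{2^d} drops to the 2^k allowed by Claim 5, and compares it against log(1/ε) + log(n − k):

```python
import math

# Toy parameters (arbitrary choices for illustration).
n, k, eps = 60, 40, 0.05

# Smallest d such that 2^n (1 - eps)^(2^d) <= 2^k, i.e. the smallest
# seed length for which the expected number of left vertices with no
# neighbour in T can fall within the 2^k budget from Claim 5.
d = 0
while (2 ** n) * (1 - eps) ** (2 ** d) > 2 ** k:
    d += 1

# The weaker lower bound from the proof: d >= log(1/eps) + log(n-k) - O(1).
bound = math.log2(1 / eps) + math.log2(n - k)
print(d, round(bound, 2))
```

For these parameters the smallest feasible d lands within a constant of the log(1/ε) + log(n − k) expression, as the proof predicts.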
Pick T ⊆ {0,1}^m by including each element of {0,1}^m with probability 1/2 independently. Then, by the Central Limit Theorem, the probability that a fixed vertex in {0,1}^n satisfies (1) is at least e^{−O(ε²·2^d)}. The bound follows.

In the next class, we will see explicit constructions of extractors.

References

[NZ96] N. Nisan and D. Zuckerman. Randomness is linear in space. Journal of Computer and System Sciences, 52(1):43–52, 1996.

[RT00] J. Radhakrishnan and A. Ta-Shma. Bounds for dispersers, extractors, and depth-two superconcentrators. SIAM Journal on Discrete Mathematics, 13(1):2–24, 2000.

