EECS 70 Discrete Mathematics and Probability Theory
Spring 2014   Anant Sahai   Note 12

Probability Examples Based on Counting

We will now look at examples of random experiments and their corresponding sample spaces, along with possible probability spaces and events. As we do so, we'll add a couple more tools to our repertoire: (1) Stirling's approximation; and (2) how to combine multiple independent experiments into a single probability space.

Fair Coin Flipping and Stirling's Approximation

Suppose we have an unbiased coin, and our experiment consists of flipping the coin 4 times. The sample space $\Omega$ consists of the sixteen possible sequences of H's and T's.

For a fair coin, the probabilities are assigned uniformly; the probability of each sample point is $\frac{1}{16}$.

Consider the event $A_2$ that there are exactly two heads. The probability of any particular outcome with two heads (such as HTHT) is $\frac{1}{16}$, so the key is to count $|A_2|$. How many such outcomes are there? There are $\binom{4}{2} = 6$ ways of choosing the positions of the heads, and these choices completely specify the sequence. So $\Pr[A_2] = \frac{6}{16} = \frac{3}{8}$.

More generally, if we flip the coin $n$ times, we get a sample space $\Omega$ of cardinality $2^n$. The sample points are all possible sequences of $n$ H's and T's.

Now consider the event $A_r$ that we get exactly $r$ H's when we flip the coin $n$ times. This event consists of exactly $\binom{n}{r}$ sample points, each with probability $\frac{1}{2^n}$. So the probability of this event is
\[
\Pr[A_r] = \frac{\binom{n}{r}}{2^n}.
\]

It is interesting to observe that as $n$ gets larger, the denominator above grows exponentially in $n$. But there are only $n + 1$ different events $A_r$, a number that grows only linearly with $n$. So how does the probability distribute itself across the different $A_r$? That is what the fair coin tossing examples were showing you experimentally in the first probability lecture note.

But to get an analytic grasp on what is going on, we need a way to translate between the world of factorials (which we see in the numerator $\binom{n}{r}$) and the world of exponentials (which we see in the denominator).

The key tool for doing this is the famous Stirling's approximation:
\[
n! \approx \sqrt{2\pi n}\left(\frac{n}{e}\right)^n. \tag{1}
\]
In practice, this is a very good approximation. (Plot it to see this!) It is also very surprising at first glance. To understand something as simple as the factorial, which just involves the multiplication of wholesome integers in order, we need to invoke three irrational numbers $\sqrt{2}$, $\pi$, $e$ that seem to come from nowhere. Why should multiplying numbers in order have anything to do with the length of the diagonal of a square, the ratio of the circumference of a circle to its diameter, and the base of natural logarithms?!? This kind of linkage across very different areas is a part of the deep beauty of mathematics.

We can use Stirling's approximation to get a better intuitive handle on $\binom{n}{r}$. To help us simplify, let's define $q = \frac{r}{n}$. Then
\begin{align}
\binom{n}{r} &= \frac{n!}{r!\,(n-r)!} \tag{2}\\
&\approx \frac{\sqrt{2\pi n}\,(n/e)^n}{\sqrt{2\pi r}\,(r/e)^r\,\sqrt{2\pi (n-r)}\,\big((n-r)/e\big)^{n-r}} \tag{3}\\
&= \frac{\sqrt{n}\; n^n}{\sqrt{2\pi}\,\sqrt{r(n-r)}\; r^r (n-r)^{n-r}} \tag{4}\\
&= \frac{n^n}{\sqrt{2\pi}\,\sqrt{n q(1-q)}\;(qn)^{qn}\,\big((1-q)n\big)^{(1-q)n}} \tag{5}\\
&= \frac{1}{\sqrt{2\pi}\,\sqrt{n q(1-q)}\; q^{qn}\,(1-q)^{(1-q)n}} \tag{6}\\
&= \frac{1}{\sqrt{2\pi}\,\sqrt{n q(1-q)}}\left(\frac{1}{q^q (1-q)^{1-q}}\right)^n. \tag{7}
\end{align}
Notice that the last factor is actually something exponential in $n$. For example, plug in $q = \frac{1}{2}$ and you will get $\frac{1}{q^q (1-q)^{1-q}} = 2$.

This gives us the curious observation that about $\frac{\sqrt{2}}{\sqrt{\pi n}}$ of all possible $n$-length binary strings have exactly the same number of zeros and ones.
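To see how good these approximations are, here is a minimal numerical check in Python. It is an illustrative sketch added alongside the note, not part of it, and the helper name stirling is ours. It compares $n!$ against approximation (1), and the exact fraction $\binom{n}{n/2}/2^n$ of balanced $n$-bit strings against the $\frac{\sqrt{2}}{\sqrt{\pi n}}$ estimate that follows from (7).

```python
# Illustrative sketch (not from the original note): numerically check
# Stirling's approximation (1) and the ~ sqrt(2)/sqrt(pi*n) fraction of
# balanced n-bit strings derived from (7).
import math

def stirling(n: int) -> float:
    """Stirling's approximation to n! (helper name is ours)."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in [10, 20, 40, 100]:   # even n, so "balanced" (n/2 ones) makes sense
    exact_fact = math.factorial(n)
    approx_fact = stirling(n)
    # Fraction of n-bit strings with exactly n/2 ones.
    exact_frac = math.comb(n, n // 2) / 2 ** n
    approx_frac = math.sqrt(2) / math.sqrt(math.pi * n)
    print(f"n={n:4d}  n!/stirling(n) = {exact_fact / approx_fact:.4f}  "
          f"balanced fraction: exact = {exact_frac:.4f}, approx = {approx_frac:.4f}")
```

The ratio of $n!$ to its Stirling estimate approaches 1 as $n$ grows, and the two balanced-string fractions draw together in the same way.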
In the language of probability, $\frac{\sqrt{2}}{\sqrt{\pi n}}$ is the approximate chance of getting exactly the same number of heads and tails when tossing a large (even) number of coins.

Could this $\sqrt{n}$ have something to do with what we had observed earlier experimentally in a sequence of coin tosses? We'll see later in the course.

Card Shuffling

The random experiment consists of shuffling a deck of cards. $\Omega$ is the set of the $52!$ permutations of the deck, and the probability space is uniform. Note that we're really talking about an idealized mathematical model of shuffling here; in real life, there will always be a bit of bias in our shuffling. However, the mathematical model is close enough to be useful.

Poker Hands

Here's another experiment: shuffling a deck of cards and dealing a poker hand. In this case, $S$ is the set of 52 cards and our sample space is $\Omega = \{\text{all possible poker hands}\}$, which corresponds to choosing $k = 5$ objects without replacement from a set of size $n = 52$ where order does not matter. Hence, as we saw in the previous note,
\[
|\Omega| = \binom{52}{5} = \frac{52 \times 51 \times 50 \times 49 \times 48}{5 \times 4 \times 3 \times 2 \times 1} = 2{,}598{,}960.
\]
Since the deck is assumed to be randomly shuffled, each outcome is equally likely, and we are therefore dealing with a uniform probability space.

Let $A$ be the event that the poker hand is a flush. [For those who are not (yet) addicted to gambling, a flush is a hand in which all cards have the same suit, say Hearts.] Since the probability space is uniform, computing $\Pr[A]$ reduces to simply computing $|A|$, the number of poker hands that are flushes. There are 13 cards in each suit, so the number of flushes in each suit is $\binom{13}{5}$. The total number of flushes is therefore $4 \cdot \binom{13}{5}$. Then we have
\[
\Pr[\text{hand is a flush}] = \frac{4 \cdot \binom{13}{5}}{\binom{52}{5}} = \frac{4 \cdot 13! \cdot 5! \cdot 47!}{5! \cdot 8! \cdot 52!} = \frac{4 \cdot 13 \cdot 12 \cdot 11 \cdot 10 \cdot 9}{52 \cdot 51 \cdot 50 \cdot 49 \cdot 48} \approx 0.002.
\]
As an exercise, you should compare this exact calculation with what Stirling's approximation would yield.

Balls and Bins

In this experiment, we will throw 20 (labeled) balls into 10 (labeled) bins. Assume that each ball is equally likely to land in any bin, regardless of what happens to the other balls.

If you wish to understand this situation in terms of sampling a sequence of $k$ elements from a set $S$ of cardinality $n$: here the set $S$ consists of the 10 bins, and we are sampling with replacement $k = 20$ times. The order of sampling matters, since the balls are labeled.

The sample space is $\Omega = \{(b_1, b_2, \ldots, b_{20}) : 1 \le b_i \le 10\}$, where the component $b_i$ denotes the bin in which ball $i$ lands. The cardinality of the sample space is $|\Omega| = 10^{20}$: each element $b_i$ in the sequence has 10 possible choices, and there are 20 elements in the sequence. More generally, if we throw $m$ balls into $n$ bins, we have a sample space of size $n^m$. The probability space is uniform; as we said earlier, each ball is equally likely to land in any bin.

Let $A$ be the event that bin 1 is empty. Since the probability space is uniform, we simply need to count $|A|$.
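The counting arguments in the last two sections are easy to check numerically. The Python sketch below is ours, not part of the note: it computes the exact flush probability $4 \cdot \binom{13}{5} / \binom{52}{5}$ from the Poker Hands section, and for the balls-and-bins event it uses the counting step $|A| = 9^{20}$ (each of the 20 balls must land in one of the other 9 bins), which is our own completion of the argument since the preview ends mid-derivation, together with a quick Monte Carlo check of the same event.

```python
# Illustrative sketch (not from the original note): check the flush
# probability from the Poker Hands section and the balls-and-bins event
# "bin 1 is empty". The counting step |A| = 9**20 (each of the 20 balls
# avoids bin 1 in 9 ways) is our own completion of the truncated argument.
import math
import random

# Exact flush probability: 4 * C(13,5) / C(52,5).
p_flush = 4 * math.comb(13, 5) / math.comb(52, 5)
print(f"P[flush] = {p_flush:.6f}")  # about 0.001981

# Exact probability that bin 1 is empty when 20 labeled balls go into 10 bins.
p_empty_exact = 9 ** 20 / 10 ** 20
print(f"P[bin 1 empty], exact     = {p_empty_exact:.6f}")

# Monte Carlo check: throw the 20 balls uniformly at random, many times.
random.seed(0)
trials = 200_000
hits = sum(
    all(random.randrange(1, 11) != 1 for _ in range(20))
    for _ in range(trials)
)
print(f"P[bin 1 empty], simulated = {hits / trials:.6f}")
```

With these settings the simulated estimate should land close to the exact value $(9/10)^{20} \approx 0.12$.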

