15-251: Great Theoretical Ideas in Computer Science
Lecture 11 (October 2, 2007): Classics

Finite Probability Distributions

A (finite) probability distribution D is a finite "sample space" S of elements or "samples", where each sample x in S has a non-negative real weight or probability p(x). The weights must satisfy:

    Σ_{x ∈ S} p(x) = 1

Example: a sample space whose nine points carry weights 0.2, 0.13, 0.06, 0.11, 0.17, 0.1, 0.13, 0, and 0.1 (which sum to 1). For a point of weight 0.1, D(x) = p(x) = 0.1 is the weight or probability of x.

Events

Any set E ⊆ S is called an event, and its probability is defined to be

    Pr_D[E] = Σ_{x ∈ E} p(x)

In the example above, the event containing the points of weight 0.17, 0.1, 0.13, and 0 has Pr_D[E] = 0.4.

Conditional Probability

The probability of event A given event B is written Pr[A | B] and is defined to be

    Pr[A | B] = Pr[A ∩ B] / Pr[B]

Intuitively, this is the proportion of A ∩ B relative to B.

A and B are independent events if any one of the following equivalent conditions holds:

    Pr[A | B] = Pr[A]
    Pr[A ∩ B] = Pr[A] × Pr[B]
    Pr[B | A] = Pr[B]

Today, we will learn about a formidable tool in probability that will allow us to solve problems that seem really, really messy...

If I randomly put 100 letters into 100 addressed envelopes, on average how many letters will end up in their correct envelopes?

Hmm...
    Σ_k k × Pr(exactly k letters end up in correct envelopes) = ... (aargh!!)

On average, in a class of size m, how many pairs of people will have the same birthday?

    Σ_k k × Pr(exactly k collisions) = ... (aargh!!!!)

The new tool is called "Linearity of Expectation". To use it, we will also need to understand the concepts of Random Variables and Expectations.

Today's lecture: not too much material, but you need to understand it well.
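The definitions of conditional probability and independence above can be checked by direct enumeration. Here is a minimal sketch (my own illustration, not from the lecture) over the 36-point uniform sample space of a two-dice roll; the helper names `prob`, `A`, and `B` are hypothetical:

```python
import itertools
from fractions import Fraction

# Sample space: all 36 equally likely outcomes of rolling two dice.
S = list(itertools.product(range(1, 7), repeat=2))
p = Fraction(1, len(S))  # uniform weight; the weights sum to 1

def prob(event):
    """Pr[E] = sum of the weights of the samples in E."""
    return sum(p for x in S if event(x))

A = lambda x: x[0] + x[1] == 7   # event: the two dice sum to 7
B = lambda x: x[0] == 3          # event: the first die shows 3

pr_A = prob(A)
pr_B = prob(B)
pr_AB = prob(lambda x: A(x) and B(x))

# Conditional probability: Pr[A | B] = Pr[A ∩ B] / Pr[B]
pr_A_given_B = pr_AB / pr_B

# Here Pr[A | B] = 1/6 = Pr[A], so A and B happen to be independent.
print(pr_A, pr_A_given_B, pr_AB == pr_A * pr_B)  # 1/6 1/6 True
```

Note that "sum is 7" is independent of "first die is 3": knowing the first die never changes the chance of the sum being 7, because exactly one value of the second die completes it.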
Random Variables

Let S be the sample space of a probability distribution. A random variable is a real-valued function on S.

Examples (for a two-dice roll, white die first):

    X = value of the white die:         X(3,4) = 3,   X(1,6) = 1
    Y = sum of the values of the dice:  Y(3,4) = 7,   Y(1,6) = 7
    W = (value of white die)^(value of black die):
                                        W(3,4) = 3^4, W(1,6) = 1^6

Tossing a Fair Coin n Times

S = all sequences in {H, T}^n, and D = the uniform distribution on S, so D(x) = (½)^n for all x ∈ S.

Random variables (say n = 10):

    X = # of heads:  X(HHHTTHTHTT) = 5
    Y = 1 if #heads = #tails, 0 otherwise:
        Y(HHHTTHTHTT) = 1,  Y(THHHHTTTTT) = 0

Notational conventions: use letters like A, B, E for events, and letters like X, Y, f, g for random variables (R.V.'s).

Two Views of Random Variables

1. A function from S to the reals: the input to the function is random.
2. The induced distribution on the reals: the randomness is "pushed" to the values of the function.

Two coins tossed: X: {TT, TH, HT, HH} → {0, 1, 2} counts the number of heads. Each of the four outcomes has probability ¼, and the induced distribution on the reals puts probability ¼ on 0, ½ on 1, and ¼ on 2.

It's a floor wax AND a dessert topping: a random variable is both a function on the sample space S and a variable with a probability distribution on its values. You should be comfortable with both views.

From Random Variables to Events

For any random variable X and value a, we can define the event A that "X = a":

    Pr(A) = Pr(X = a) = Pr({x ∈ S | X(x) = a})

Two coins tossed again:

    Pr(X = 1) = Pr({x ∈ S | X(x) = 1}) = Pr({TH, HT}) = ½

From Events to Random Variables

For any event A, we can define the indicator random variable for A:

    X_A(x) = 1 if x ∈ A
             0 if x ∉ A

Definition: Expectation

The expectation, or expected value, of a random variable X is written E[X] and is

    E[X] = Σ_{x ∈ S} Pr(x) X(x) = Σ_k k × Pr[X = k]

A Quick Calculation...

What if I flip a coin 2 times? What is the expected number of heads?

    E[X] = ¼ × 0 + ½ × 1 + ¼ × 2 = 1
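The two views of a random variable give two equivalent ways to compute an expectation, and both formulas in the definition can be checked on the two-coin example. A small sketch (my own, with hypothetical variable names):

```python
import itertools
from collections import defaultdict
from fractions import Fraction

# Sample space: the 4 equally likely outcomes of two fair coin flips.
S = list(itertools.product("HT", repeat=2))
p = Fraction(1, 4)

X = lambda x: x.count("H")  # random variable: number of heads

# View 1: X as a function on S -> E[X] = sum over samples of Pr(x) X(x)
E_over_samples = sum(p * X(x) for x in S)

# View 2: the induced distribution on X's values
#         -> Pr[X = k] = Pr({x in S | X(x) = k}), then E[X] = sum_k k Pr[X = k]
dist = defaultdict(Fraction)
for x in S:
    dist[X(x)] += p

E_over_values = sum(k * pr for k, pr in dist.items())

print(dict(dist))                       # {0: 1/4, 1: 1/2, 2: 1/4} (some order)
print(E_over_samples, E_over_values)    # 1 1
```

Both computations give E[X] = 1, matching the quick calculation above.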
What if I flip a coin 3 times? What is the expected number of heads?

    E[X] = (1/8)×0 + (3/8)×1 + (3/8)×2 + (1/8)×3 = 1.5

But Pr[X = 1.5] = 0. Moral: don't always expect the expected. Pr[X = E[X]] may be 0!

Type Checking

A random variable is the type of thing you might want to know an expected value of. If you are computing an expectation, the thing whose expectation you are computing is a random variable.

Indicator R.V.'s: E[X_A] = Pr(A)

For the indicator random variable X_A of an event A:

    E[X_A] = 1 × Pr(X_A = 1) = Pr(A)

Adding Random Variables

If X and Y are random variables (on the same set S), then Z = X + Y is also a random variable: Z(x) = X(x) + Y(x).

E.g., rolling two dice: X = 1st die, Y = 2nd die, Z = sum of the two dice.

Another example: consider picking a random person in the world. Let X = the length of the person's left arm in inches, Y = the length of the person's right arm in inches, and Z = X + Y. Then Z measures the combined arm length.

Independence

Two random variables X and Y are independent if for every a, b, the events "X = a" and "Y = b" are independent.

How about the case X = 1st die, Y = 2nd die? (Independent.) X = left arm, Y = right arm? (Very much not independent.)

Linearity of Expectation

If Z = X + Y, then

    E[Z] = E[X] + E[Y]

even if X and Y are not independent!

    E[Z] = Σ_{x ∈ S} Pr[x] Z(x)
         = Σ_{x ∈ S} Pr[x] (X(x) + Y(x))
         = Σ_{x ∈ S} Pr[x] X(x) + Σ_{x ∈ S} Pr[x] Y(x)
         = E[X] + E[Y]

E.g., 2 fair flips: X = 1st coin, Y = 2nd coin, Z = X + Y = total # of heads.

    outcome   X, Y, Z
    HH        1, 1, 2
    HT        1, 0, 1
    TH        0, 1, 1
    TT        0, 0, 0

What is E[X]? E[Y]? E[Z]? (½, ½, and 1.)

E.g., 2 fair flips: X = "at least one coin is heads", Y = "both coins are heads", Z = X + Y.

    outcome   X, Y, Z
    HH        1, 1, 2
    HT        1, 0, 1
    TH        1, 0, 1
    TT        0, 0, 0

Are X and Y independent? (No.) What is E[X]? E[Y]? E[Z]? (¾, ¼, and 1.)

By Induction

    E[X1 + X2 + … + Xn] = E[X1] + E[X2] + … + E[Xn]
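The second example above is the striking one: X and Y are clearly dependent, yet E[Z] = E[X] + E[Y] still holds. A minimal sketch verifying this (my own illustration; the names `X`, `Y`, `Z`, `E` mirror the slides):

```python
import itertools
from fractions import Fraction

S = list(itertools.product("HT", repeat=2))  # 2 fair flips, uniform
p = Fraction(1, 4)

X = lambda x: 1 if "H" in x else 0         # indicator: at least one head
Y = lambda x: 1 if x == ("H", "H") else 0  # indicator: both heads
Z = lambda x: X(x) + Y(x)                  # sum of the two R.V.'s

E = lambda f: sum(p * f(x) for x in S)     # E[f] = sum_x Pr(x) f(x)

# X and Y are NOT independent:
pr_X1 = E(X)  # for an indicator, E equals the event's probability: 3/4
pr_Y1 = E(Y)  # 1/4
pr_both = sum(p for x in S if X(x) == 1 and Y(x) == 1)  # 1/4
print(pr_both == pr_X1 * pr_Y1)  # False: 1/4 != 3/16

# ...yet linearity of expectation still holds:
print(E(Z), E(X) + E(Y))  # 1 1
```

Nothing in the derivation of linearity used independence, and this is exactly what makes the tool so powerful in the envelope problem below.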
The expectation of the sum = the sum of the expectations.

It is finally time to show off our probability prowess...

If I randomly put 100 letters into 100 addressed envelopes, on average how many letters will end up in their correct envelopes?

Hmm...
    Σ_k k × Pr(exactly k letters end up in correct envelopes) = ... (aargh!!)

Use Linearity of Expectation

Let Ai be the event that the i-th letter ends up in its correct envelope, and let Xi be its indicator random variable:

    Xi = 1 if Ai occurs
         0 otherwise

The number of correctly placed letters is X1 + X2 + … + X100, and each E[Xi] = Pr(Ai) = 1/100, so by linearity of expectation the answer is 100 × (1/100) = 1.
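The exact answer of 1 can also be sanity-checked empirically. A small Monte Carlo sketch (my own addition, not part of the lecture; the helper `correct_letters` and the trial counts are hypothetical choices):

```python
import random

def correct_letters(n, rng):
    """Shuffle n letters into n envelopes; count letters that land
    in their own envelope (fixed points of a random permutation)."""
    perm = list(range(n))
    rng.shuffle(perm)
    return sum(1 for letter, env in enumerate(perm) if letter == env)

rng = random.Random(251)  # fixed seed for reproducibility
n, trials = 100, 20000
avg = sum(correct_letters(n, rng) for _ in range(trials)) / trials

# Linearity of expectation predicts exactly 1, for ANY n:
# E[X1 + ... + Xn] = n * (1/n) = 1.
print(round(avg, 2))  # should be close to 1
```

The simulation hovers near 1 regardless of n, matching the derivation: the answer does not even depend on how many letters there are.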