Great Theoretical Ideas In Computer Science
Steven Rudich, Anupam Gupta
CS 15-251, Spring 2005
Lecture 21, March 29, 2005
Carnegie Mellon University

15-251 Classics

Today, we will learn about a formidable tool in probability that will allow us to solve problems that seem really, really messy...

If I randomly put 100 letters into 100 addressed envelopes, on average how many letters will end up in their correct envelopes?

Hmm...
Σ_k k · Pr(exactly k letters end up in correct envelopes) = Σ_k k · (...aargh!!...)

On average, in a class of size m, how many pairs of people will have the same birthday?

Σ_k k · Pr(exactly k collisions) = Σ_k k · (...aargh!!!!...)

The new tool is called "Linearity of Expectation". (Expectatus Linearitus, HMU)

Random Variable

To use this new tool, we will also need to understand the concept of a Random Variable.
Today's goal: not too much material, but to understand it well.

Probability Distribution

A (finite) probability distribution D consists of:
• a finite set S of elements (samples);
• for each x ∈ S, a weight or probability p(x) ∈ [0,1];
• the weights must sum to 1.
(In the pictured example, the "sample space" S has seven points with weights 0.3, 0.3, 0.2, 0.1, 0.05, 0.05, 0.)

Flip penny and nickel (unbiased)

S = {TT, HT, TH, HH}, each outcome with probability ¼.

Flip penny and nickel (biased, heads probability = p)

S = {TT, HT, TH, HH} with probabilities (1−p)², p(1−p), p(1−p), p².

An event is a subset

For an event A ⊆ S, Pr[A] = Σ_{x ∈ A} p(x).
(In the pictured example, Pr[A] = 0.55.)

Running Example

I throw a white die and a black die.
Sample space S = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6), (2,1), (2,2), (2,3), (2,4), (2,5), (2,6), (3,1), (3,2), (3,3), (3,4), (3,5), (3,6), (4,1), (4,2), (4,3), (4,4), (4,5), (4,6), (5,1), (5,2), (5,3), (5,4), (5,5), (5,6), (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }, with Pr(x) = 1/36 for all x ∈ S.
E = event that the sum is ≤ 3 = { (1,1), (1,2), (2,1) }.
Pr[E] = |E|/|S| = proportion of E in S = 3/36.

New concept: Random Variables

Random Variable: a (real-valued) function on S.

Toss a white die and a black die; S is the 36-point sample space above. Examples:
• X = value of the white die. X(3,4) = 3, X(1,6) = 1, etc.
• Y = sum of the values of the two dice. Y(3,4) = 7, Y(1,6) = 7, etc.
• W = (value of white die)^(value of black die). W(3,4) = 3⁴ = 81, W(1,6) = 1⁶ = 1.
• Z = (1 if the two dice are equal, 0 otherwise). Z(4,4) = 1, Z(1,6) = 0, etc.

E.g., tossing a fair coin n times

S = all sequences in {H, T}ⁿ.
D = uniform distribution on S: D(x) = (½)ⁿ for all x ∈ S.
Random variables (say n = 10):
• X = # of heads. X(HHHTTHTHTT) = 5.
• Y = (1 if #heads = #tails, 0 otherwise). Y(HHHTTHTHTT) = 1, Y(THHHHTTTTT) = 0.

Notational conventions

Use letters like A, B, E for events.
Use letters like X, Y, f, g for random variables (R.V. = random variable).

Two views of random variables

Think of a R.V. as:
• a function from S to the reals, or
• the induced distribution on the reals.

Two coins tossed

X: {TT, TH, HT, HH} → {0, 1, 2} counts the number of heads.
Each outcome in S has probability ¼; the induced distribution on the reals puts weight ¼ on 0, ½ on 1, and ¼ on 2.

Two dice

I throw a white die and a black die; S is the 36-point sample space above.
X = sum of both dice: the function with X(1,1) = 2, X(1,2) = X(2,1) = 3, ..., X(6,6) = 12.

It's a floor wax AND a dessert topping

It's a variable with a probability distribution on its values.
It's a function on the sample space S.
You should be comfortable with both views.

From Random Variables to Events

For any random variable X and value a, we can define the event A that X = a:
Pr(A) = Pr(X = a) = Pr({x ∈ S | X(x) = a}).

Two coins tossed: with X = number of heads,
Pr(X = 1) = Pr({x ∈ S | X(x) = 1}) = Pr({TH, HT}) = ½.

Two dice: with X = sum of the two dice,
Pr(X = 6) = Pr({x ∈ S | X(x) = 6}) = Pr({(1,5), (2,4), (3,3), (4,2), (5,1)}) = 5/36.

In short: X is a function on the sample space S, and X has a distribution on its values.

From Events to Random Variables
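The two-dice probabilities above lend themselves to a brute-force check. Here is a small sketch, not from the lecture itself (the names S, prob, and X below are my own), that enumerates the 36-point sample space as a list and computes Pr(X = 6) and Pr(sum ≤ 3) exactly:

```python
from fractions import Fraction

# The 36-point sample space for a white die and a black die:
# each ordered pair (white, black) has probability 1/36.
S = [(w, b) for w in range(1, 7) for b in range(1, 7)]

def prob(event):
    """Pr[A] = |A| / |S| under the uniform distribution on S."""
    return Fraction(sum(1 for x in S if event(x)), len(S))

# X = sum of the two dice, viewed as a function on S.
def X(x):
    return x[0] + x[1]

print(prob(lambda x: X(x) == 6))   # → 5/36
print(prob(lambda x: X(x) <= 3))   # → 1/12  (i.e., 3/36, reduced)
```

The same `prob` helper works for any event, since an event is just a subset of S, here represented by its membership predicate.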
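The letters-in-envelopes puzzle posed at the start of the lecture can also be probed empirically before the linearity machinery is developed. This is an illustrative sketch (the helper name `correct_count` is mine, not the lecture's); linearity of expectation gives the exact answer, which is 1 for any number of letters:

```python
import random

def correct_count(n, rng):
    """Shuffle n letters into n envelopes uniformly at random and
    count how many letters land in their own envelope."""
    perm = list(range(n))
    rng.shuffle(perm)
    return sum(perm[i] == i for i in range(n))

rng = random.Random(0)          # fixed seed for reproducibility
trials = 100_000
avg = sum(correct_count(100, rng) for _ in range(trials)) / trials
print(avg)   # hovers near 1.0, matching the linearity-of-expectation answer
```

Note that the simulation estimates the messy sum Σ_k k · Pr(exactly k correct) directly, without ever computing any of the individual Pr(exactly k correct) terms.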