Contents

  Betting on Coins
  Equivalent Definitions of Expectation
  Mean Time to Failure
  Making a Baby Girl
  An Expectation Paradox
  Linearity of Expectation
  Expected Value of Two Dice
  The Hat-Check Problem
  The Chinese Appetizer Problem

6.042/18.062J Mathematics for Computer Science
May 3, 2005
Srini Devadas and Eric Lehman

Lecture Notes

Expected Value I

The expectation or expected value of a random variable is a single number that tells you a lot about the behavior of the variable. Roughly, the expectation is the average value, where each value is weighted according to the probability that it comes up. Formally, the expected value of a random variable R defined on a sample space S is:

$$\mathrm{Ex}(R) = \sum_{w \in S} R(w) \Pr(w)$$

To appreciate its significance, suppose S is the set of students in a class, and we select a student uniformly at random. Let R be the selected student's exam score. Then Ex(R) is just the class average: the first thing everyone wants to know after getting their test back! In the same way, expectation is usually the first thing one wants to determine about any random variable.

Let's work through an example. Let R be the number that comes up on a fair, six-sided die. Then the expected value of R is:

$$\mathrm{Ex}(R) = \sum_{k=1}^{6} k \cdot \frac{1}{6}
= 1 \cdot \frac{1}{6} + 2 \cdot \frac{1}{6} + 3 \cdot \frac{1}{6} + 4 \cdot \frac{1}{6} + 5 \cdot \frac{1}{6} + 6 \cdot \frac{1}{6}
= \frac{7}{2}$$

This calculation shows that the name "expected value" is a little misleading; the random variable might never actually take on that value. You can't roll a 3 1/2 on an ordinary die!

1 Betting on Coins

Dan, Eric, and Nick decide to play a fun game. Each player puts $2 on the table and secretly writes down either "heads" or "tails". Then one of them tosses a fair coin. The $6 on the table is divided evenly among the players who correctly predicted the outcome of the coin toss. If everyone guessed incorrectly, then everyone takes their money back. After many repetitions of this game, Dan has lost a lot of money, more than can be explained by bad luck. What's going on?

A tree diagram for this problem is worked out below, under the assumptions that everyone guesses correctly with probability 1/2 and everyone is correct independently. (The original figure is a tree; its leaves are tabulated here.)

  Dan right?   Eric right?   Nick right?   Probability   Dan's payoff
      Y            Y             Y             1/8            $0
      Y            Y             N             1/8            $1
      Y            N             Y             1/8            $1
      Y            N             N             1/8            $4
      N            Y             Y             1/8           -$2
      N            Y             N             1/8           -$2
      N            N             Y             1/8           -$2
      N            N             N             1/8            $0

In the "payoff" column, we're accounting for the fact that Dan has to put in $2 just to play. So, for example, if he guesses correctly and Eric and Nick are wrong, then he takes all $6 on the table, but his net profit is only $4. Working from the tree diagram, Dan's expected payoff is:

$$\mathrm{Ex}(\text{payoff}) = 0 \cdot \frac{1}{8} + 1 \cdot \frac{1}{8} + 1 \cdot \frac{1}{8} + 4 \cdot \frac{1}{8} + (-2) \cdot \frac{1}{8} + (-2) \cdot \frac{1}{8} + (-2) \cdot \frac{1}{8} + 0 \cdot \frac{1}{8} = 0$$

So the game is perfectly fair! Over time, he should neither win nor lose money.

The trick is that Nick and Eric are collaborating; in particular, they always make opposite guesses. So our assumption that everyone is correct independently is wrong; actually, the events that Nick is correct and Eric is correct are mutually exclusive! As a result, Dan can never win all the money on the table. When he guesses correctly, he always has to split his winnings with someone else.
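Before redrawing the tree diagram, it is easy to check this story empirically. The short Python simulation below is a supplement to these notes, not part of them: it plays the game repeatedly under both assumptions (all guesses independent, versus Eric and Nick always guessing oppositely) and averages Dan's payoff. The function names and trial count are arbitrary choices.

```python
import random

def play_round(collude):
    """Play one round of the coin game; return Dan's net payoff."""
    dan = random.choice("HT")
    eric = random.choice("HT")
    # If Eric and Nick collude, Nick always guesses the opposite of Eric.
    nick = ("T" if eric == "H" else "H") if collude else random.choice("HT")
    coin = random.choice("HT")

    winners = [g for g in (dan, eric, nick) if g == coin]
    if not winners:
        return 0                 # nobody was right: everyone takes back their $2
    share = 6 / len(winners)     # the $6 pot is split among the correct guessers
    return (share if dan == coin else 0) - 2   # subtract Dan's $2 stake

def average_payoff(collude, trials=100_000):
    return sum(play_round(collude) for _ in range(trials)) / trials

print("independent guesses:", average_payoff(False))   # hovers near  0.0
print("Eric/Nick opposite: ", average_payoff(True))    # hovers near -0.5
```

With independent guesses the average payoff hovers near $0, exactly as the first tree diagram predicts; with the collusion it settles near -$0.50, a figure the corrected tree diagram below explains exactly.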
This lowers his overall expectation, as the corrected tree diagram below shows. Leaves ruled out by the collaboration now have probability 0, and each of the four remaining leaves has probability 1/4:

  Dan right?   Eric right?   Nick right?   Probability   Dan's payoff
      Y            Y             Y              0             $0
      Y            Y             N             1/4            $1
      Y            N             Y             1/4            $1
      Y            N             N              0             $4
      N            Y             Y              0            -$2
      N            Y             N             1/4           -$2
      N            N             Y             1/4           -$2
      N            N             N              0             $0

From this revised tree diagram, we can work out Dan's actual expected payoff:

$$\mathrm{Ex}(\text{payoff}) = 0 \cdot 0 + 1 \cdot \frac{1}{4} + 1 \cdot \frac{1}{4} + 4 \cdot 0 + (-2) \cdot 0 + (-2) \cdot \frac{1}{4} + (-2) \cdot \frac{1}{4} + 0 \cdot 0 = -\frac{1}{2}$$

So he loses an average of a half-dollar per game!

Similar opportunities for subtle cheating come up in many betting games. For example, a group of friends once organized a football pool where each participant would guess the outcome of every game each week relative to the spread. This may mean nothing to you, but the upshot is that everyone was effectively betting on the outcomes of 12 or 13 coin tosses each week. The person who correctly predicted the most coin tosses won a lot of money. The organizer, thinking in terms of the first tree diagram, swore up and down that there was no way to get an unfair "edge". But actually the number of participants was small enough that just two players betting oppositely could gain a substantial advantage!

Another example involves a former MIT professor of statistics, Herman Chernoff. State lotteries are the worst gambling games around because the state pays out only a fraction of the money it takes in. But Chernoff figured out a way to win! Here are the rules for a typical lottery:

• All players pay $1 to play and select 4 numbers from 1 to 36.
• The state draws 4 numbers from 1 to 36 uniformly at random.
• The state divides 1/2 the money collected among the people who guessed correctly and spends the other half repairing the Big Dig.

This is a lot like our betting game, except that there are more players and more choices. Chernoff discovered that a small set of numbers was selected by a large fraction of the population; apparently many people think the same way. It was as if the players were collaborating to lose! If any one of them guessed correctly, then they'd have to split the pot with many other players. By selecting numbers uniformly at random, Chernoff was unlikely to get one of these favored sequences. So if he won, he'd likely get the whole pot! By analyzing actual state lottery data, he determined that he could win an average of 7 cents on the dollar this way! (A toy numerical illustration of this pot-splitting effect appears after Section 2 below.)

2 Equivalent Definitions of Expectation

There are some other ways of writing the definition of expectation. Sometimes using one of these other formulations can make computing an expectation a lot easier. One option is to group together all outcomes on which the random variable takes on the same value.

Theorem 1.

$$\mathrm{Ex}(R) = \sum_{x \in \mathrm{range}(R)} x \cdot \Pr(R = x)$$

Proof. We'll transform the left side into the right. Let [R = x] be the event that R = x.

$$\begin{aligned}
\mathrm{Ex}(R) &= \sum_{w \in S} R(w)\,\Pr(w) \\
&= \sum_{x \in \mathrm{range}(R)} \; \sum_{w \in [R=x]} R(w)\,\Pr(w) \\
&= \sum_{x \in \mathrm{range}(R)} \; \sum_{w \in [R=x]} x\,\Pr(w) \\
&= \sum_{x \in \mathrm{range}(R)} \left( x \cdot \sum_{w \in [R=x]} \Pr(w) \right) \\
&= \sum_{x \in \mathrm{range}(R)} x \cdot \Pr(R = x)
\end{aligned}$$

On the second line, we break the single sum into two. The outer sum runs over all possible values x that R can take, and the inner sum runs over the outcomes w on which R takes that value.
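As a quick sanity check on Theorem 1 (again a supplement to the notes, with the two-dice example chosen by this editor), the sketch below computes an expectation both ways, once outcome by outcome and once grouped by value, using exact rational arithmetic:

```python
from fractions import Fraction
from itertools import product

# Sample space S: ordered pairs of fair die rolls, each with probability 1/36.
space = {w: Fraction(1, 36) for w in product(range(1, 7), repeat=2)}

def R(w):
    """R is the sum of the two dice."""
    return w[0] + w[1]

# Original definition: sum R(w) Pr(w) over individual outcomes w in S.
by_outcome = sum(R(w) * p for w, p in space.items())

# Theorem 1: sum x Pr(R = x) over the values x in range(R).
by_value = sum(
    x * sum(p for w, p in space.items() if R(w) == x)
    for x in {R(w) for w in space}
)

print(by_outcome, by_value)   # both print 7, as Theorem 1 promises
```

Grouping by value collapses the 36-term sum into 11 terms (one per value 2 through 12), which is exactly the kind of simplification the theorem is for.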

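The tree-diagram calculations in Section 1 can be double-checked the same way. This sketch (supplementary; the leaf data is transcribed from the two payoff tables above, in the same row order) evaluates both expectations with exact fractions:

```python
from fractions import Fraction

# Leaves of the two tree diagrams in Section 1: Dan's payoffs and the
# leaf probabilities, listed in the same row order as the tables above.
payoffs = [0, 1, 1, 4, -2, -2, -2, 0]
independent_probs = [Fraction(1, 8)] * 8
colluding_probs = [Fraction(0), Fraction(1, 4), Fraction(1, 4), Fraction(0),
                   Fraction(0), Fraction(1, 4), Fraction(1, 4), Fraction(0)]

def expectation(probs, values):
    return sum(p * v for p, v in zip(probs, values))

print(expectation(independent_probs, payoffs))   # prints 0
print(expectation(colluding_probs, payoffs))     # prints -1/2
```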

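Finally, here is the toy lottery illustration promised in Section 1. Every parameter below is hypothetical; the notes give no figures beyond the rules and the 7-cents result. The sketch also simplifies by dividing the pot by the expected number of winners rather than the actual one, so it shows only the direction and rough size of the pot-splitting effect:

```python
from math import comb

# Hypothetical parameters, invented purely for illustration.
n_players = 10_000        # other players, each buying one $1 ticket
all_combos = comb(36, 4)  # 58905 possible 4-number selections
popular_pool = 100        # a small set of combinations many players favor
popular_fraction = 0.6    # fraction of players choosing from that pool

pot = (n_players + 1) / 2  # the state pays out half of the money collected

def expected_return(pick_popular):
    """Rough expected return on a $1 ticket under the simplification above."""
    p_win = 1 / all_combos   # the state draws uniformly, so every ticket
                             # has the same chance of matching the draw
    if pick_popular:
        co_winners = n_players * (popular_fraction / popular_pool
                                  + (1 - popular_fraction) / all_combos)
    else:
        # A uniformly chosen ticket almost never lands in the popular pool.
        co_winners = n_players * (1 - popular_fraction) / all_combos
    return p_win * pot / (1 + co_winners)

print("popular ticket:", expected_return(True))    # a fraction of a cent
print("uniform ticket:", expected_return(False))   # several cents on the dollar
```

With these made-up numbers, a popular ticket returns a fraction of a cent per dollar while a uniformly random ticket returns several cents: the qualitative gap Chernoff exploited.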