MIT 6 042J - Random Variables

Chapter 20 Random Variables

So far we focused on probabilities of events: that you win the Monty Hall game; that you have a rare medical condition, given that you tested positive; and so on. Now we focus on quantitative questions: How many contestants must play the Monty Hall game until one of them finally wins? How long will this condition last? How much will I lose playing 6.042 games all day? Random variables are the mathematical tool for addressing such questions.

20.1 Random Variable Examples

Definition 20.1.1. A random variable, R, on a probability space is a total function whose domain is the sample space. The codomain of R can be anything, but will usually be a subset of the real numbers.

Notice that the name "random variable" is a misnomer; random variables are actually functions!

For example, suppose we toss three independent, unbiased coins. Let C be the number of heads that appear. Let M = 1 if the three coins come up all heads or all tails, and let M = 0 otherwise. Now every outcome of the three coin flips uniquely determines the values of C and M. For example, if we flip heads, tails, heads, then C = 2 and M = 0. If we flip tails, tails, tails, then C = 0 and M = 1. In effect, C counts the number of heads, and M indicates whether all the coins match.

Since each outcome uniquely determines C and M, we can regard them as functions mapping outcomes to numbers. For this experiment, the sample space is:

    S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.

Now C is a function that maps each outcome in the sample space to a number as follows:

    C(HHH) = 3        C(THH) = 2
    C(HHT) = 2        C(THT) = 1
    C(HTH) = 2        C(TTH) = 1
    C(HTT) = 1        C(TTT) = 0.

Similarly, M is a function mapping each outcome another way:

    M(HHH) = 1        M(THH) = 0
    M(HHT) = 0        M(THT) = 0
    M(HTH) = 0        M(TTH) = 0
    M(HTT) = 0        M(TTT) = 1.

So C and M are random variables.
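The definitions above can be sketched directly in code. The following is an illustrative Python sketch (not part of the original text): the sample space is enumerated explicitly, and C and M are ordinary functions mapping outcomes to numbers, just as the definition says.

```python
from itertools import product

def sample_space():
    """All 8 equally likely outcomes of three independent fair coin flips."""
    return ["".join(flips) for flips in product("HT", repeat=3)]

def C(outcome):
    """Random variable C: the number of heads in the outcome."""
    return outcome.count("H")

def M(outcome):
    """Random variable M: 1 if all three coins match, 0 otherwise."""
    return 1 if outcome in ("HHH", "TTT") else 0

# Tabulate both random variables over the whole sample space.
for w in sample_space():
    print(f"C({w}) = {C(w)}, M({w}) = {M(w)}")
```

Running this reproduces the two tables above, e.g. C(HTH) = 2, M(HTH) = 0 and C(TTT) = 0, M(TTT) = 1.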
20.1.1 Indicator Random Variables

An indicator random variable is a random variable that maps every outcome to either 0 or 1. These are also called Bernoulli variables. The random variable M is an example. If all three coins match, then M = 1; otherwise, M = 0.

Indicator random variables are closely related to events. In particular, an indicator partitions the sample space into those outcomes mapped to 1 and those outcomes mapped to 0. For example, the indicator M partitions the sample space into two blocks as follows:

    HHH TTT                        HHT HTH HTT THH THT TTH
    (M = 1)                        (M = 0)

In the same way, an event, E, partitions the sample space into those outcomes in E and those not in E. So E is naturally associated with an indicator random variable, I_E, where I_E(p) = 1 for outcomes p in E and I_E(p) = 0 for outcomes p not in E. Thus, M = I_F where F is the event that all three coins match.

20.1.2 Random Variables and Events

There is a strong relationship between events and more general random variables as well. A random variable that takes on several values partitions the sample space into several blocks. For example, C partitions the sample space as follows:

    TTT        TTH THT HTT        THH HTH HHT        HHH
    (C = 0)    (C = 1)            (C = 2)            (C = 3)

Each block is a subset of the sample space and is therefore an event. Thus, we can regard an equation or inequality involving a random variable as an event. For example, the event that C = 2 consists of the outcomes THH, HTH, and HHT. The event C <= 1 consists of the outcomes TTT, TTH, THT, and HTT.

Naturally enough, we can talk about the probability of events defined by properties of random variables. For example,

    Pr{C = 2} = Pr{THH} + Pr{HTH} + Pr{HHT} = 1/8 + 1/8 + 1/8 = 3/8.

20.1.3 Independence

The notion of independence carries over from events to random variables as well.
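Since an equation or inequality involving a random variable is just an event, its probability over a uniform sample space is the fraction of outcomes satisfying it. A short Python sketch (the `prob` helper is an illustrative name, not from the text) makes the Pr{C = 2} computation above concrete:

```python
from fractions import Fraction
from itertools import product

# The uniform sample space of three fair coin flips.
outcomes = ["".join(flips) for flips in product("HT", repeat=3)]

def prob(predicate):
    """Probability of an event, i.e. the fraction of equally likely
    outcomes satisfying the predicate."""
    favorable = [w for w in outcomes if predicate(w)]
    return Fraction(len(favorable), len(outcomes))

C = lambda w: w.count("H")          # number of heads

print(prob(lambda w: C(w) == 2))    # 3/8, matching the sum above
print(prob(lambda w: C(w) <= 1))    # 1/2: the four outcomes TTT, TTH, THT, HTT
```

Using `Fraction` keeps the probabilities exact, so the output matches the hand computation digit for digit.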
Random variables R1 and R2 are independent iff for all x1 in the codomain of R1, and x2 in the codomain of R2, we have:

    Pr{R1 = x1 AND R2 = x2} = Pr{R1 = x1} · Pr{R2 = x2}.

As with events, we can formulate independence for random variables in an equivalent and perhaps more intuitive way: random variables R1 and R2 are independent if for all x1 and x2,

    Pr{R1 = x1 | R2 = x2} = Pr{R1 = x1}

whenever the lefthand conditional probability is defined, that is, whenever Pr{R2 = x2} > 0.

As an example, are C and M independent? Intuitively, the answer should be "no": the number of heads, C, completely determines whether all three coins match, that is, whether M = 1. But to verify this intuition, we must find some x1, x2 in R such that:

    Pr{C = x1 AND M = x2} != Pr{C = x1} · Pr{M = x2}.

One appropriate choice of values is x1 = 2 and x2 = 1. In this case, we have:

    Pr{C = 2 AND M = 1} = 0 != 1/4 · 3/8 = Pr{M = 1} · Pr{C = 2}.

The first probability is zero because we never have exactly two heads (C = 2) when all three coins match (M = 1). The other two probabilities were computed earlier.

On the other hand, let H1 be the indicator variable for the event that the first flip is a head, so

    [H1 = 1] = {HHH, HTH, HHT, HTT}.

Then H1 is independent of M, since

    Pr{M = 1} = 1/4 = Pr{M = 1 | H1 = 1} = Pr{M = 1 | H1 = 0}
    Pr{M = 0} = 3/4 = Pr{M = 0 | H1 = 1} = Pr{M = 0 | H1 = 0}.

This example is an instance of a simple lemma:

Lemma 20.1.2. Two events are independent iff their indicator variables are independent.

As with events, the notion of independence generalizes to more than two random variables.

Definition 20.1.3. Random variables R1, R2, ..., Rn are mutually independent iff

    Pr{R1 = x1 AND R2 = x2 AND ... AND Rn = xn} = Pr{R1 = x1} · Pr{R2 = x2} ... Pr{Rn = xn}

for all x1, x2, ..., xn.
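The definition of independence can be checked mechanically over a finite sample space by comparing every joint probability against the product of marginals. The following Python sketch (the `independent` helper is an assumed name, not from the text) confirms both conclusions above: C and M are dependent, while H1 and M are independent.

```python
from fractions import Fraction
from itertools import product

outcomes = ["".join(flips) for flips in product("HT", repeat=3)]

def prob(pred):
    """Probability of an event over the uniform three-coin sample space."""
    return Fraction(sum(1 for w in outcomes if pred(w)), len(outcomes))

def independent(R1, R2):
    """True iff Pr{R1=x1 AND R2=x2} = Pr{R1=x1} * Pr{R2=x2}
    for every pair of values (x1, x2) the variables can take."""
    values1 = {R1(w) for w in outcomes}
    values2 = {R2(w) for w in outcomes}
    return all(
        prob(lambda w: R1(w) == a and R2(w) == b)
        == prob(lambda w: R1(w) == a) * prob(lambda w: R2(w) == b)
        for a in values1 for b in values2
    )

C  = lambda w: w.count("H")                       # number of heads
M  = lambda w: 1 if w in ("HHH", "TTT") else 0    # all coins match
H1 = lambda w: 1 if w[0] == "H" else 0            # first flip is a head

print(independent(C, M))    # False: e.g. Pr{C=2 AND M=1} = 0 != 3/8 * 1/4
print(independent(H1, M))   # True, as the conditional probabilities show
```

Note that one violating pair (x1, x2) suffices to show dependence, but showing independence requires checking every pair, which is exactly what `independent` does.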