UCLA STATS 100A - July 25th - Lecture, Part II


Summer 2013 Stat 100/Sanchez - Intro Probability (13.7.24)

Jointly Distributed Discrete Random Variables: the two-variable case

This topic is scattered throughout Chapter 6 in Ross, mixed in with the discussion of the continuous case.

Announcements
• Homework 5 is due Tuesday, July 30th.
• Midterm 2 is on August 1st in Broad 2160E, the same room as Midterm 1.

Today: discrete bivariate distributions
• 1. Two random variables and their joint distribution. Distributions that can be obtained from a joint distribution (marginal distributions, conditional distributions). Independence of two random variables.
• 2. Functions of two random variables.
• 3. Covariance as a function of two random variables.
• 4. Relations between covariances.
• 5. The multinomial family of multivariate discrete distributions.

The joint probability mass function

Let X and Y be discrete random variables. The joint probability mass function of X and Y, defined for all real numbers x and y, is given by

  P(x, y) = P(X = x, Y = y),

and satisfies

  P(x, y) ≥ 0  and  Σ_{all x, y} P(x, y) = 1.
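The defining properties can be checked numerically. Below is a minimal sketch (the dict representation and variable names are our own, not from the slides) that stores a joint pmf as a Python dict keyed by (x, y) pairs, using the calls/orders numbers from Example 1 later in these notes:

```python
# A joint pmf stored as {(x, y): P(X=x, Y=y)}; the dict representation
# is our own, with the numbers taken from Example 1 in these notes.
joint = {(0, 0): 0.04, (1, 0): 0.16, (1, 1): 0.10,
         (2, 0): 0.20, (2, 1): 0.30, (2, 2): 0.20}

# Pairs not listed have probability zero; every P(x, y) is nonnegative
# and the probabilities sum to 1 over all (x, y), as the definition requires.
assert all(p >= 0 for p in joint.values())
assert abs(sum(joint.values()) - 1.0) < 1e-9
print(joint[(2, 1)])   # 0.3
```

Representing the pmf sparsely (omitting zero cells) matches how the example's table is usually given and makes the normalization check a one-liner.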
The marginal probability mass functions, marginal expectations, and marginal variances

The marginal pmfs of X and Y, respectively, are given by

  Marginal for X:  P(x) = Σ_y P(x, y),
    E(X) = Σ_x x P(x),   Var(X) = Σ_x (x − E(X))² P(x).
  Marginal for Y:  P(y) = Σ_x P(x, y),
    E(Y) = Σ_y y P(y),   Var(Y) = Σ_y (y − E(Y))² P(y).

The conditional probability mass functions, conditional expectations, and conditional variances

  P(X | Y = y) = P(X = x and Y = y) / P(Y = y),
    E(X | Y = y) = Σ_x x P(x | Y = y),
    Var(X | Y = y) = Σ_x (x − E(X | Y = y))² P(x | Y = y).
  P(Y | X = x) = P(X = x and Y = y) / P(X = x),
    E(Y | X = x) = Σ_y y P(y | X = x),
    Var(Y | X = x) = Σ_y (y − E(Y | X = x))² P(y | X = x).

There are as many conditional distributions of X given Y as there are values of Y; similarly, there are as many conditional distributions of Y given X as there are values of X.

Example 1

A firm that sells word processing systems keeps track of the number of customers who call on any one day and the number of orders placed on any one day. Let X denote the number of calls, Y the number of orders placed, and P(X, Y) the joint probability mass function of X and Y. Records indicate that P(0,0) = 0.04, P(1,0) = 0.16, P(1,1) = 0.1, P(2,0) = 0.2, P(2,1) = 0.3, and P(2,2) = 0.2. Thus, on any given day, the probability of, say, two calls and one order is 0.3.
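The marginal and conditional formulas translate directly into code. A sketch (the names `joint`, `px`, `py` are our own) that computes the marginals by summing out the other variable, and the conditional pmf of X given Y = 0:

```python
from collections import defaultdict

# Marginals of the calls/orders joint pmf from Example 1.
joint = {(0, 0): 0.04, (1, 0): 0.16, (1, 1): 0.10,
         (2, 0): 0.20, (2, 1): 0.30, (2, 2): 0.20}

px, py = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    px[x] += p                      # P(X = x) = sum over y of P(x, y)
    py[y] += p                      # P(Y = y) = sum over x of P(x, y)

# Conditional pmf of X given Y = 0: P(x, 0) / P(Y = 0).
px_given_y0 = {x: joint.get((x, 0), 0.0) / py[0] for x in px}

print({x: round(p, 2) for x, p in px.items()})           # ≈ {0: 0.04, 1: 0.26, 2: 0.7}
print({x: round(p, 2) for x, p in px_given_y0.items()})  # ≈ {0: 0.1, 1: 0.4, 2: 0.5}
```

Note that the conditional pmf is just the Y = 0 row of the table rescaled by its row sum, so it sums to 1 on its own.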
Tabular representation of the joint pmf, P(X = x, Y = y)

X = number of calls (columns), Y = number of orders (rows):

            X = 0   X = 1   X = 2
  Y = 0     0.04    0.16    0.2
  Y = 1     0       0.1     0.3
  Y = 2     0       0       0.2

Finding joint probabilities of events

Example: the event (X < 2 and Y > 0).

  P(X < 2, Y > 0) = P(X = 0, Y = 1) + P(X = 0, Y = 2) + P(X = 1, Y = 1) + P(X = 1, Y = 2)
                  = 0 + 0 + 0.1 + 0 = 0.1.

Marginal pmf for X, the number of calls (column sums): P(X = x) = Σ_y P(X = x, Y = y)

  X       0       1       2
  P(X)    0.04    0.26    0.7

Marginal expected value and variance of X

  μ = E(X) = Σ_x x P(x) = 0(0.04) + 1(0.26) + 2(0.7) = 1.66
  σ² = Var(X) = E(X²) − [E(X)]² = 1(0.26) + 4(0.7) − 1.66² = 0.3044
  σ = Sd(X) = √0.3044 = 0.5517

Marginal pmf for Y, the number of orders (row sums): P(Y = y) = Σ_x P(X = x, Y = y)

  Y       0       1       2
  P(Y)    0.4     0.4     0.2

Marginal expected value and variance of Y

  μ = E(Y) = Σ_y y P(y) = 0(0.4) + 1(0.4) + 2(0.2) = 0.8
  σ² = Var(Y) = E(Y²) − [E(Y)]² = 1(0.4) + 4(0.2) − 0.8² = 0.56
  σ = Sd(Y) = √0.56 = 0.7483

Independence of the two random variables

• The random variables X and Y are independent if P(X, Y) = P(X)P(Y) for all values of x and y.
• In our calls/orders example, P(X = 0, Y = 0) = 0.04, while P(X = 0) = 0.04 and P(Y = 0) = 0.4, so P(X = 0)P(Y = 0) = 0.016. Since P(X = 0, Y = 0) ≠ P(X = 0)P(Y = 0), X and Y are not independent. Finding one pair for which the condition fails is enough; had the two been equal, however, we would have had to check the condition for every pair of values of X and Y.
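The marginal moments and the independence check can be reproduced in a few lines. A sketch, assuming the same dict representation of the joint pmf (the helper names `marginal` and `mean_var` are our own):

```python
# Marginal moments and the independence check for the calls/orders example.
joint = {(0, 0): 0.04, (1, 0): 0.16, (1, 1): 0.10,
         (2, 0): 0.20, (2, 1): 0.30, (2, 2): 0.20}

def marginal(joint, axis):
    """Marginal pmf obtained by summing out the other variable
    (axis 0 keeps X, axis 1 keeps Y)."""
    m = {}
    for xy, p in joint.items():
        m[xy[axis]] = m.get(xy[axis], 0.0) + p
    return m

def mean_var(pmf):
    """E and Var of a univariate pmf given as {value: probability}."""
    mu = sum(v * p for v, p in pmf.items())
    return mu, sum((v - mu) ** 2 * p for v, p in pmf.items())

px, py = marginal(joint, 0), marginal(joint, 1)
print(mean_var(px))   # ≈ (1.66, 0.3044), matching the slides
print(mean_var(py))   # ≈ (0.80, 0.56)

# Independence requires P(x, y) = P(x)P(y) for *every* pair of values;
# a single failing pair is enough to conclude dependence.
independent = all(abs(joint.get((x, y), 0.0) - px[x] * py[y]) < 1e-9
                  for x in px for y in py)
print(independent)    # False: P(0,0) = 0.04 but P(X=0)P(Y=0) = 0.016
```

The `all(...)` expression mirrors the logic in the notes: it only returns True if the product rule holds at every cell, including the zero cells of the table.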
Conditional probability mass functions in our example

Dividing the Y = 0 row by P(Y = 0) = 0.4, and the X = 2 column by P(X = 2) = 0.7:

  x              0       1       2
  P(x | Y = 0)   0.1     0.4     0.5

  y              0       1       2
  P(y | X = 2)   0.286   0.429   0.286

Conditional expectations (examples)

• E(X | Y = 0) = 0(0.1) + 1(0.4) + 2(0.5) = 1.4
• E(Y | X = 2) = 0(0.286) + 1(0.429) + 2(0.286) ≈ 1.0 (exactly (0.3 + 0.4)/0.7 = 1; the small discrepancy comes from rounding the conditional probabilities)

2. Expected values of functions of two random variables

When we encounter problems that involve more than one random variable, we often combine the variables into a single function. Suppose the discrete random variables (X, Y) have joint probability function P(x, y). If g(X, Y) is any real-valued function of (X, Y), then

  E[g(X, Y)] = Σ_x Σ_y g(x, y) P(x, y).

Examples of functions of two random variables

Let X, Y be two discrete random variables with joint pmf P(X, Y). We can define the following functions of X and Y:

  g(x, y) = x,   g(x, y) = y,   g(x, y) = x²,   g(x, y) = y²,
  g(x, y) = xy,   g(x, y) = x + y,
  g(x, y) = (x − μ_X)²,   g(x, y) = (y − μ_Y)²,   g(x, y) = (x − μ_X)(y − μ_Y).

Expected values of all of these can be found with the general formula above.

A very special function: g(x, y) = xy

Each cell below shows the value of XY, with its probability in brackets:

            X = 0            X = 1            X = 2
  Y = 0   (0)(0)=0 [0.04]  (1)(0)=0 [0.16]  (2)(0)=0 [0.2]
  Y = 1   (0)(1)=0 [0]     (1)(1)=1 [0.1]   (2)(1)=2 [0.3]
  Y = 2   (0)(2)=0 [0]     (1)(2)=2 [0]     (2)(2)=4 [0.2]

(cont.) Collecting equal values of XY gives its pmf:

  XY    P(XY)
  0     [0.04] + [0.16] + [0.2] + [0] + [0] = 0.4
  1     [0.1] = 0.1
  2     [0.3] + [0] = 0.3
  4     [0.2] = 0.2

  E(XY) = 0(0.4) + 1(0.1) + 2(0.3) + 4(0.2) = 1.5

Another special function: g(X, Y) =
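The general formula E[g(X, Y)] = Σ_x Σ_y g(x, y) P(x, y) avoids tabulating the pmf of XY at all. A sketch (the function name `expect` is our own) computing E(XY) and the conditional expectation E(X | Y = 0) directly from the joint pmf:

```python
# E[g(X, Y)] applied to g(x, y) = x*y for the calls/orders example;
# the representation and names are our own.
joint = {(0, 0): 0.04, (1, 0): 0.16, (1, 1): 0.10,
         (2, 0): 0.20, (2, 1): 0.30, (2, 2): 0.20}

def expect(g, joint):
    """E[g(X, Y)] = sum over (x, y) of g(x, y) * P(x, y)."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

exy = expect(lambda x, y: x * y, joint)
print(round(exy, 4))          # 1.5, agreeing with the pmf-of-XY table

# Conditional expectation E(X | Y = 0), as in the examples above:
py0 = sum(p for (_, y), p in joint.items() if y == 0)
ex_given_y0 = sum(x * p / py0 for (x, y), p in joint.items() if y == 0)
print(round(ex_given_y0, 4))  # 1.4
```

Both routes, tabulating the pmf of XY or summing g(x, y)P(x, y) over the original table, give the same answer; the direct sum is usually less error-prone by hand as well.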

