UIUC MATH 370 - Lec 7 Handout
Math 370X - Lecture 7
Joint, Marginal and Conditional Distributions
Saumil Padhya
October 24, 2016

Topics:
(1) Joint distribution of 2 random variables
(2) Marginal distribution
(3) Conditional distribution of Y given X = x
(4) Variance, covariance, coefficient of correlation between 2 random variables
(5) Moment generating function of a joint distribution
(6) Independence of 2 random variables

(1) Joint distribution of 2 random variables

Intuition: Tossing one die results in a single outcome: 1, 2, 3, 4, 5, or 6. Tossing two dice results in a pair of outcomes: (1,1), (2,1), etc.

If X and Y are discrete random variables, then f(x, y) = P[(X = x) ∩ (Y = y)] is the joint probability function. It satisfies

    0 ≤ f(x, y) ≤ 1   and   Σ_x Σ_y f(x, y) = 1.

If X and Y are continuous random variables, then f(x, y) must satisfy

    f(x, y) ≥ 0   and   ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dy dx = 1.

The cdf of a joint distribution is defined as

    F(x, y) = P[(X ≤ x) ∩ (Y ≤ y)].

For continuous distributions,

    F(x, y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f(s, t) dt ds,   with   ∂²F(x, y)/∂x∂y = f(x, y).

For discrete distributions,

    F(x, y) = Σ_{s=-∞}^{x} Σ_{t=-∞}^{y} f(s, t).

The expectation of a function h of the jointly distributed pair (X, Y) is defined as

    E[h(X, Y)] = Σ_x Σ_y h(x, y) · f(x, y)                        (discrete case)
    E[h(X, Y)] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} h(x, y) · f(x, y) dy dx   (continuous case)

(2) Marginal distribution

    f_X(x) = Σ_y f(x, y)  (discrete)   or   f_X(x) = ∫_{-∞}^{∞} f(x, y) dy  (continuous)
    f_Y(y) = Σ_x f(x, y)  (discrete)   or   f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx  (continuous)

(3) Conditional distribution of Y given X = x

    f_{Y|X}(y | X = x) = f(x, y) / f_X(x)
    f(x, y) = f_{Y|X}(y | X = x) · f_X(x)
    E[Y | X = x] = ∫ y · f_{Y|X}(y | X = x) dy
    E[Y^k | X = x] = ∫ y^k · f_{Y|X}(y | X = x) dy
    Var[Y | X = x] = E[Y² | X = x] - (E[Y | X = x])²

    ** Note: ∫_{-∞}^{∞} f_{Y|X}(y | X = x) dy = 1 **

(4) Variance, covariance, coefficient of correlation between 2 random variables

    Cov[X, Y] = E[(X - E[X]) · (Y - E[Y])] = E[XY] - E[X] · E[Y]
    Var[aX + bY + c] = a² · Var[X] + b² · Var[Y] + 2ab · Cov[X, Y]
    ρ(X, Y) = ρ_{X,Y} = Cov[X, Y] / (σ_X · σ_Y),   where -1 ≤ ρ(X, Y) ≤ 1

(5) Moment generating function of a joint distribution

    M_{X,Y}(t₁, t₂) = E[e^{t₁X + t₂Y}]
    E[X^n · Y^m] = ∂^{n+m} M_{X,Y}(t₁, t₂) / (∂t₁^n ∂t₂^m)   evaluated at t₁ = t₂ = 0

(6) Independence of 2 random variables

When 2 random variables are independent, the above
formulas simplify considerably. Here are a few results:

    f(x, y) = f_X(x) · f_Y(y)    (1)
    F(x, y) = F_X(x) · F_Y(y)    (2)

From (1), we get

    f_{Y|X}(y | X = x) = f(x, y) / f_X(x) = f_X(x) · f_Y(y) / f_X(x) = f_Y(y)    (3)

For any functions g and h,

    E[g(X) · h(Y)] = E[g(X)] · E[h(Y)]    (4)

From (4), we get

    E[XY] = E[X] · E[Y]    (5)
    Cov[X, Y] = E[XY] - E[X] · E[Y] = E[X] · E[Y] - E[X] · E[Y] = 0    (6)
    Var[aX + bY + c] = a² · Var[X] + b² · Var[Y] + 2ab · Cov[X, Y] = a² · Var[X] + b² · Var[Y]
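The independence results above can be checked numerically. Below is a minimal sketch in Python using NumPy; the marginal pmfs and the constants a, b, c are hypothetical values chosen only for illustration, not taken from the lecture.

```python
import numpy as np

# Hypothetical marginal pmfs (illustrative values, not from the handout).
f_X = np.array([0.3, 0.7])        # X takes values 0, 1
f_Y = np.array([0.2, 0.5, 0.3])   # Y takes values 0, 1, 2

# Result (1): under independence, f(x, y) = f_X(x) * f_Y(y).
f = np.outer(f_X, f_Y)

x_vals = np.array([0, 1])
y_vals = np.array([0, 1, 2])

E_X = (x_vals * f_X).sum()
E_Y = (y_vals * f_Y).sum()
E_XY = (np.outer(x_vals, y_vals) * f).sum()

# Results (5) and (6): E[XY] = E[X]E[Y], so Cov[X, Y] = 0.
cov = E_XY - E_X * E_Y
assert np.isclose(cov, 0.0)

# Var[aX + bY + c] = a^2 Var[X] + b^2 Var[Y] when Cov[X, Y] = 0.
var_X = ((x_vals - E_X) ** 2 * f_X).sum()
var_Y = ((y_vals - E_Y) ** 2 * f_Y).sum()
a, b, c = 2.0, -3.0, 5.0

# Compute Var[aX + bY + c] directly from the joint pmf and compare.
Z = a * x_vals[:, None] + b * y_vals[None, :] + c
var_Z = ((Z - (Z * f).sum()) ** 2 * f).sum()
assert np.isclose(var_Z, a ** 2 * var_X + b ** 2 * var_Y)
```

The direct computation of Var[aX + bY + c] over the joint pmf agrees with the shortcut formula precisely because the cross term 2ab · Cov[X, Y] vanishes under independence.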


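As a combined worked example of the joint, marginal, conditional, and covariance formulas from sections (1)-(4), here is a small sketch in Python using NumPy. The joint pmf table is hypothetical, chosen only for illustration (note this pair is NOT independent, so the covariance is nonzero).

```python
import numpy as np

# Hypothetical joint pmf f(x, y); rows index x in {0, 1}, columns index y in {0, 1, 2}.
f = np.array([[0.10, 0.20, 0.10],
              [0.20, 0.25, 0.15]])
assert np.isclose(f.sum(), 1.0)   # sum_x sum_y f(x, y) = 1

x_vals = np.array([0, 1])
y_vals = np.array([0, 1, 2])

# (2) Marginals: f_X(x) = sum_y f(x, y) and f_Y(y) = sum_x f(x, y).
f_X = f.sum(axis=1)
f_Y = f.sum(axis=0)

# (3) Conditional distribution of Y given X = 1: f(1, y) / f_X(1).
f_Y_given_X1 = f[1] / f_X[1]
E_Y_given_X1 = (y_vals * f_Y_given_X1).sum()

# (4) Covariance and coefficient of correlation.
E_X = (x_vals * f_X).sum()
E_Y = (y_vals * f_Y).sum()
E_XY = (np.outer(x_vals, y_vals) * f).sum()
cov = E_XY - E_X * E_Y

var_X = ((x_vals - E_X) ** 2 * f_X).sum()
var_Y = ((y_vals - E_Y) ** 2 * f_Y).sum()
rho = cov / np.sqrt(var_X * var_Y)
```

Each marginal and each conditional pmf sums to 1, and rho lands in [-1, 1], matching the constraints stated in the formulas above.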