UCLA STATS 100A - July 23rd-25 Lecture partII (posted 722)

Expectation, Variance and Distribution of Sums of Many Random Variables
Juana Sanchez ([email protected])
UCLA Department of Statistics
July 22, 2013

Announcements

Homework 4 was due today at the beginning of lecture.
Quiz 3 is taken in the TA session, during the first 10 minutes. You must take the quiz in the session you are registered in.

Today

- Linear combinations of two random variables.
- Linear combinations of several random variables: expectation and variance; the independence case; the distribution of a sum of independent random variables.
- Sums of a large number of independent random variables: the Central Limit Theorem (Chapter 8, Section 8.3).

I. Linear Combination of Two Random Variables

Applications of probability often call for the use of linear combinations of several random variables.

- A study of downtimes of computer systems might require knowledge of the sum of the downtimes over a day or a week.
- The total cost of a building project can be studied as the sum of the costs of the major components of the project.
- The size of an animal population can be modeled as the sum of the sizes of the colonies within the population.

Expected Value and Variance of the Sum of Two Random Variables

If X and Y are two random variables, we saw when we studied bivariate distributions that

E(X + Y) = E(X) + E(Y)
Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)

When the two random variables are independent, the covariance is 0.

Also, for two random variables X_1 and X_2 and constants a and b, let U = aX_1 + bX_2. Then

(a) E(U) = a E(X_1) + b E(X_2)
(b) Var(U) = a^2 Var(X_1) + b^2 Var(X_2) + 2ab Cov(X_1, X_2)

Distribution of the Sum of Two Familiar Random Variables

We also saw, when studying moment generating functions, that for some distributions we can use the moment generating function to determine the distribution of the sum of two independent random variables.

- If X and Y are Poisson random variables with parameters λ_1 and λ_2, respectively, then X + Y is Poisson with parameter λ_1 + λ_2.
- If X is a Normal r.v. with parameters µ_1, σ_1 and Y is a Normal r.v. with parameters µ_2, σ_2, then X + Y is Normal with mean µ_1 + µ_2 and variance σ_1^2 + σ_2^2.
- If X is Binomial(n_1, p) and Y is Binomial(n_2, p), with the same p for X and Y, then X + Y is Binomial(n_1 + n_2, p).

Example

There is a relationship between the daily number of minutes spent texting (X) and the number of minutes spent working at a part-time job per day (Y). The correlation between X and Y is believed to be -0.4, and X and Y are believed to have a bivariate normal distribution with parameters µ_X = 50, µ_Y = 300, σ_X = 10, σ_Y = 30, ρ = -0.4. What is the expected total number of minutes spent on those tasks? What is the variance? What is the distribution?

E(X + Y) = 50 + 300 = 350
Var(X + Y) = 10^2 + 30^2 + 2(-0.4)(10)(30) = 760

Now, X and Y are not independent (their correlation is not 0). Therefore, the distribution of X + Y cannot be found by multiplying the moment generating functions and then matching the resulting mgf to one of the known ones.
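As a quick numerical check of the example, here is a minimal Python sketch (not from the slides; NumPy, the seed, and the 200,000-draw sample size are my own choices). It simulates the stated bivariate normal distribution and compares the sample mean and variance of X + Y with the 350 and 760 obtained from the formulas above.

```python
import numpy as np

# Parameters stated in the example (all in minutes per day)
mu_x, mu_y = 50.0, 300.0
sigma_x, sigma_y = 10.0, 30.0
rho = -0.4

# Covariance implied by the correlation: Cov(X, Y) = rho * sigma_x * sigma_y
cov_xy = rho * sigma_x * sigma_y          # -120.0

# Values from the formulas on the slide
mean_sum = mu_x + mu_y                              # 350
var_sum = sigma_x**2 + sigma_y**2 + 2 * cov_xy      # 100 + 900 - 240 = 760

# Monte Carlo check: draw (X, Y) pairs and look at X + Y
rng = np.random.default_rng(0)
cov_matrix = np.array([[sigma_x**2, cov_xy],
                       [cov_xy, sigma_y**2]])
samples = rng.multivariate_normal([mu_x, mu_y], cov_matrix, size=200_000)
totals = samples.sum(axis=1)

print(mean_sum, var_sum)              # 350.0 760.0
print(totals.mean(), totals.var())    # approximately 350 and 760
```

Note how the negative covariance term, 2(-0.4)(10)(30) = -240, is what pulls the variance of the total below 10^2 + 30^2 = 1000.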
II. Linear Combination of Many Random Variables

The results just stated for two random variables generalize to many random variables. Let X_1, X_2, ..., X_n be n random variables. Then

E(X_1 + X_2 + ... + X_n) = \sum_{i=1}^n E(X_i)
Var(X_1 + X_2 + ... + X_n) = \sum_{i=1}^n Var(X_i) + 2 \sum_{i<j} Cov(X_i, X_j)

If the random variables are independent, then their covariances are 0.

More generally, let X_1, X_2, ..., X_n be n random variables with E(X_i) = µ_i and Var(X_i) = σ_i^2, and let a_1, ..., a_n be constants. Define U = \sum_{i=1}^n a_i X_i. Then

(a) E(U) = \sum_{i=1}^n a_i µ_i
(b) Var(U) = \sum_{i=1}^n a_i^2 σ_i^2 + 2 \sum_{i<j} a_i a_j Cov(X_i, X_j)

where the double sum runs over all pairs (i, j) with i < j.

If the variables are independent and identically distributed (i.e., they have the same distribution with the same parameters), perhaps we can find the distribution of their sum with moment generating functions.
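To illustrate formulas (a) and (b), here is a short Python sketch of my own; the coefficients, means, and covariance matrix below are made-up numbers. It also shows that the double-sum formula for Var(U) is the quadratic form a^T Σ a written out term by term, where Σ is the covariance matrix of the X_i.

```python
import numpy as np

# Hypothetical inputs: coefficients a_i, means mu_i, and covariance matrix Sigma
a = np.array([1.0, 2.0, -0.5])
mu = np.array([10.0, 5.0, 8.0])
Sigma = np.array([[4.0, 1.0, 0.0],
                  [1.0, 9.0, -2.0],
                  [0.0, -2.0, 16.0]])   # Sigma[i, j] = Cov(X_i, X_j)

# (a) E(U) = sum_i a_i mu_i
mean_U = a @ mu

# (b) Var(U) = sum_i a_i^2 sigma_i^2 + 2 sum_{i<j} a_i a_j Cov(X_i, X_j),
#     which in matrix form is a^T Sigma a
var_U = a @ Sigma @ a

# Same variance via the explicit double sum
n = len(a)
var_explicit = sum(a[i]**2 * Sigma[i, i] for i in range(n))
var_explicit += 2 * sum(a[i] * a[j] * Sigma[i, j]
                        for i in range(n) for j in range(i + 1, n))

print(mean_U, var_U, var_explicit)   # the two variance computations agree
```

Both variance computations print the same number; the matrix form is just formula (b) regrouped.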

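Tying the last remark back to the Poisson fact quoted earlier, the following simulation sketch (again my own; λ = 2 and n = 5 are arbitrary choices) suggests that the sum of n independent Poisson(λ) variables behaves like a single Poisson(nλ) variable, which is what the moment-generating-function argument delivers.

```python
import math
import numpy as np

# Hypothetical parameters for illustration
lam, n = 2.0, 5               # n iid Poisson(lam) random variables
n_sims = 100_000

rng = np.random.default_rng(1)
# Each row holds n independent Poisson(lam) draws; sum each row
sums = rng.poisson(lam, size=(n_sims, n)).sum(axis=1)

# Compare the empirical pmf of the sum with the Poisson(n * lam) pmf
for k in range(5, 16):
    empirical = np.mean(sums == k)
    theoretical = math.exp(-n * lam) * (n * lam) ** k / math.factorial(k)
    print(k, round(float(empirical), 4), round(theoretical, 4))
```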
