UCLA STATS 100A - sums-of-rv
Expectation, Variance and Distribution of Sums of Many Random Variables
Juana Sanchez, UCLA Department of Statistics

Today:
- Linear combinations of two random variables.
- Linear combinations of several random variables: expectation and variance; the independence case.
- Distribution of the sum of independent random variables.

I. Linear Combination of Two Random Variables

Applications of probability often call for the use of linear combinations of several random variables:
- A study of downtimes of computer systems might require knowledge of the sum of the downtimes over a day or a week.
- The total cost of a building project can be studied as the sum of the costs for the major components of the project.
- The size of an animal population can be modeled as the sum of the sizes of the colonies within the population.

Expected value and variance of the sum of two random variables

If X and Y are two random variables, we see when we study bivariate distributions that

  E(X + Y) = E(X) + E(Y)
  Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)

where Cov(X, Y) = E[(X − µ_X)(Y − µ_Y)] = ρ σ_X σ_Y. When the two random variables are independent, the covariance is 0.

Also, for two random variables X_1 and X_2 and constants a, b, let U = a X_1 + b X_2. Then
(a) E(U) = a E(X_1) + b E(X_2)
(b) Var(U) = a^2 Var(X_1) + b^2 Var(X_2) + 2ab Cov(X_1, X_2)

Distribution of the sum of two familiar random variables

We also see, with the help of moment generating functions, that for some distributions we can determine the distribution of the sum of two independent random variables: the moment generating function of the sum of two independent random variables is the product of their moment generating functions.
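As an illustration (a simulation sketch, not part of the original notes; the rates λ1 = 2, λ2 = 3 and the sample size are arbitrary choices), the mgf-product fact for Poissons can be checked numerically: the sum of two independent Poisson draws should behave like a single Poisson with the summed rate, so its sample mean and sample variance should both be close to λ1 + λ2.

```python
import numpy as np

# Sketch: the sum of two independent Poisson variables behaves like a
# Poisson with the summed rate. lam1, lam2, n are illustrative choices.
rng = np.random.default_rng(0)
lam1, lam2, n = 2.0, 3.0, 200_000

s = rng.poisson(lam1, n) + rng.poisson(lam2, n)

# A Poisson(lam1 + lam2) variable has mean and variance both equal to 5.
print(s.mean(), s.var())  # both close to 5
```

For a sharper check one could compare the empirical frequencies of s against Poisson(5) probabilities; matching the first two moments is only a rough consistency check.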
(See the complementary notes on the moment generating function.)

- If X and Y are Poisson random variables with parameters λ_1 and λ_2, respectively, then X + Y is Poisson with parameter λ_1 + λ_2.
- If X is a Normal r.v. with parameters µ_1, σ_1 and Y is a Normal r.v. with parameters µ_2, σ_2, then X + Y is Normal with mean µ_1 + µ_2 and variance σ_1^2 + σ_2^2.
- If X is Binomial(n_1, p) and Y is Binomial(n_2, p), with the same p for X and Y, then X + Y is Binomial(n_1 + n_2, p).

Example

There is a relationship between the daily number of minutes spent texting (X) and the number of minutes spent working at a part-time job per day (Y). The correlation between X and Y is believed to be −0.4. X and Y are believed to have a bivariate normal distribution with parameters µ_X = 50, µ_Y = 300, σ_X = 10, σ_Y = 30, ρ = −0.4. What is the expected total number of minutes spent doing those tasks? What is the variance? What is the distribution?

  E(X + Y) = 50 + 300 = 350
  Var(X + Y) = 10^2 + 30^2 + 2(−0.4)(10)(30) = 760

Now, X and Y are not independent (their correlation is not 0). Therefore, the distribution of X + Y cannot be found by multiplying the moment generating functions and matching the resulting mgf to one of the known ones. (Because (X, Y) is bivariate normal, X + Y is in fact Normal with mean 350 and variance 760; this follows from properties of the bivariate normal, not from the mgf-product argument.)

II. Linear Combination of Many Random Variables

The results just mentioned for two random variables generalize to many random variables. Let X_1, X_2, ..., X_n be n random variables. Then

  E(X_1 + X_2 + ... + X_n) = Σ_{i=1}^n E(X_i)
  Var(X_1 + X_2 + ... + X_n) = Σ_{i=1}^n Var(X_i) + 2 ΣΣ_{i<j} Cov(X_i, X_j).

If the random variables are independent, then their covariances are 0.

More generally, let X_1, X_2, ..., X_n be n random variables with E(X_i) = µ_i and Var(X_i) = σ_i^2, and let a_i, i = 1, ..., n, be constants. Let U = Σ_{i=1}^n a_i X_i. Then
(a) E(U) = Σ_{i=1}^n a_i µ_i
(b) V(U) = Σ_{i=1}^n a_i^2 σ_i^2 + 2 ΣΣ_{i<j} a_i a_j Cov(X_i, X_j),
where the double sum is over all pairs (i, j) with i < j.
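The variance formula for U = Σ a_i X_i is the quadratic form aᵀΣa, where Σ is the covariance matrix of the X_i. As a small numerical check (a sketch, not in the original notes), applying it with a = (1, 1) to the texting/part-time-job numbers reproduces Var(X + Y) = 760:

```python
import numpy as np

# Sketch: Var(sum a_i X_i) equals the quadratic form a' Sigma a,
# where Sigma is the covariance matrix of the X_i.
# Numbers come from the texting/part-time-job example above.
sx, sy, rho = 10.0, 30.0, -0.4
cov_xy = rho * sx * sy                   # Cov(X, Y) = rho*sx*sy = -120
Sigma = np.array([[sx**2, cov_xy],
                  [cov_xy, sy**2]])      # covariance matrix of (X, Y)
a = np.array([1.0, 1.0])                 # U = X + Y

var_u = a @ Sigma @ a
print(var_u)  # 760.0
```

The quadratic-form view is convenient because the same line of code handles any choice of weights a and any number of variables.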
If the variables are independent and identically distributed (i.e., same distribution with the same parameters), perhaps we can find the distribution of their sum with moment generating functions, if they are among the ones we have studied (for example, sums of independent Normals are Normal, and sums of independent Poissons are Poisson).

Example

Let X ~ U(a, b).
(a) Show that E(X) = (b + a)/2.
(b) Let X_1, X_2, ..., X_n be identically distributed random variables, each X_i ~ U(a, b), i = 1, ..., n. Is the sum of uniforms also a uniform random variable? Show using moment generating functions.

Solution:

(a) E(X) = ∫_a^b x · 1/(b − a) dx = (b + a)/2

(b) M_X(t) = (e^{tb} − e^{ta}) / (t(b − a)), so

  M_{X_1 + X_2 + ... + X_n}(t) = [ (e^{tb} − e^{ta}) / (t(b − a)) ]^n.

This is not the mgf of a uniform random variable.

Example

Suppose that each person who logs onto an online retailer's website on Memorial Day weekend buys a random amount that is exponentially distributed with parameter λ = 5. If 1000 persons log onto this website on Memorial Day, what are the expected value and the standard deviation of the total amount bought by all 1000 persons?

We can assume that people logging onto an online retailer act independently. So let X_1, X_2, ..., X_1000 represent the amounts bought by each person. These are random variables with E(X_i) = 1/λ and Var(X_i) = 1/λ^2. Then

  E(X_1 + X_2 + ... + X_n) = E(X_1) + E(X_2) + ... + E(X_n) = 1000(1/5) = 200
  Var(X_1 + X_2 + ... + X_n) = Var(X_1) + Var(X_2) + ... + Var(X_n) = 1000(1/5^2) = 40
  σ_{X_1 + X_2 + ... + X_n} = √Var(X_1 + X_2 + ... + X_n) = √40 ≈ 6.3246

We can try to find the distribution of the sum.
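Before doing so, a simulation sketch (not in the original notes; the number of simulated weekends is an arbitrary choice) can confirm the mean and standard deviation just computed: each simulated "weekend" draws 1000 independent Exponential(λ = 5) purchase amounts, and the totals should average about 200 with standard deviation about √40 ≈ 6.32.

```python
import numpy as np

# Sketch: simulate many "weekends", each with 1000 independent
# Exponential(lambda = 5) purchase amounts, and examine the totals.
# NumPy parameterizes the exponential by scale = 1/lambda.
rng = np.random.default_rng(1)
lam, n, reps = 5.0, 1000, 5_000

totals = rng.exponential(scale=1 / lam, size=(reps, n)).sum(axis=1)

print(totals.mean())  # close to n/lam = 200
print(totals.std())   # close to sqrt(n/lam**2) = sqrt(40), about 6.32
```

Note the scale parameterization: passing λ instead of 1/λ to `rng.exponential` is a common slip that inflates every answer by a factor of λ².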
An exponential random variable has mgf

  M_X(t) = λ / (λ − t).

Using the property of the mgf of a sum of independent random variables, M_{X_1 + X_2 + ... + X_n} is

  M_{X_1 + X_2 + ... + X_n}(t) = M_{X_1}(t) M_{X_2}(t) ··· M_{X_n}(t) = [λ / (λ − t)]^n.

Looking at the tables of distributions, what distribution has this moment generating function? Gamma(n, λ).

Example

Let X be the time that Alice waits for a traffic light to turn green, let Y be the time (at a different intersection) that Antonio waits for a traffic light to turn green, and let W be the time that Jong Ho waits at a different intersection for the traffic light to turn green. All times are in minutes. Suppose that X, Y and W have joint density

  f(x, y, w) = 15 e^{−3x − 5y − w},  x > 0, y > 0, w > 0.

Let T = X + Y + W represent the total time

