UCLA STAT 100A - Moment Generating Function

Moment Generating Function
Statistics 110, Summer 2006
Copyright © 2006 by Mark E. Irwin

Moments Revisited

So far I've really only talked about the first two moments. Let's define what is meant by moments more precisely.

Definition. The r-th moment of a random variable X is E[X^r], assuming that the expectation exists.

So the mean of a distribution is its first moment.

Definition. The r-th central moment of a random variable X is E[(X − E[X])^r], assuming that the expectation exists.

Thus the variance is the 2nd central moment of a distribution.

The 1st central moment usually isn't discussed, as it's always 0.

The 3rd central moment is known as the skewness of a distribution and is used as a measure of asymmetry.

If a distribution is symmetric about its mean (f(μ − x) = f(μ + x)), the skewness will be 0. Similarly, if the skewness is non-zero, the distribution is asymmetric. However, it is possible to have an asymmetric distribution with skewness = 0.

Examples of symmetric distributions are normals, Beta(a, a), and Bin(n, p = 0.5). Examples of asymmetric distributions are

    Distribution    Skewness
    Bin(n, p)       np(1 − p)(1 − 2p)
    Pois(λ)         λ
    Exp(λ)          2/λ^3
    Beta(a, b)      Ugly formula

The 4th central moment is known as the kurtosis. It can be used as a measure of how heavy the tails are for a distribution. The kurtosis for a normal is 3σ^4.

Note that these measures are often standardized, as in their raw form they depend on the standard deviation.

Theorem. If the r-th moment of a RV exists, then the s-th moment exists for all s < r. Also the s-th central moment exists for all s ≤ r.

So you can't have a distribution that has a finite mean, an infinite variance, and a finite skewness.

Proof. Postponed till later. □

Why are moments useful?
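The moment definitions above are easy to sanity-check numerically. A minimal sketch (standard library only; λ = 2 is an arbitrary choice) that integrates x^r f(x) for the exponential by the trapezoidal rule and recovers the mean 1/λ, the variance 1/λ^2, and the third central moment 2/λ^3:

```python
import math

# Sanity check of the central moments of Exp(lam): mean 1/lam,
# variance 1/lam^2, third central moment 2/lam^3. lam = 2.0 is arbitrary.

def exp_pdf(x, lam):
    return lam * math.exp(-lam * x)

def moment(r, lam, upper=30.0, steps=200_000):
    """E[X^r] for X ~ Exp(lam), by the trapezoidal rule on [0, upper]."""
    h = upper / steps
    total = 0.0
    for i in range(steps + 1):
        x = i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * (x ** r) * exp_pdf(x, lam)
    return total * h

lam = 2.0
m1 = moment(1, lam)                     # raw 1st moment, should be near 1/lam
m2 = moment(2, lam)                     # raw 2nd moment, should be near 2/lam^2
m3 = moment(3, lam)                     # raw 3rd moment, should be near 6/lam^3
var = m2 - m1 ** 2                      # central 2nd moment, near 1/lam^2
skew = m3 - 3 * m1 * m2 + 2 * m1 ** 3  # central 3rd moment, near 2/lam^3

print(m1, var, skew)
```

The last line uses exactly the expansion of E[(X − μ)^3] in terms of raw moments derived in the skewness example below.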
They can be involved in calculating means and variances of transformed RVs or other summaries of RVs.

Example: What are the mean and variance of A = πR^2?

    E[A] = π E[R^2]
    Var(A) = π^2 Var(R^2) = π^2 (E[R^4] − (E[R^2])^2)

So we need E[R^4] in addition to E[R] and E[R^2].

Example: What is the skewness of X?

    E[(X − μ)^3] = E[X^3 − 3μX^2 + 3μ^2 X − μ^3] = E[X^3] − 3μ E[X^2] + 2μ^3

so E[X], E[X^2], and E[X^3] are needed to calculate the skewness.

Moment Generating Function

Definition. The Moment Generating Function (MGF) of a random variable X is M_X(t) = E[e^{tX}], if the expectation is defined.

    M_X(t) = Σ_x e^{tx} p_X(x)              (Discrete)
    M_X(t) = ∫_X e^{tx} f_X(x) dx           (Continuous)

Whether the MGF is defined depends on the distribution and the choice of t. For example, M_X(t) is defined for all t if X is normal, for no t ≠ 0 if X is Cauchy, and for t < λ if X ~ Exp(λ).

For those that have done some analysis: in the continuous case, the moment generating function is related to the Laplace transform of the density function. Many of the results about it come from that theory.

Why should we care about the MGF?

• To calculate moments. It may be easier to work with the MGF than to directly calculate E[X^r].
• To determine distributions of functions of random variables.
• Related to this, approximating distributions. For example, it can be used to show that as n increases, the Bin(n, p) "approaches" a normal distribution.

The following theorems justify these uses of the MGF.

Theorem. If M_X(t) of a RV X is finite in an open interval containing 0, then it has derivatives of all orders and

    M_X^(r)(t) = E[X^r e^{tX}]
    M_X^(r)(0) = E[X^r]

Proof.

    M_X^(1)(t) = d/dt ∫_{−∞}^{∞} e^{tx} f_X(x) dx
               = ∫_{−∞}^{∞} (d/dt e^{tx}) f_X(x) dx
               = ∫_{−∞}^{∞} x e^{tx} f_X(x) dx
               = E[X e^{tX}]

    M_X^(2)(t) = d/dt M_X^(1)(t)
               = ∫_{−∞}^{∞} x (d/dt e^{tx}) f_X(x) dx
               = ∫_{−∞}^{∞} x^2 e^{tx} f_X(x) dx = E[X^2 e^{tX}]

The rest can be shown by induction. The second part of the theorem follows from e^0 = 1. □

Another way to see this result is via the Taylor series expansion of

    e^y = 1 + y + y^2/2! + y^3/3! + . . . ,
which gives

    M_X(t) = E[1 + Xt + X^2 t^2/2! + X^3 t^3/3! + . . .]
           = 1 + E[X] t + E[X^2] t^2/2! + E[X^3] t^3/3! + . . .

Example MGFs:

• X ~ U(a, b)

    M_X(t) = ∫_a^b e^{tx}/(b − a) dx = (e^{bt} − e^{at}) / ((b − a)t)

• X ~ Exp(λ)

    M_X(t) = ∫_0^∞ e^{tx} λ e^{−λx} dx = ∫_0^∞ λ e^{−(λ−t)x} dx = λ/(λ − t)

  Note that this integral is only defined when t < λ.

• X ~ Geo(p), (q = 1 − p)

    M_X(t) = Σ_{x=1}^∞ e^{tx} p q^{x−1} = p e^t Σ_{x=1}^∞ (e^t)^{x−1} q^{x−1} = p e^t / (1 − q e^t)

• X ~ Pois(λ)

    M_X(t) = Σ_{x=0}^∞ e^{tx} e^{−λ} λ^x / x! = e^{−λ} Σ_{x=0}^∞ (e^t λ)^x / x! = e^{λ(e^t − 1)}

Examples of using the MGF to calculate moments:

• X ~ Exp(λ)

    M^(1)(t) = λ/(λ − t)^2 ;                   E[X] = 1/λ
    M^(2)(t) = 2λ/(λ − t)^3 ;                  E[X^2] = 2/λ^2
    M^(r)(t) = Γ(r + 1) λ/(λ − t)^{r+1} ;      E[X^r] = Γ(r + 1)/λ^r

• X ~ Geo(p), (q = 1 − p)

    M^(1)(t) = p e^t/(1 − q e^t) + p q e^{2t}/(1 − q e^t)^2 ;   E[X] = 1/p
    M^(2)(t) = p e^t/(1 − q e^t) + 3 p q e^{2t}/(1 − q e^t)^2 + 2 p q^2 e^{3t}/(1 − q e^t)^3 ;   E[X^2] = (2 − p)/p^2

Theorem. If Y = a + bX, then

    M_Y(t) = e^{at} M_X(bt)

Proof.

    M_Y(t) = E[e^{tY}] = E[e^{at + btX}] = e^{at} E[e^{(bt)X}] = e^{at} M_X(bt)   □

For example, this result can be used to verify that E[a + bX] = a + bE[X], as

    M_Y^(1)(t) = a e^{at} M_X(bt) + b e^{at} M_X^(1)(bt)
    M_Y^(1)(0) = a M_X(0) + b M_X^(1)(0) = a + b E[X]

Theorem. If X and Y are independent RVs with MGFs M_X and M_Y, and Z = X + Y, then M_Z(t) = M_X(t) M_Y(t) on the common interval where both MGFs exist.

Proof.

    M_Z(t) = E[e^{tZ}] = E[e^{t(X+Y)}] = E[e^{tX} e^{tY}]
           = E[e^{tX}] E[e^{tY}] = M_X(t) M_Y(t)   □

By induction, this result can be extended to sums of many independent RVs.

One particular use of this result is that it gives an easy approach to showing what the distribution of a sum of RVs is without having to calculate the convolution of the densities. But first we need one more result.

Theorem. [Uniqueness theorem] If the MGF of X exists for t in an open interval containing 0, then it uniquely determines the CDF; i.e., no two different distributions can have the same values for the MGFs on an interval containing 0.

Proof. Postponed. □

Example: Let X_1, X_2, . . . , X_n be iid Exp(λ).
What is the distribution of S = Σ X_i?

    M_S(t) = Π_{i=1}^n λ/(λ − t) = (λ/(λ − t))^n

Note that this isn't the form of the MGF for an exponential, so the sum isn't exponential. As shown in Example B on page 145, the MGF of a Gamma(α, λ) is

    M(t) = (λ/(λ − t))^α

so S ~ Gamma(n, λ).

This approach also leads to an easy proof that the sum of independent normals is also normal. The moment generating function for a N(μ, σ^2) RV is M(t) = e^{μt + σ^2 t^2/2}. So if the X_i are independent with X_i ~ N(μ_i, σ_i^2), i = 1, . . . , n, then

    M_{ΣX_i}(t) = Π_{i=1}^n e^{μ_i t + σ_i^2 t^2/2} = exp(t Σμ_i + (t^2/2) Σσ_i^2)

which is the moment generating function of a N(Σμ_i, Σσ_i^2) random variable.
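The MGF argument for the sum of exponentials can be sanity-checked by simulation. A sketch (standard library only; n = 5, λ = 2, t = 0.5 are arbitrary choices with t < λ so the MGF exists) comparing a Monte Carlo estimate of E[e^{tS}] with the Gamma(n, λ) MGF:

```python
import math
import random

# Monte Carlo check: for S = X_1 + ... + X_n with X_i iid Exp(lam),
# estimate M_S(t) = E[e^{tS}] by simulation and compare it with
# (lam / (lam - t))^n, the Gamma(n, lam) MGF.

random.seed(0)
n, lam, t = 5, 2.0, 0.5   # arbitrary, with t < lam
trials = 200_000

# random.expovariate takes the rate lam, matching the Exp(lam) parameterization.
est = sum(
    math.exp(t * sum(random.expovariate(lam) for _ in range(n)))
    for _ in range(trials)
) / trials

exact = (lam / (lam - t)) ** n
print(est, exact)
```

The two printed numbers should agree to a couple of decimal places; by the uniqueness theorem, agreement of the MGFs on an interval around 0 is what identifies S as Gamma(n, λ).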