NCSSM Statistics Leadership Institute Notes: The Theory of Inference

Moment Generating Functions and Their Properties

The $i$th moment of a random variable $Y$ is defined to be $\mu_i' = E(Y^i)$. So the expected value, or mean, of $Y$, $E(Y)$, is the first moment $E(Y^1)$. The expected value of $Y^2$, $E(Y^2)$, which can be used to find the variance of $Y$, is the second moment.

The moment generating function $m(t)$ for a random variable $Y$ is defined to be $E(e^{tY})$, where $t$ lies in a small neighborhood of zero. So

$$m(t) = E(e^{tY}) = E\left(1 + tY + \frac{(tY)^2}{2!} + \frac{(tY)^3}{3!} + \cdots\right),$$

since $1 + tY + \frac{(tY)^2}{2!} + \frac{(tY)^3}{3!} + \cdots$ is the series expansion for $e^{tY}$. Also,

$$m'(t) = E\left(Y + \frac{2tY^2}{2!} + \frac{3t^2Y^3}{3!} + \cdots\right).$$

Setting $t = 0$, we have $m'(0) = E(Y + 0 + 0 + \cdots) = E(Y)$. So, the first derivative of the moment generating function evaluated at $t = 0$ is the expected value of $Y$. That is, $m'(0) = \mu = E(Y)$. Thus, if you have the moment generating function for a random variable, you can find $\mu$ by taking the first derivative and evaluating it at zero.

Now look at the second derivative:

$$m''(t) = E\left(Y^2 + \frac{3 \cdot 2\, t\, Y^3}{3!} + \cdots\right), \quad\text{so}\quad m''(0) = E(Y^2).$$

Since $V(Y) = E(Y^2) - [E(Y)]^2$, we can use the first and second derivatives of the moment generating function to find the variance of a random variable. Before proceeding we will verify that $V(Y) = E(Y^2) - [E(Y)]^2$:

$$V(Y) = E(Y - \mu)^2 = E(Y^2 - 2\mu Y + \mu^2)$$
$$= E(Y^2) - 2\mu E(Y) + E(\mu^2)$$
$$= E(Y^2) - 2\mu \cdot \mu + \mu^2$$
$$= E(Y^2) - 2\mu^2 + \mu^2$$
$$= E(Y^2) - \mu^2 = E(Y^2) - [E(Y)]^2.$$

The argument given above essentially repeats the calculation for $\mathrm{Cov}(Y_i, Y_j)$ on page 7. Covariance is more general, since $\mathrm{Cov}(Y, Y) = \mathrm{Var}(Y)$.

Note that if $Y$ is a discrete random variable,

$$m_Y(t) = E(e^{tY}) = \sum_y e^{ty} p(y) = \sum_y \left(1 + ty + \frac{(ty)^2}{2!} + \cdots\right) p(y)$$
$$= \sum_y p(y) + t \sum_y y\, p(y) + \frac{t^2}{2!} \sum_y y^2 p(y) + \cdots \quad \text{(since $t$ is constant in each sum)}$$
$$= 1 + t\,E(Y) + \frac{t^2}{2!}\,E(Y^2) + \cdots = 1 + t\,\mu_1' + \frac{t^2}{2!}\,\mu_2' + \cdots,$$

as we would expect, since this is a moment generating function.
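The mechanics above can be checked numerically. As an illustration (not part of the original notes), the sketch below uses sympy to build the MGF of a fair six-sided die and read off the first two moments by differentiating at $t = 0$:

```python
# Numerical check (illustration, not from the notes) that m'(0) = E(Y)
# and m''(0) = E(Y^2), using a fair six-sided die as the random variable.
import sympy as sp

t = sp.symbols('t')
outcomes = [1, 2, 3, 4, 5, 6]
prob = sp.Rational(1, 6)                 # each face has probability 1/6

# m(t) = E(e^{tY}) = sum over y of e^{ty} p(y)
m = sum(sp.exp(t * y) * prob for y in outcomes)

first_moment = sp.diff(m, t, 1).subs(t, 0)    # E(Y)   = 7/2
second_moment = sp.diff(m, t, 2).subs(t, 0)   # E(Y^2) = 91/6

print(first_moment)                        # 7/2
print(second_moment)                       # 91/6
print(second_moment - first_moment**2)     # variance = 35/12
```

The same pattern works for any finite discrete distribution: swap in the outcomes and their probabilities, and the derivatives at zero deliver the moments.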
Example 1

Suppose $Y$ is a Poisson random variable with parameter $\lambda$. Then $p(y) = \dfrac{\lambda^y e^{-\lambda}}{y!}$ for $y = 0, 1, 2, \ldots$, where $\lambda$ represents the rate at which something happens. Find the moment generating function and use it to find the mean and variance of $Y$.

Solution:
First find

$$m_Y(t) = E(e^{tY}) = \sum_{y=0}^{\infty} e^{ty}\,\frac{\lambda^y e^{-\lambda}}{y!} = e^{-\lambda} \sum_{y=0}^{\infty} \frac{(\lambda e^t)^y}{y!}.$$

Since $\sum_{y=0}^{\infty} \frac{(\lambda e^t)^y}{y!}$ is the power series for $e^{\lambda e^t}$,

$$m_Y(t) = e^{-\lambda} e^{\lambda e^t} = e^{\lambda e^t - \lambda}. \qquad \therefore\ m_Y(t) = e^{\lambda(e^t - 1)}.$$

Find derivatives of $m_Y(t)$ to use for computing $\mu$ and $\sigma^2$:

$$m'(t) = e^{\lambda(e^t - 1)} \cdot \lambda e^t$$
$$m''(t) = e^{\lambda(e^t - 1)}\,\lambda e^t + \lambda e^t\, e^{\lambda(e^t - 1)}\, \lambda e^t$$
$$E(Y) = \mu = m'(0) = e^0 \cdot \lambda e^0 = \lambda$$
$$E(Y^2) = m''(0) = \lambda + \lambda^2$$
$$V(Y) = \sigma^2 = E(Y^2) - [E(Y)]^2 = (\lambda^2 + \lambda) - \lambda^2 = \lambda.$$

The Poisson distribution has mean and variance of $\lambda$.

Example 2

Suppose $Y$ has a geometric distribution with parameter $p$. Show that the moment generating function for $Y$ is $m(t) = \dfrac{p e^t}{1 - q e^t}$, where $q = 1 - p$.

Solution:
Here $p(y) = q^{y-1} p$ for $y = 1, 2, \ldots$, so

$$m_Y(t) = E(e^{tY}) = \sum_{y=1}^{\infty} e^{ty}\, q^{y-1} p = \frac{p}{q} \sum_{y=1}^{\infty} (q e^t)^y.$$

Note: this is a geometric series with common ratio $q e^t$, and $0 < q e^t < 1$ if $t < \ln\frac{1}{q}$. Hence

$$m_Y(t) = \frac{p}{q} \cdot \frac{q e^t}{1 - q e^t} = \frac{p e^t}{1 - q e^t}.$$

Now that we know the moment generating function, it is a simple matter to find the mean and variance of the geometric distribution.

For the mean we have

$$m'(t) = \frac{(1 - q e^t)\, p e^t - p e^t (-q e^t)}{(1 - q e^t)^2} = \frac{p e^t}{(1 - q e^t)^2},$$

so

$$m'(0) = \frac{p e^0}{(1 - q e^0)^2} = \frac{p}{p^2} = \frac{1}{p}.$$

For the variance we have

$$m''(t) = \frac{(1 - q e^t)^2\, p e^t - p e^t \cdot 2 (1 - q e^t)(-q e^t)}{(1 - q e^t)^4} = \frac{p e^t (1 + q e^t)}{(1 - q e^t)^3}$$

and

$$m''(0) = \frac{p e^0 (1 + q e^0)}{(1 - q e^0)^3} = \frac{p(1 + q)}{p^3} = \frac{1 + q}{p^2},$$

so

$$\mathrm{Var}(Y) = m''(0) - [m'(0)]^2 = \frac{1 + q}{p^2} - \frac{1}{p^2} = \frac{q}{p^2}.$$
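Both examples can be verified symbolically. The sketch below (not part of the original notes; the helper `mean_and_var` is my own) differentiates each MGF at $t = 0$ and recovers the means and variances derived above:

```python
# Symbolic check (illustration, not from the notes) of Examples 1 and 2:
# differentiate each MGF at t = 0 to recover the mean and variance.
import sympy as sp

t = sp.symbols('t')
lam, p = sp.symbols('lam p', positive=True)
q = 1 - p

def mean_and_var(m):
    """Mean m'(0) and variance m''(0) - m'(0)^2 from an MGF m(t)."""
    m1 = sp.diff(m, t, 1).subs(t, 0)
    m2 = sp.diff(m, t, 2).subs(t, 0)
    return sp.simplify(m1), sp.simplify(m2 - m1**2)

# Poisson: m(t) = e^{lam(e^t - 1)}  ->  mean lam, variance lam
poisson_mgf = sp.exp(lam * (sp.exp(t) - 1))
print(mean_and_var(poisson_mgf))

# Geometric (support 1, 2, ...): m(t) = p e^t / (1 - q e^t)
#   ->  mean 1/p, variance q/p^2
geometric_mgf = p * sp.exp(t) / (1 - q * sp.exp(t))
print(mean_and_var(geometric_mgf))
```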
Example 3

Find the moment generating function for a random variable with a standard normal distribution. That is, find the moment generating function for $Z \sim N(0, 1)$. Note that $Z \sim N(0, 1)$ is read: $Z$ is distributed as a normal random variable with $\mu = 0$ and $\sigma^2 = 1$.

Solution:
$$m_Z(t) = E(e^{tZ}) = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{tz}\, e^{-z^2/2}\, dz = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2 - 2tz + t^2 - t^2}{2}}\, dz$$
$$= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{t^2/2}\, e^{-(z - t)^2/2}\, dz = e^{t^2/2} \int_{-\infty}^{\infty} \left(\text{normal density},\ \mu = t,\ \sigma^2 = 1\right) dz = e^{t^2/2} \cdot 1 = e^{t^2/2}.$$

It is straightforward to verify that the mean and variance are 0 and 1, respectively.

Example 4

Show that the moment generating function for a random variable $Y \sim N(\mu, \sigma^2)$ is $m_Y(t) = e^{\mu t + \frac{\sigma^2 t^2}{2}}$. Use the moment generating function to show that $E(Y) = \mu$.

Solution:
Following the outline of Example 3, we have

$$m_Y(t) = E(e^{tY}) = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{ty}\, e^{-\frac{(y - \mu)^2}{2\sigma^2}}\, dy = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(y - \mu)^2 - 2\sigma^2 t y}{2\sigma^2}}\, dy.$$

Consider the exponent:

$$-\frac{1}{2\sigma^2}\left[(y - \mu)^2 - 2\sigma^2 t y\right] = -\frac{1}{2\sigma^2}\left[\left(y - (\mu + \sigma^2 t)\right)^2 - 2\mu\sigma^2 t - \sigma^4 t^2\right].$$

So

$$m_Y(t) = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{\left(y - (\mu + \sigma^2 t)\right)^2}{2\sigma^2}}\, e^{\mu t + \frac{\sigma^2 t^2}{2}}\, dy = e^{\mu t + \frac{\sigma^2 t^2}{2}} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{\left(y - (\mu + \sigma^2 t)\right)^2}{2\sigma^2}}\, dy$$

(the integrand is a normal density function with mean $\mu + \sigma^2 t$ and variance $\sigma^2$)

$$= e^{\mu t + \frac{\sigma^2 t^2}{2}} \cdot 1 = e^{\mu t + \frac{\sigma^2 t^2}{2}}.$$

Now to find the expected value of $Y$:

$$m'(t) = \left(\mu + \sigma^2 t\right) e^{\mu t + \frac{\sigma^2 t^2}{2}}, \qquad m'(0) = (\mu + 0)\, e^0 = \mu.$$

The variance of $Y$ is found by computing

$$m''(t) = \left(\mu + \sigma^2 t\right)^2 e^{\mu t + \frac{\sigma^2 t^2}{2}} + \sigma^2 e^{\mu t + \frac{\sigma^2 t^2}{2}},$$
$$m''(0) = (\mu + 0)^2 e^0 + \sigma^2 e^0 = \mu^2 + \sigma^2.$$

Then $\mathrm{Var}(Y) = \mu^2 + \sigma^2 - \mu^2 = \sigma^2$.

A more direct derivation can also be given:

$$m_Y(t) = E(e^{tY}) = E\left(e^{t(\mu + \sigma Z)}\right) = E\left(e^{t\mu}\, e^{t\sigma Z}\right) = e^{t\mu}\, E\left(e^{(t\sigma) Z}\right) = e^{t\mu}\, m_Z(t\sigma) = e^{t\mu}\, e^{\frac{(t\sigma)^2}{2}} = e^{\mu t + \frac{\sigma^2 t^2}{2}}.$$

Example 5

Find the moment generating function for a random variable with a gamma distribution.

Solution:
$$m_Y(t) = E(e^{tY}) = \int_0^{\infty} \frac{1}{\beta^{\alpha}\, \Gamma(\alpha)}\, y^{\alpha - 1} e^{-y/\beta}\, e^{ty}\, dy = \frac{1}{\beta^{\alpha}\, \Gamma(\alpha)} \int_0^{\infty} y^{\alpha - 1} e^{-y\left(\frac{1}{\beta} - t\right)}\, dy$$

(recall that $\beta^{\alpha}$ and $\Gamma(\alpha)$ are constants). Rewrite the expression

$$\frac{1}{\beta} - t = \frac{1 - \beta t}{\beta} = \frac{1}{\beta / (1 - \beta t)},$$

so that

$$m_Y(t) = \frac{1}{\beta^{\alpha}\, \Gamma(\alpha)} \int_0^{\infty} y^{\alpha - 1} e^{-y \big/ \frac{\beta}{1 - \beta t}}\, dy.$$

Since we have previously shown that $\int_0^{\infty} y^{\alpha - 1} e^{-y/\beta}\, dy = \beta^{\alpha}\, \Gamma(\alpha)$,

$$m_Y(t) = \frac{1}{\beta^{\alpha}\, \Gamma(\alpha)} \left(\frac{\beta}{1 - \beta t}\right)^{\alpha} \Gamma(\alpha) = \frac{1}{(1 - \beta t)^{\alpha}} = \left(\frac{1}{1 - \beta t}\right)^{\alpha}, \quad \text{if } t < \frac{1}{\beta}.$$

Recall that the chi-square distribution is a special case of the gamma distribution with $\beta = 2$ and $\alpha = \nu/2$.
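Examples 3 through 5 can also be checked symbolically. As a sketch (not part of the original notes), sympy can evaluate $E(e^{tZ})$ for the standard normal by direct integration, and can differentiate the gamma MGF to recover the well-known gamma mean $\alpha\beta$ and variance $\alpha\beta^2$:

```python
# Symbolic check (illustration, not from the notes) of Examples 3 and 5.
import sympy as sp

t, z = sp.symbols('t z', real=True)
a, b = sp.symbols('a b', positive=True)   # a = alpha, b = beta

# Example 3: m_Z(t) = E(e^{tZ}) for Z ~ N(0, 1), by direct integration;
# analytically this equals exp(t**2 / 2).
std_normal_pdf = sp.exp(-z**2 / 2) / sp.sqrt(2 * sp.pi)
m_Z = sp.simplify(sp.integrate(sp.exp(t * z) * std_normal_pdf,
                               (z, -sp.oo, sp.oo)))
print(m_Z)

# Example 5: gamma MGF (1 - b t)^(-a); its derivatives at t = 0 give
# mean a*b and variance a*b**2.
gamma_mgf = (1 - b * t) ** (-a)
mean = sp.simplify(sp.diff(gamma_mgf, t, 1).subs(t, 0))
var = sp.simplify(sp.diff(gamma_mgf, t, 2).subs(t, 0) - mean**2)
print(mean, var)
```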
So it follows from Example 5 that if $Y \sim \chi^2(\nu)$, then

$$m_Y(t) = \left(\frac{1}{1 - 2t}\right)^{\nu/2}.$$

From this moment generating function we can find
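Presumably the mean and variance of the chi-square distribution were to follow. As a sketch (not part of the original notes), differentiating the chi-square MGF at zero gives the standard results $E(Y) = \nu$ and $\mathrm{Var}(Y) = 2\nu$:

```python
# Sketch (not from the notes): mean and variance of chi-square(nu)
# from its MGF m(t) = (1 - 2t)^(-nu/2).
import sympy as sp

t = sp.symbols('t')
nu = sp.symbols('nu', positive=True)

m = (1 - 2 * t) ** (-nu / 2)

mean = sp.diff(m, t, 1).subs(t, 0)                        # expect nu
var = sp.simplify(sp.diff(m, t, 2).subs(t, 0) - mean**2)  # expect 2*nu
print(mean, var)
```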