UCLA STATS 100A - Lesson-mgf-student
14.11.25
Stat 100A / Sanchez - Introduction to Probability

The Moment Generating Function. Review and properties that help us find the distributions of linear combinations of independent and identically distributed random variables. Mgf of the normal random variable.

Outline

Recommended supplementary reading: Ross, Section 7.7, Example 7i.

• Moment generating functions for discrete and continuous random variables (review).
• Properties of a moment generating function.
• Using the moment generating function to identify the distribution of a sum of iid random variables and of a linear combination of a random variable.
• The mgf of the normal distribution.

Introduction

A very important function of a random variable X is e^{tX}. The expectation of this function of X, E(e^{tX}), is called the moment generating function (mgf) of X. The moment generating function is unique for each random variable. It helps us generate expectations of other nonlinear functions of X, such as E(X^3), E(X^4), etc.

1. Moment generating function of a discrete random variable

The moment generating function M(t) of a discrete random variable X is defined for all real values of t by

    M(t) = E(e^{tX}) = \sum_x e^{tx} p(x),

where p(x) is the probability mass function of X. Remember the definition of the expected value of a function g(X) of a r.v.:

    E(g(X)) = \sum_i g(x_i) p(x_i).

We call M(t) the moment generating function because all the moments of X can be obtained by successively differentiating M(t) and then evaluating the result at t = 0. Define the moments of X: E(X) is the 1st moment, E(X^2) the 2nd moment, E(X^3) the 3rd moment, and so on.

Example: the first moment of a r.v. X:

    M'(t) = \frac{d}{dt} E(e^{tX}) = E\left(\frac{d}{dt} e^{tX}\right) = E(X e^{tX}), \qquad M'(0) = E(X).
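The derivative-at-zero recipe above is easy to sanity-check numerically. A minimal sketch, assuming Python with only the standard library; the fair six-sided die pmf is an illustrative choice, not part of the lesson:

```python
import math

# Illustrative discrete r.v.: a fair six-sided die, p(x) = 1/6 for x in 1..6.
pmf = {x: 1 / 6 for x in range(1, 7)}

def M(t):
    """M(t) = E(e^{tX}) = sum over x of e^{tx} * p(x)."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

# Approximate M'(0) with a central difference, then compare to E(X) directly.
h = 1e-6
m1 = (M(h) - M(-h)) / (2 * h)
ex = sum(x * p for x, p in pmf.items())

print(round(m1, 6), round(ex, 6))  # → 3.5 3.5
```

Both numbers agree, as the slide's identity M'(0) = E(X) predicts.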
Interchanging the differentiation and expectation operators is legitimate:

    \frac{d}{dt}\left[\sum_x e^{tx} p(x)\right] = \sum_x \frac{d}{dt}\left[e^{tx} p(x)\right].

Example: the second moment of a r.v. X:

    M''(t) = \frac{d}{dt} M'(t) = \frac{d}{dt} E(X e^{tX}) = E\left(\frac{d}{dt} X e^{tX}\right) = E(X^2 e^{tX}), \qquad M''(0) = E(X^2).

In general, the nth derivative of M(t) is given by

    M^{(n)}(t) = E(X^n e^{tX}) \quad\text{and}\quad M^{(n)}(0) = E(X^n), \qquad n \ge 1.

Example 1.1. The moment generating function of the binomial distribution, X ~ Bin(n, p):

    M(t) = E(e^{tX}) = \sum_{k=0}^{n} e^{tk} \binom{n}{k} p^k (1-p)^{n-k} = \sum_{k=0}^{n} \binom{n}{k} (p e^t)^k (1-p)^{n-k} = (p e^t + 1 - p)^n.

The last step uses the binomial theorem:

    (x + y)^n = \sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k}.

Differentiation yields

    M'(t) = n (p e^t + 1 - p)^{n-1} p e^t,

and thus E(X) = M'(0) = np.

Second moment and variance for X ~ Bin(n, p):

    M''(t) = n(n-1)(p e^t + 1 - p)^{n-2} (p e^t)^2 + n (p e^t + 1 - p)^{n-1} (p e^t),

so E(X^2) = M''(0) = n(n-1)p^2 + np, and

    Var(X) = E(X^2) - [E(X)]^2 = n(n-1)p^2 + np - n^2 p^2 = np(1-p).

Example 1.2. For X a Poisson r.v. with mean λ:

    M(t) = E(e^{tX}) = \sum_{x=0}^{\infty} e^{tx} \frac{e^{-\lambda} \lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)}.

This result uses the Maclaurin series for the exponential function,

    f(x) = f(0) + f'(0)x + \frac{f''(0)x^2}{2!} + \cdots + \frac{f^{(n)}(0)x^n}{n!} + \cdots, \quad\text{so that}\quad e^k = \sum_{n=0}^{\infty} \frac{k^n}{n!}.

1st and 2nd moments for a Poisson r.v.:

    M'(t) = \lambda e^t e^{\lambda(e^t - 1)}, \qquad M''(t) = \lambda e^t e^{\lambda(e^t - 1)} + (\lambda e^t)^2 e^{\lambda(e^t - 1)},

so E(X) = M'(0) = λ and E(X^2) = M''(0) = λ^2 + λ, giving

    Var(X) = E(X^2) - [E(X)]^2 = \lambda^2 + \lambda - \lambda^2 = \lambda.

Example 1.3. Suppose that the moment generating function of a r.v.
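The binomial results above lend themselves to a quick numeric check: compare the closed-form mgf against the defining sum, then recover E(X) = np and Var(X) = np(1 − p) from finite-difference derivatives at t = 0. A sketch assuming Python's standard library; n = 10, p = 0.3, t = 0.7 are arbitrary illustrative values:

```python
import math

# Illustrative parameters for X ~ Bin(n, p); t0 is an arbitrary evaluation point.
n, p, t0 = 10, 0.3, 0.7

def M(t):
    # Closed form from Example 1.1: (p e^t + 1 - p)^n.
    return (p * math.exp(t) + 1 - p) ** n

# Defining sum: E(e^{tX}) = sum_k e^{tk} C(n,k) p^k (1-p)^{n-k}.
direct = sum(math.exp(t0 * k) * math.comb(n, k) * p**k * (1 - p)**(n - k)
             for k in range(n + 1))
assert math.isclose(M(t0), direct, rel_tol=1e-12)

# Moments from numerical derivatives of M at t = 0.
h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)           # ≈ M'(0) = np
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2   # ≈ M''(0) = n(n-1)p^2 + np
var = m2 - m1**2                        # ≈ np(1-p)

print(round(m1, 3), round(var, 3))  # → 3.0 2.1
```

The output matches np = 3 and np(1 − p) = 2.1 from the derivation.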
X is given by M(t) = e^{3(e^t - 1)}. What is P(X = 0)?

First, identify the probability distribution by looking at the formula for M(t). It has the form e^{\lambda(e^t - 1)} with λ = 3, so it looks like the mgf of a Poisson r.v. with mean 3. Since the mgf is unique for each random variable, X ~ Poisson(3), and therefore P(X = 0) = e^{-3} ≈ 0.0498.

2. Moment generating functions of a continuous random variable

The moment generating function M(t) of a continuous random variable X with density f(x) is defined for all real values of t by

    M(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx.

Remember the definition of the expected value of a function g(X) of a r.v.:

    E(g(X)) = \int g(x) f(x)\,dx.

As before, we call M(t) the moment generating function because all the moments of X can be obtained by successively differentiating M(t) and then evaluating the result at t = 0:

    M'(t) = \frac{d}{dt} E(e^{tX}) = E\left(\frac{d}{dt} e^{tX}\right) = E(X e^{tX}),

so

    M'(0) = E(X), \qquad M''(0) = E(X^2), \qquad M'''(0) = E(X^3).

Example 2.1. The moment generating function of the exponential distribution with parameter λ:

    M(t) = E(e^{tX}) = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x}\,dx = \lambda \int_0^{\infty} e^{-(\lambda - t)x}\,dx = \frac{\lambda}{\lambda - t}, \qquad t < \lambda.

E(X) and Var(X) for the exponential distribution with parameter λ:

    M'(t) = \frac{\lambda}{(\lambda - t)^2}, \qquad M''(t) = \frac{2\lambda}{(\lambda - t)^3}.

At t = 0, M'(0) = 1/λ = E(X) and M''(0) = 2/λ^2 = E(X^2), so

    Var(X) = E(X^2) - [E(X)]^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}.

3. Properties of moment generating functions

Let X, Y be independent r.v.'s. The moment generating function of the sum of independent random variables equals the product of the individual moment generating functions M_X(t) and M_Y(t).
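The exponential-mgf computation in Example 2.1 can also be cross-checked numerically, by comparing λ/(λ − t) against a direct numerical integration of the defining integral. A minimal sketch assuming Python's standard library; lam = 2.0, t = 0.5, and the integration cutoff are illustrative choices:

```python
import math

# Illustrative parameters: X ~ Exponential(lam), evaluated at t < lam.
lam, t = 2.0, 0.5

closed = lam / (lam - t)  # closed form from Example 2.1, valid for t < lam

# Midpoint-rule approximation of ∫_0^∞ e^{tx} · lam·e^{-lam·x} dx.
# The integrand decays like e^{-(lam - t)x}, so the tail beyond 40 is negligible.
N, upper = 200_000, 40.0
dx = upper / N
integral = sum(math.exp(t * x) * lam * math.exp(-lam * x) * dx
               for x in ((i + 0.5) * dx for i in range(N)))

# First moment from a numerical derivative of the closed form at t = 0.
def M(s):
    return lam / (lam - s)

h = 1e-6
m1 = (M(h) - M(-h)) / (2 * h)   # ≈ E(X) = 1/lam

print(round(closed, 4), round(integral, 4), round(m1, 4))  # → 1.3333 1.3333 0.5
```

The integral matches λ/(λ − t) = 4/3, and M'(0) matches E(X) = 1/λ = 0.5.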
Proof:

    M_{X+Y}(t) = E(e^{t(X+Y)}) = E(e^{tX} e^{tY}) = E(e^{tX}) E(e^{tY}) = M_X(t) M_Y(t),

where the third equality uses the independence of X and Y.

3.1. The moment generating function of the sum of independent r.v.'s is the product of the r.v.'s moment generating functions.
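Property 3.1 combined with the uniqueness of the mgf (stated in the introduction) is what lets us identify the distribution of a sum. A sketch of the idea in Python, using the Poisson mgf from Example 1.2; a = 1.5 and b = 2.5 are illustrative values:

```python
import math

# For independent X ~ Poisson(a) and Y ~ Poisson(b), Property 3.1 gives
# M_X(t) * M_Y(t) = e^{a(e^t - 1)} * e^{b(e^t - 1)} = e^{(a+b)(e^t - 1)},
# which is again a Poisson mgf -- so by uniqueness, X + Y ~ Poisson(a + b).
a, b = 1.5, 2.5

def poisson_mgf(lam, t):
    return math.exp(lam * (math.exp(t) - 1))

for t in (-1.0, -0.3, 0.2, 0.8):
    product = poisson_mgf(a, t) * poisson_mgf(b, t)
    combined = poisson_mgf(a + b, t)
    assert math.isclose(product, combined, rel_tol=1e-12)

print("product of mgfs matches the Poisson(a + b) mgf at every sampled t")
```

The same pattern of argument is what the outline's later items (sums of iid r.v.'s, the normal mgf) build on.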

