Copyright © 2014 John Wiley & Sons, Inc. All rights reserved.

Chapter 7: Point Estimation of Parameters and Sampling Distributions
Applied Statistics and Probability for Engineers, Sixth Edition
Douglas C. Montgomery and George C. Runger

Sec 7-4.2 Method of Maximum Likelihood

Maximum Likelihood Estimators

• Suppose that X is a random variable with probability distribution f(x; θ), where θ is a single unknown parameter. Let x1, x2, …, xn be the observed values in a random sample of size n. Then the likelihood function of the sample is:

  L(θ) = f(x1; θ) · f(x2; θ) · … · f(xn; θ)

• Note that the likelihood function is now a function of only the unknown parameter θ. The maximum likelihood estimator (MLE) of θ is the value of θ that maximizes the likelihood function L(θ).

Example 7-10: Bernoulli Distribution MLE

Let X be a Bernoulli random variable. The probability mass function is f(x; p) = p^x (1 − p)^(1−x), x = 0, 1, where p is the parameter to be estimated. The likelihood function of a random sample of size n is:

  L(p) = \prod_{i=1}^{n} p^{x_i}(1-p)^{1-x_i} = p^{x_1}(1-p)^{1-x_1} \cdot p^{x_2}(1-p)^{1-x_2} \cdots p^{x_n}(1-p)^{1-x_n} = p^{\sum_{i=1}^{n} x_i}(1-p)^{\,n-\sum_{i=1}^{n} x_i}

  \ln L(p) = \Big(\sum_{i=1}^{n} x_i\Big)\ln p + \Big(n-\sum_{i=1}^{n} x_i\Big)\ln(1-p)

  \frac{d\ln L(p)}{dp} = \frac{\sum_{i=1}^{n} x_i}{p} - \frac{n-\sum_{i=1}^{n} x_i}{1-p}

Equating this derivative to zero gives the MLE p̂ = (1/n) Σ xi, the sample proportion.

Example 7-11: Normal Distribution MLE for μ

Let X be a normal random variable with unknown mean μ and known variance σ². The likelihood function of a random sample of size n is:

  L(\mu) = \prod_{i=1}^{n} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x_i-\mu)^2/(2\sigma^2)} = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2}

  \ln L(\mu) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2

  \frac{d\ln L(\mu)}{d\mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu)

Equating this derivative to zero gives the MLE μ̂ = X̄.

Example 7-12: Exponential Distribution MLE

Let X be an exponential random variable with parameter λ.
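Before working through the exponential case, the Bernoulli result above (p̂ equals the sample proportion) can be checked numerically. A minimal sketch in Python with a small made-up 0/1 sample (not data from the text): a brute-force grid search over p should land at p̂.

```python
import math

# Hypothetical Bernoulli sample (made-up data, not from the text)
x = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]
n = len(x)

def log_lik(p):
    # ln L(p) = (sum x_i) ln p + (n - sum x_i) ln(1 - p)
    s = sum(x)
    return s * math.log(p) + (n - s) * math.log(1 - p)

p_hat = sum(x) / n                       # MLE: the sample proportion
grid = [k / 1000 for k in range(1, 1000)]
best = max(grid, key=log_lik)            # brute-force maximizer on a grid

print(p_hat, best)                       # the grid maximizer sits at p_hat
```

Maximizing ln L(p) rather than L(p) is standard practice: the logarithm is monotone, so both have the same maximizer, and the log form avoids products of many small numbers.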
The likelihood function of a random sample of size n is:

  L(\lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i} = \lambda^n e^{-\lambda \sum_{i=1}^{n} x_i}

  \ln L(\lambda) = n\ln\lambda - \lambda\sum_{i=1}^{n} x_i

  \frac{d\ln L(\lambda)}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i

Equating the above to zero we get

  \hat\lambda = \frac{n}{\sum_{i=1}^{n} x_i} = \frac{1}{\bar{X}} \quad \text{(same as the moment estimator)}

Example 7-13: Normal Distribution MLEs for μ & σ²

Let X be a normal random variable with both mean μ and variance σ² unknown. The likelihood function of a random sample of size n is:

  L(\mu,\sigma^2) = \prod_{i=1}^{n} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x_i-\mu)^2/(2\sigma^2)} = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2}

  \ln L(\mu,\sigma^2) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2

  \frac{\partial\ln L(\mu,\sigma^2)}{\partial\mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu) = 0

  \frac{\partial\ln L(\mu,\sigma^2)}{\partial(\sigma^2)} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^{n}(x_i-\mu)^2 = 0

so that

  \hat\mu = \bar{X} \quad \text{and} \quad \hat\sigma^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i-\bar{X})^2

Properties of an MLE

Under very general and nonrestrictive conditions, when the sample size n is large and if θ̂ is the MLE of the parameter θ, then:
(1) θ̂ is an approximately unbiased estimator for θ,
(2) the variance of θ̂ is nearly as small as the variance that could be obtained with any other estimator, and
(3) θ̂ has an approximate normal distribution.

Notes:
• Mathematical statisticians will often prefer MLEs because of these properties. Properties (1) and (2) state that MLEs are approximately MVUEs.
• To use MLEs, the distribution of the population must be known or assumed.

Invariance Property

Let θ̂1, θ̂2, …, θ̂k be the MLEs of the parameters θ1, θ2, …, θk. Then the MLE of any function h(θ1, θ2, …, θk) of these parameters is the same function h(θ̂1, θ̂2, …, θ̂k) of the estimators. This property is illustrated in Example 7-13.

Example 7-14: Invariance

For the normal distribution, the MLEs were μ̂ = X̄ and σ̂² = (1/n) Σ (xi − X̄)². By the invariance property, the MLE of the standard deviation σ = h(μ, σ²) = √σ² is

  \hat\sigma = \left[\frac{1}{n}\sum_{i=1}^{n}(x_i-\bar{X})^2\right]^{1/2}

Complications of the MLE Method

The method of maximum likelihood is an excellent technique; however, there are two complications:
1. It may not be easy to maximize the likelihood function, because the derivative set equal to zero may be difficult to solve algebraically.
2. It may not always be possible to use calculus methods directly to determine the maximum of L(θ).
The following example illustrates this.

Example 7-16: Gamma Distribution MLE-1

Let X1, X2, …, Xn be a random sample from a gamma distribution.
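Before working through the gamma example, the exponential result above, λ̂ = 1/X̄, can be verified the same way. A minimal sketch with made-up failure times (illustrative values only):

```python
import math

# Hypothetical exponential sample (made-up failure times)
x = [0.8, 1.6, 0.3, 2.1, 1.1, 0.5, 1.9, 0.7]
n = len(x)

def log_lik(lam):
    # ln L(lambda) = n ln(lambda) - lambda * sum x_i
    return n * math.log(lam) - lam * sum(x)

lam_hat = n / sum(x)                     # MLE: reciprocal of the sample mean
grid = [k / 1000 for k in range(1, 5000)]
best = max(grid, key=log_lik)            # grid maximizer of the log-likelihood

print(lam_hat, best)                     # the two values agree to grid resolution
```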
The log of the likelihood function is:

  \ln L(r,\lambda) = \ln \prod_{i=1}^{n} \frac{\lambda^r x_i^{\,r-1} e^{-\lambda x_i}}{\Gamma(r)} = nr\ln\lambda + (r-1)\sum_{i=1}^{n}\ln x_i - n\ln\Gamma(r) - \lambda\sum_{i=1}^{n} x_i

  \frac{\partial\ln L(r,\lambda)}{\partial r} = n\ln\lambda + \sum_{i=1}^{n}\ln x_i - n\,\frac{\Gamma'(r)}{\Gamma(r)}

  \frac{\partial\ln L(r,\lambda)}{\partial\lambda} = \frac{nr}{\lambda} - \sum_{i=1}^{n} x_i

Equating the above derivatives to zero we get

  \hat\lambda = \frac{\hat r}{\bar{X}} \quad \text{and} \quad n\ln\hat\lambda + \sum_{i=1}^{n}\ln x_i = n\,\frac{\Gamma'(\hat r)}{\Gamma(\hat r)}

There is no closed-form solution for r̂, so the equations must be solved numerically.

Example 7-16: Gamma Distribution MLE-2

Figure 7-11 Log likelihood for the gamma distribution using the failure time data. (a) Log likelihood surface. (b) Contour plot.

Sec 7-4.3 Bayesian Estimation of Parameters

Bayesian Estimation of Parameters-1

• The moment and likelihood methods interpret probabilities as relative frequencies; such probabilities are called objective probabilities.
• The random variable X has a probability distribution f(x | θ) that depends on the parameter θ.
• Additional information about θ can be summarized as f(θ), the prior distribution, with mean μ0 and variance σ0². Probabilities associated with f(θ) are subjective probabilities.
• The distribution of the sample given θ is f(x1, x2, …, xn | θ).
• The posterior distribution f(θ | x1, x2, …, xn) expresses our degree of belief regarding θ after observing the sample data.

Bayesian Estimation of Parameters-2

• Now the joint probability distribution of the sample and θ is
  f(x1, x2, …, xn, θ) = f(x1, x2, …, xn | θ) · f(θ)
• The marginal distribution of the sample is
  f(x_1, x_2, \dots, x_n) = \int_{-\infty}^{\infty} f(x_1, x_2, \dots, x_n, \theta)\,d\theta
  (with a sum replacing the integral when θ is discrete).
• The desired posterior distribution is
  f(\theta \mid x_1, x_2, \dots, x_n) = \frac{f(x_1, x_2, \dots, x_n, \theta)}{f(x_1, x_2, \dots, x_n)}
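The posterior formula above can be evaluated directly on a grid. A minimal sketch, assuming a normal likelihood with known σ = 1 and a normal prior for θ (all numerical values made up for illustration): the posterior mean should fall between the prior mean and the sample mean.

```python
import math

# Made-up data and prior (illustrative values only)
x = [4.2, 5.1, 4.8, 5.5, 4.9]           # sample, normal with known sigma = 1
mu0, sigma0 = 3.0, 1.0                   # prior: theta ~ N(mu0, sigma0^2)
sigma = 1.0

def prior(t):
    # f(theta), up to a normalizing constant
    return math.exp(-(t - mu0) ** 2 / (2 * sigma0 ** 2))

def lik(t):
    # f(x1, ..., xn | theta), up to a normalizing constant
    return math.exp(-sum((xi - t) ** 2 for xi in x) / (2 * sigma ** 2))

# Posterior on a grid: f(theta | x) = joint / marginal
grid = [k / 1000 for k in range(0, 10001)]       # theta in [0, 10]
joint = [lik(t) * prior(t) for t in grid]
marginal = sum(joint) * 0.001                    # Riemann sum of the joint
post = [j / marginal for j in joint]             # normalized posterior density

post_mean = sum(t * p for t, p in zip(grid, post)) * 0.001
print(post_mean)   # lies between the prior mean (3.0) and the sample mean (4.9)
```

Because both constants cancel in the ratio joint/marginal, the unnormalized prior and likelihood suffice; this numerical posterior mean matches the conjugate-normal formula (here (5·4.9 + 3.0)/6 ≈ 4.58) to grid accuracy.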
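Returning to Example 7-16: since the gamma score equations have no closed form, the log-likelihood must be maximized numerically. A sketch using made-up positive data (the text's failure-time data are not reproduced here), which profiles out λ via λ̂ = r/X̄ from the second score equation and then grid-searches over r:

```python
import math

# Hypothetical positive data (NOT the failure-time data from the text)
x = [1.4, 0.9, 2.3, 1.1, 3.0, 0.6, 1.8, 2.5]
n = len(x)
sum_x = sum(x)
sum_ln_x = sum(math.log(xi) for xi in x)

def log_lik(r, lam):
    # ln L(r, lam) = n r ln(lam) + (r-1) sum ln x_i - n ln Gamma(r) - lam sum x_i
    return (n * r * math.log(lam) + (r - 1) * sum_ln_x
            - n * math.lgamma(r) - lam * sum_x)

# Profile out lambda (lambda-hat = r / x-bar), then grid-search over r
x_bar = sum_x / n
r_grid = [k / 100 for k in range(10, 1001)]      # r in [0.1, 10.0]
r_hat = max(r_grid, key=lambda r: log_lik(r, r / x_bar))
lam_hat = r_hat / x_bar

print(r_hat, lam_hat)
```

Profiling reduces the two-dimensional search pictured in Figure 7-11 to one dimension; in practice a root-finder on the digamma equation or a general optimizer would replace the coarse grid.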