UT Dallas CS 6313 - ch07-3

Chapter 7: Point Estimation of Parameters and Sampling Distributions
Applied Statistics and Probability for Engineers, Sixth Edition
Douglas C. Montgomery and George C. Runger
Copyright © 2014 John Wiley & Sons, Inc. All rights reserved.

Maximum Likelihood Estimators (Sec 7-4.2)
• Suppose that X is a random variable with probability distribution f(x; θ), where θ is a single unknown parameter. Let x_1, x_2, ..., x_n be the observed values in a random sample of size n. Then the likelihood function of the sample is
L(θ) = f(x_1; θ) · f(x_2; θ) · ... · f(x_n; θ)
• Note that the likelihood function is now a function of only the unknown parameter θ. The maximum likelihood estimator (MLE) of θ is the value of θ that maximizes the likelihood function L(θ).

Example 7-10: Bernoulli Distribution MLE
Let X be a Bernoulli random variable. The probability mass function is f(x; p) = p^x (1 - p)^{1-x}, x = 0, 1, where p is the parameter to be estimated. The likelihood function of a random sample of size n is
L(p) = p^{x_1}(1-p)^{1-x_1} \cdot p^{x_2}(1-p)^{1-x_2} \cdots p^{x_n}(1-p)^{1-x_n} = p^{\sum_{i=1}^{n} x_i}(1-p)^{\,n-\sum_{i=1}^{n} x_i}
\ln L(p) = \left(\sum_{i=1}^{n} x_i\right)\ln p + \left(n - \sum_{i=1}^{n} x_i\right)\ln(1-p)
\frac{d\,\ln L(p)}{dp} = \frac{\sum_{i=1}^{n} x_i}{p} - \frac{n - \sum_{i=1}^{n} x_i}{1-p}
Equating this derivative to zero gives \hat{p} = \frac{1}{n}\sum_{i=1}^{n} x_i.

Example 7-11: Normal Distribution MLE for μ
Let X be a normal random variable with unknown mean μ and known variance σ². The likelihood function of a random sample of size n is
L(\mu) = \prod_{i=1}^{n} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x_i-\mu)^2/(2\sigma^2)} = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2}
\ln L(\mu) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2
\frac{d\,\ln L(\mu)}{d\mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu)
Equating this derivative to zero gives \hat{\mu} = \bar{X}.

Example 7-12: Exponential Distribution MLE
Let X be an exponential random variable with parameter λ. The likelihood function of a random sample of size n is
L(\lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i} = \lambda^{n} e^{-\lambda\sum_{i=1}^{n} x_i}
\ln L(\lambda) = n\ln\lambda - \lambda\sum_{i=1}^{n} x_i
\frac{d\,\ln L(\lambda)}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i
Equating the above to zero, we get \hat{\lambda} = \frac{n}{\sum_{i=1}^{n} x_i} = \frac{1}{\bar{X}} (the same as the moment estimator).

Example 7-13: Normal Distribution MLEs for μ and σ²
Let X be a normal random variable with both mean μ and variance σ² unknown. The likelihood function of a random sample of size n is
L(\mu, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x_i-\mu)^2/(2\sigma^2)} = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2}
\ln L(\mu, \sigma^2) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2
\frac{\partial\,\ln L(\mu, \sigma^2)}{\partial\mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu) = 0
\frac{\partial\,\ln L(\mu, \sigma^2)}{\partial\sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^{n}(x_i-\mu)^2 = 0
Solving these equations gives \hat{\mu} = \bar{X} and \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2.

Properties of an MLE
Under very general and nonrestrictive conditions, when the sample size n is large and \hat{\Theta} is the MLE of θ: (1) \hat{\Theta} is an approximately unbiased estimator for θ, (2) the variance of \hat{\Theta} is nearly as small as the variance that could be obtained with any other estimator, and (3) \hat{\Theta} has an approximate normal distribution.
Notes:
• Mathematical statisticians will often prefer MLEs because of these properties. Properties (1) and (2) essentially state that MLEs are approximately MVUEs.
• To use MLEs, the distribution of the population must be known or assumed.

Invariance Property
If \hat{\Theta}_1, \hat{\Theta}_2, \ldots, \hat{\Theta}_k are the MLEs of the parameters θ_1, θ_2, ..., θ_k, then the MLE of any function h(θ_1, θ_2, ..., θ_k) of these parameters is the same function h(\hat{\Theta}_1, \hat{\Theta}_2, \ldots, \hat{\Theta}_k) of the estimators. This property is illustrated in Example 7-14.

Example 7-14: Invariance
For the normal distribution, the MLEs were \hat{\mu} = \bar{X} and \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2. By the invariance property, the MLE of the standard deviation σ is the square root of the MLE of σ², that is, \hat{\sigma} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2}.
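As a quick numerical check (not part of the slides), the following sketch computes the closed-form MLEs of Examples 7-10 through 7-13 on simulated data and applies the invariance property of Example 7-14; NumPy and the chosen true parameter values are assumptions of this sketch, not part of the text.

import numpy as np

rng = np.random.default_rng(7)

# Example 7-10: Bernoulli(p). The MLE is the sample proportion.
x_bern = rng.binomial(1, 0.3, size=1000)
p_hat = x_bern.mean()

# Examples 7-11 and 7-13: Normal(mu, sigma^2). The MLEs are x-bar and (1/n) * sum((x_i - x_bar)^2).
x_norm = rng.normal(loc=5.0, scale=2.0, size=1000)
mu_hat = x_norm.mean()
sigma2_hat = np.mean((x_norm - mu_hat) ** 2)          # divisor n, not n - 1

# Example 7-12: Exponential(lambda). The MLE is 1 / x_bar.
x_exp = rng.exponential(scale=1.0 / 0.5, size=1000)   # true lambda = 0.5
lam_hat = 1.0 / x_exp.mean()

# Example 7-14 (invariance): the MLE of sigma is the square root of the MLE of sigma^2.
sigma_hat = np.sqrt(sigma2_hat)

print(p_hat, mu_hat, sigma2_hat, sigma_hat, lam_hat)

With large samples, the printed estimates should land close to the true values used in the simulation, which is consistent with property (1) above.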
Complications of the MLE Method
The method of maximum likelihood is an excellent technique; however, there are two complications:
1. It may not be easy to maximize the likelihood function, because the equation obtained by setting the derivative to zero may be difficult to solve algebraically.
2. It may not always be possible to use calculus methods directly to determine the maximum of L(θ).
The following example illustrates this.

Example 7-16: Gamma Distribution MLE-1
Let X_1, X_2, ..., X_n be a random sample from a gamma distribution. The log of the likelihood function is
\ln L(r, \lambda) = \ln\left(\prod_{i=1}^{n} \frac{\lambda^{r} x_i^{\,r-1} e^{-\lambda x_i}}{\Gamma(r)}\right) = nr\ln\lambda + (r-1)\sum_{i=1}^{n}\ln x_i - n\ln\Gamma(r) - \lambda\sum_{i=1}^{n} x_i
\frac{\partial\,\ln L(r, \lambda)}{\partial r} = n\ln\lambda + \sum_{i=1}^{n}\ln x_i - n\,\frac{\Gamma'(r)}{\Gamma(r)}
\frac{\partial\,\ln L(r, \lambda)}{\partial\lambda} = \frac{nr}{\lambda} - \sum_{i=1}^{n} x_i
Equating these derivatives to zero gives \hat{\lambda} = \hat{r}/\bar{X} together with an equation for \hat{r} that has no closed-form solution; the estimates must be obtained numerically, for example by maximizing the log likelihood directly.

Example 7-16: Gamma Distribution MLE-2
Figure 7-11 Log likelihood for the gamma distribution using the failure time data. (a) Log likelihood surface. (b) Contour plot.

Bayesian Estimation of Parameters-1 (Sec 7-4.3)
• The moment and likelihood methods interpret probabilities as relative frequencies; such probabilities are called objective probabilities.
• The random variable X has a probability distribution f(x|θ) that depends on the parameter θ.
• Additional information about θ can be summarized in f(θ), the prior distribution, with mean μ_0 and variance σ_0². Probabilities associated with f(θ) are subjective probabilities.
• The joint distribution of the sample, given θ, is f(x_1, x_2, ..., x_n | θ).
• The posterior distribution f(θ | x_1, x_2, ..., x_n) expresses our degree of belief regarding θ after observing the sample data.

Bayesian Estimation of Parameters-2
• Now the joint probability distribution of the sample and θ is
f(x_1, x_2, \ldots, x_n, \theta) = f(x_1, x_2, \ldots, x_n \mid \theta) \cdot f(\theta)
• The marginal distribution of the sample is
f(x_1, x_2, \ldots, x_n) = \begin{cases} \sum_{\theta} f(x_1, x_2, \ldots, x_n, \theta), & \theta \text{ discrete} \\ \int_{-\infty}^{\infty} f(x_1, x_2, \ldots, x_n, \theta)\, d\theta, & \theta \text{ continuous} \end{cases}
• The desired posterior distribution is
f(\theta \mid x_1, x_2, \ldots, x_n) = \frac{f(x_1, x_2, \ldots, x_n, \theta)}{f(x_1, x_2, \ldots, x_n)}
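Returning to Example 7-16, the gamma MLEs have no closed form, so a minimal numerical sketch is shown below. The failure-time data behind Figure 7-11 are not reproduced in this preview, so simulated data and SciPy's general-purpose optimizer are used as stand-ins; these choices are assumptions of the sketch, not the book's procedure.

import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(11)
x = rng.gamma(shape=2.5, scale=1.0 / 1.2, size=200)   # simulated sample; true r = 2.5, lambda = 1.2
n, sum_x, sum_log_x = len(x), x.sum(), np.log(x).sum()

def neg_log_likelihood(params):
    r, lam = params
    if r <= 0 or lam <= 0:
        return np.inf                                  # keep the optimizer inside the valid region
    # ln L(r, lambda) = n r ln(lambda) + (r - 1) sum(ln x_i) - n ln Gamma(r) - lambda sum(x_i)
    return -(n * r * np.log(lam) + (r - 1) * sum_log_x - n * gammaln(r) - lam * sum_x)

result = minimize(neg_log_likelihood, x0=[1.0, 1.0], method="Nelder-Mead")
r_hat, lam_hat = result.x
print(r_hat, lam_hat)                                  # estimates obtained by numerical maximization

The printed estimates should satisfy lam_hat ≈ r_hat / x.mean(), matching the first-order condition derived above.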
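The posterior construction above can also be carried out directly on a grid of θ values. The following sketch is a made-up Bernoulli example (not from the slides): it forms joint = likelihood × prior, integrates θ out to obtain the marginal, and divides to obtain the posterior, mirroring the formulas on this slide.

import numpy as np

x = np.array([1, 0, 1, 1, 0, 1, 1, 1])             # observed Bernoulli sample (illustrative)
theta = np.linspace(0.001, 0.999, 999)              # grid of candidate parameter values

prior = theta * (1.0 - theta)                       # prior f(theta), chosen for illustration
prior /= np.trapz(prior, theta)                     # normalize so the prior integrates to 1

likelihood = theta ** x.sum() * (1.0 - theta) ** (len(x) - x.sum())  # f(x_1, ..., x_n | theta)
joint = likelihood * prior                          # f(x_1, ..., x_n, theta)
marginal = np.trapz(joint, theta)                   # f(x_1, ..., x_n): integrate theta out
posterior = joint / marginal                        # f(theta | x_1, ..., x_n)

print(np.trapz(theta * posterior, theta))           # posterior mean as a Bayesian point estimate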

