Chapter 7: Point Estimation of Parameters and Sampling Distributions
Applied Statistics and Probability for Engineers, Sixth Edition
Douglas C. Montgomery and George C. Runger
Copyright © 2014 John Wiley & Sons, Inc. All rights reserved.

Standard Error of an Estimator
• The standard error of an estimator Θ̂ is its standard deviation, SE(Θ̂) = √V(Θ̂).
• If the standard error involves unknown parameters that can be estimated, substituting those estimates into SE(Θ̂) produces the estimated standard error.
• For the sample mean, SE(X̄) = σ/√n; when σ is unknown, the estimated standard error is s/√n.
Sec 7-3.3 Standard Error: Reporting a Point Estimate (slide 2)

Example 7-5: Thermal Conductivity
• These observations are 10 measurements of the thermal conductivity of Armco iron:

  xᵢ: 41.60, 41.48, 42.34, 41.95, 41.86, 42.18, 41.72, 42.26, 41.81, 42.04
  Mean x̄ = 41.924, standard deviation s = 0.284, standard error s/√10 = 0.0898

• Since σ is not known, we use s to compute the estimated standard error.
• Since the standard error is only about 0.2% of the mean, the mean estimate is fairly precise. We can be very confident that the true population mean lies within 41.924 ± 2(0.0898), that is, between 41.744 and 42.104.
Sec 7-3.3 Standard Error: Reporting a Point Estimate (slide 3)

Mean Squared Error
The mean squared error of an estimator Θ̂ of the parameter θ is MSE(Θ̂) = E(Θ̂ − θ)².
Conclusion: the mean squared error (MSE) of the estimator equals the variance of the estimator plus the squared bias:
  MSE(Θ̂) = V(Θ̂) + (bias)²
Sec 7-3.4 Mean Squared Error of an Estimator (slide 4)

Relative Efficiency
• The MSE is an important criterion for comparing two estimators. The relative efficiency of Θ̂₂ to Θ̂₁ is MSE(Θ̂₁)/MSE(Θ̂₂).
• If the relative efficiency is less than 1, we conclude that the first estimator is superior to the second.
Sec 7-3.4 Mean Squared Error of an Estimator (slide 5)
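The standard-error calculation of Example 7-5 can be reproduced numerically. A minimal Python sketch (Python is not part of the slides; this is purely illustrative, using the measurements listed above):

```python
import math

# Example 7-5: thermal conductivity measurements of Armco iron (from the slide)
x = [41.60, 41.48, 42.34, 41.95, 41.86, 42.18, 41.72, 42.26, 41.81, 42.04]

n = len(x)
mean = sum(x) / n
# sample standard deviation s (divisor n - 1)
s = math.sqrt(sum((xi - mean) ** 2 for xi in x) / (n - 1))
se = s / math.sqrt(n)                  # estimated standard error of the mean
lo, hi = mean - 2 * se, mean + 2 * se  # the slide's "mean ± 2 standard errors" range

print(mean, s, se)  # matches the slide: 41.924, 0.284, 0.0898
```

The divisor n − 1 in s matches the sample standard deviation reported on the slide; dividing by n instead would give the biased (method-of-moments) version discussed later in Sec 7-4.1.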
Optimal Estimator
• A biased estimator can be preferred to an unbiased estimator if it has a smaller MSE.
• Biased estimators are occasionally used in linear regression.
• An estimator whose MSE is smaller than that of any other estimator is called an optimal estimator.
Figure 7-8: A biased estimator that has a smaller variance than the unbiased estimator.
Sec 7-3.4 Mean Squared Error of an Estimator (slide 6)

Moments Defined
• Let X1, X2, …, Xn be a random sample from the probability distribution f(x), where f(x) can be either a:
  – discrete probability mass function, or
  – continuous probability density function.
• The kth population moment (or distribution moment) is E(X^k), k = 1, 2, ….
• The kth sample moment is (1/n) Σᵢ Xᵢ^k, k = 1, 2, ….
• If k = 1 (called the first moment), then:
  – the population moment is μ, and
  – the sample moment is x̄.
• The sample mean is therefore the moment estimator of the population mean.
Sec 7-4.1 Method of Moments (slide 7)

Moment Estimators
Let X1, X2, …, Xn be a random sample from a distribution with m unknown parameters. The moment estimators are found by equating the first m population moments to the first m sample moments and solving for the unknown parameters.
Sec 7-4.1 Method of Moments (slide 8)

Example 7-8: Normal Distribution Moment Estimators
Suppose that X1, X2, …, Xn is a random sample from a normal distribution with parameters μ and σ², where E(X) = μ and E(X²) = μ² + σ². Equating population and sample moments:

  μ = (1/n) Σᵢ Xᵢ,  so  μ̂ = X̄
  μ² + σ² = (1/n) Σᵢ Xᵢ²,  so
  σ̂² = (1/n) Σᵢ Xᵢ² − X̄² = (Σᵢ Xᵢ² − nX̄²)/n = (1/n) Σᵢ (Xᵢ − X̄)²   (biased)

Sec 7-4.1 Method of Moments (slide 9)

Example 7-9: Gamma Distribution Moment Estimators (1)
Suppose that X1, X2, …, Xn is a random sample from a gamma distribution with parameters r and λ, where E(X) = r/λ and E(X²) = r(r + 1)/λ².

  r/λ is the mean and r/λ² is the variance, so E(X²) − [E(X)]² = r/λ².

Equating E(X) = X̄ and E(X²) = (1/n) Σᵢ Xᵢ², then solving for r and λ:

  r̂ = X̄² / [ (1/n) Σᵢ Xᵢ² − X̄² ]
  λ̂ = X̄ / [ (1/n) Σᵢ Xᵢ² − X̄² ]

Sec 7-4.1 Method of Moments (slide 10)
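The normal moment estimators of Example 7-8 are easy to check numerically, including the identity (1/n)ΣXᵢ² − X̄² = (1/n)Σ(Xᵢ − X̄)² used on the slide. A short Python sketch (the sample values here are hypothetical, chosen only for illustration):

```python
import math

# Example 7-8: moment estimators for a normal sample.
# Equate the first two sample moments to E(X) = mu and E(X^2) = mu^2 + sigma^2.
x = [4.2, 3.9, 5.1, 4.7, 4.4, 5.0]     # hypothetical sample, for illustration only

n = len(x)
m1 = sum(xi for xi in x) / n           # first sample moment
m2 = sum(xi ** 2 for xi in x) / n      # second sample moment

mu_hat = m1                            # moment estimator of mu
sigma2_hat = m2 - m1 ** 2              # moment estimator of sigma^2 (biased: divisor n)

# Identity from the slide: (1/n)*sum(xi^2) - xbar^2 == (1/n)*sum((xi - xbar)^2)
alt = sum((xi - mu_hat) ** 2 for xi in x) / n
assert math.isclose(sigma2_hat, alt)
```

Note that sigma2_hat uses divisor n, which is why the slide labels it biased; the usual sample variance s² divides by n − 1 instead.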
Example 7-9: Gamma Distribution Moment Estimators (2)
Using the time-to-failure data in the table, we can estimate the parameters of the gamma distribution.

  xᵢ:     11.96     5.03     67.40      16.07     31.50     7.73     11.10     22.38
  xᵢ²:   143.0416  25.3009  4542.7600  258.2449  992.2500  59.7529  123.2100  500.8644

  x̄ = 21.646,  Σᵢ xᵢ² = 6645.4247,  n = 8

  r̂ = x̄² / [ (1/n) Σᵢ xᵢ² − x̄² ] = (21.646)² / [ (1/8)(6645.4247) − (21.646)² ] = 1.29
  λ̂ = x̄ / [ (1/n) Σᵢ xᵢ² − x̄² ] = 21.646 / [ (1/8)(6645.4247) − (21.646)² ] = 0.0598

Sec 7-4.1 Method of Moments (slide 11)
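The gamma moment estimates of Example 7-9 can be verified directly from the failure-time data in the table. A minimal Python sketch (illustrative only, implementing the formulas from the previous slide):

```python
# Example 7-9: gamma moment estimators from the time-to-failure data.
x = [11.96, 5.03, 67.40, 16.07, 31.50, 7.73, 11.10, 22.38]

n = len(x)
xbar = sum(x) / n                      # first sample moment: 21.646
m2 = sum(xi ** 2 for xi in x) / n      # second sample moment: 6645.4247 / 8
denom = m2 - xbar ** 2                 # (1/n)*sum(xi^2) - xbar^2

r_hat = xbar ** 2 / denom              # matches the slide: r-hat ~ 1.29
lam_hat = xbar / denom                 # matches the slide: lambda-hat ~ 0.0598

print(xbar, r_hat, lam_hat)
```

With the gamma mean r/λ, a quick sanity check is r̂/λ̂ = x̄, which holds by construction of the two formulas.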