Econ 3120 1st Edition
Exam 1 Study Guide (Lectures 1-12)

1 Random sampling

This lecture discusses how to estimate the mean of a population of interest. Normally it is impractical or impossible to examine the whole population; if we could, we would simply take the mean of the population and be done. Given that we can only examine a sample, we have to use statistical inference to (1) estimate the parameters that we care about (in this case, the mean) and (2) test hypotheses about these parameters.

To estimate the mean (or any other parameter of interest), we'll focus on random samples from the population. Formally, a random sample is a set of independent and identically distributed (i.i.d.) random variables $Y_1, Y_2, \ldots, Y_n$ that share a probability density function $f(y)$. For the first part of the lecture we're going to assume that the population is distributed $\text{Normal}(\mu, \sigma^2)$, an assumption we will relax when we discuss large-sample properties later in the lecture.

2 Estimators

Once we have our random sample, we can use it to estimate the mean (or any other parameter of interest). An estimator is a rule, or a function, that uses the outcome of random sampling to assign a value to the parameter based on the sample: $\hat{\theta} = h(Y_1, Y_2, \ldots, Y_n)$. (We often write estimates by putting a hat on top of the parameter, e.g. $\hat{\mu}$.) For the mean, the most obvious estimator is the sample mean, or sample average:

$$\hat{\mu}_1 = \bar{Y} = \frac{1}{n} \sum_{i=1}^{n} Y_i$$

Note that this is just one possible way to estimate the mean. We could estimate the mean by simply looking at the first observation, i.e. $\hat{\mu}_2 = Y_1$. In fact, an estimator doesn't even need to depend on the random sample at all: $\hat{\mu}_3 = 4$ is a perfectly valid estimator. We'll see, though, that the sample mean is preferable because it is unbiased and efficient.

Estimators, like the samples they come from, have their own distributions, called sampling distributions. Estimators have distributions since they are simply functions of realizations of random variables, and we have seen that functions of random variables have distributions.

2.1 Unbiasedness

An estimator is unbiased if its expectation equals the parameter of interest, that is, $E(\hat{\theta}) = \theta$. The bias of an estimator is computed as the difference between the expectation and the true parameter:

$$\text{Bias}(\hat{\theta}) = E(\hat{\theta}) - \theta$$

Example: Show that $\hat{\mu}_1 = \bar{Y}$ and $\hat{\mu}_2 = Y_1$ are unbiased estimates of the mean $\mu$, and that $\hat{\mu}_3 = 4$ is biased.

2.2 Sample variance and sampling variance of estimators

In order to conduct inference, that is, to say something about how accurate our estimator is, we need to be able to estimate the variance of a population. We'll start by introducing the sample variance, an unbiased estimator of the population variance. This is given by

$$s^2 = \frac{1}{n-1} \sum_{i=1}^{n} (Y_i - \bar{Y})^2$$

Note that in order for $s^2$ to be unbiased, we need to divide by $n-1$ instead of $n$. The reason for this is a bit subtle, but it comes from the fact that $\bar{Y}$ is an estimate and not the true parameter. If we knew $\mu$, then an unbiased estimate of the variance would be given by $\frac{1}{n} \sum (Y_i - \mu)^2$.

As suggested above, the sampling variance is the variance of an estimator, which is based on a sample.

Example: What is the sampling variance of our estimators $\hat{\mu}_1 = \bar{Y}$, $\hat{\mu}_2 = Y_1$, and $\hat{\mu}_3 = 4$?

The sample analog of the standard deviation for estimators is called the standard error, which we denote

$$se(\hat{\theta}) = \sqrt{\widehat{\text{Var}}(\hat{\theta})}$$

Note that if we have the sampling variance of $\bar{Y}$, we can fully characterize its distribution. Since $\bar{Y}$ is simply a linear combination of normally distributed random variables, its distribution will be $\text{Normal}(\mu, \sigma^2/n)$. Note that this implies that

$$\frac{\bar{Y} - \mu}{\sigma / \sqrt{n}} \sim N(0, 1),$$

a fact that we will use later on.
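The unbiasedness and sampling-variance claims above are easy to check by simulation. Below is a minimal Monte Carlo sketch in Python (assuming NumPy is available; the parameter values mu = 5, sigma = 2, and n = 30 are illustrative choices, not values from the notes). It draws many samples of size n from a Normal(mu, sigma^2) population and checks that E(Ybar) = mu, Var(Ybar) = sigma^2/n, and that s^2 is unbiased for sigma^2 only with the n-1 divisor.

```python
# Monte Carlo check of sections 2.1-2.2 (illustrative sketch, not from the notes).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 5.0, 2.0, 30, 100_000

samples = rng.normal(mu, sigma, size=(reps, n))  # reps random samples of size n
ybar = samples.mean(axis=1)                      # sample mean of each sample
s2 = samples.var(axis=1, ddof=1)                 # divides by n-1 (unbiased)
s2_biased = samples.var(axis=1, ddof=0)          # divides by n (biased)

print("E[Ybar]    ~", ybar.mean(), " (truth:", mu, ")")
print("Var(Ybar)  ~", ybar.var(), " (truth: sigma^2/n =", sigma**2 / n, ")")
print("E[s^2], n-1 divisor ~", s2.mean(), " (truth:", sigma**2, ")")
print("E[s^2], n divisor   ~", s2_biased.mean(), " (biased low)")
```

The last line illustrates the footnote's point: dividing by n instead of n-1 systematically underestimates the population variance.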
2.3 Efficiency

The relative efficiency of an estimator is a measure of how close our estimate will be to the true parameter. We measure efficiency by comparing variances. Suppose we have two unbiased estimators $\hat{\theta}_1$ and $\hat{\theta}_2$. The estimator $\hat{\theta}_1$ is more efficient when $\text{Var}(\hat{\theta}_1) < \text{Var}(\hat{\theta}_2)$.

Example: Which is a more efficient estimator of the mean, $\bar{Y}$ or $Y_1$?

We usually care about unbiasedness first, then efficiency. That is, we typically compare efficiency among unbiased estimators: $\hat{\mu}_3 = 4$ has a variance of 0, but it is not necessarily preferable to $\hat{\mu}_1 = \bar{Y}$, since $\hat{\mu}_3$ will almost certainly be biased.

3 Large-sample properties of estimators

Now let's introduce a few concepts that help us describe the behavior of estimators as the sample size becomes large. For our purposes we'll define "large" as $n \geq 30$, although the threshold depends on the estimator and how accurate you want to be.

3.1 Consistency

Consistency tells us whether an estimator converges to the true parameter as the sample size grows large. An estimator $\hat{\theta}$ is consistent if, for every $\epsilon > 0$,

$$P(|\hat{\theta} - \theta| > \epsilon) \to 0 \quad \text{as } n \to \infty.$$

A shorthand way of saying this is $\text{plim}\, \hat{\theta} = \theta$. If $\hat{\theta}$ is consistent, we say it converges in probability to $\theta$. The formula above is a bit complicated, but it implies that an estimator becomes arbitrarily close to the true parameter as the sample size grows large. "Arbitrarily close" means that you can set $\epsilon$ to be as small as you want, and if the estimator is consistent, at a large enough sample size you will get within $\epsilon$ of the true parameter.

One important consistency result is the law of large numbers: if $Y_1, Y_2, \ldots, Y_n$ are i.i.d. random variables with mean $\mu$, then $\text{plim}\, \bar{Y} = \mu$.

Note that unbiasedness and consistency are related concepts, but one does not necessarily imply the other: $Y_1$ is unbiased but not consistent, and we'll see below that there are estimators that are consistent but not unbiased.

3.1.1 Properties of plims

It turns out that plims are somewhat easier to work with than expectations because they pass through nonlinear functions. Suppose we have two estimators $\hat{\theta}_1$ and $\hat{\theta}_2$. Then:

1. $\text{plim}\, g(\hat{\theta}_1) = g(\text{plim}\, \hat{\theta}_1)$ for any continuous function $g$
2. $\text{plim}(\hat{\theta}_1 + \hat{\theta}_2) = \text{plim}\, \hat{\theta}_1 + \text{plim}\, \hat{\theta}_2$
3. $\text{plim}(\hat{\theta}_1 \hat{\theta}_2) = \text{plim}\, \hat{\theta}_1 \cdot \text{plim}\, \hat{\theta}_2$
4. $\text{plim}(\hat{\theta}_1 / \hat{\theta}_2) = \text{plim}\, \hat{\theta}_1 / \text{plim}\, \hat{\theta}_2$ (provided $\text{plim}\, \hat{\theta}_2 \neq 0$)

Example 1: Is $1/\bar{Y}$ an unbiased and/or consistent estimator for $1/\mu_Y$?

Example 2: Is $s^2 = \frac{1}{n-1} \sum (Y_i - \bar{Y})^2$ a consistent estimator of $\sigma^2$?

3.2 Asymptotic normality and the central limit theorem

While the probability limit tells us whether an estimator converges to the true parameter for a large sample size, we'd also like to know something about the behavior of the estimator's distribution as $n$ becomes large. Estimators are asymptotically normal if their distribution becomes normal as the sample size gets large. Formally, if for a sequence of random variables $Z_1, \ldots, Z_n$,

$$P(Z_n \leq z) \to \Phi(z) \quad \text{as } n \to \infty,$$

then $Z_n$ has an asymptotic standard normal distribution, written $Z_n \overset{a}{\sim} N(0, 1)$. We can also say that $Z_n$ converges in distribution to a standard normal. The central limit theorem states that regardless of the underlying distribution of the $Y_i$, the standardized sample mean $\frac{\bar{Y} - \mu}{\sigma / \sqrt{n}}$ has an asymptotic standard normal distribution.
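As a quick illustration of the central limit theorem, here is a hedged sketch in Python (again assuming NumPy; the Exponential(1) population is an illustrative choice, picked because it is clearly non-normal and skewed). For each n it draws many samples, standardizes the sample mean, and checks how often $|Z_n| \leq 1.96$; under N(0, 1) this should be about 95%, and the simulated coverage approaches that as n grows.

```python
# CLT sketch: the standardized mean of a skewed population behaves like N(0,1).
import numpy as np

rng = np.random.default_rng(1)
mu = sigma = 1.0          # an Exponential(1) population has mean 1 and sd 1
reps = 100_000

for n in (2, 10, 30, 200):
    samples = rng.exponential(scale=1.0, size=(reps, n))
    z = (samples.mean(axis=1) - mu) / (sigma / np.sqrt(n))
    # Under N(0,1), about 95% of draws fall in [-1.96, 1.96]
    coverage = np.mean(np.abs(z) <= 1.96)
    print(f"n={n:4d}: P(|Z_n| <= 1.96) ~ {coverage:.3f}  (N(0,1) gives 0.950)")
```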
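Finally, a sketch of the contrast drawn in section 3.1 between unbiasedness and consistency (same NumPy assumption, illustrative parameter values): both $\bar{Y}$ and $Y_1$ are unbiased for $\mu$, but only for $\bar{Y}$ does $P(|\hat{\mu} - \mu| > \epsilon)$ shrink toward zero as n grows.

```python
# Consistency sketch: Ybar converges in probability to mu; Y_1 does not.
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, eps, reps = 5.0, 2.0, 0.5, 50_000

for n in (10, 100, 1000):
    samples = rng.normal(mu, sigma, size=(reps, n))
    ybar, y1 = samples.mean(axis=1), samples[:, 0]
    print(f"n={n:5d}: P(|Ybar-mu|>{eps}) ~ {np.mean(np.abs(ybar - mu) > eps):.3f}, "
          f"P(|Y1-mu|>{eps}) ~ {np.mean(np.abs(y1 - mu) > eps):.3f}")
```

The first probability falls toward zero as n grows, while the second stays roughly constant, which is exactly the sense in which $Y_1$ is unbiased but not consistent.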