Econ 3120 1st Edition
Lecture 5

Outline of Last Lecture
I. Marginal Distributions
II. Expectations and Variance
III. Covariance

Outline of Current Lecture
IV. Asymptotic Normality
V. Central Limit Theorem
VI. Distribution of Difference in Means

Current Lecture

3.2 Asymptotic Normality and the Central Limit Theorem

While the probability limit tells us whether an estimator converges to the true parameter as the sample size gets large, we would also like to know something about the behavior of the estimator's distribution as n becomes large. Estimators are asymptotically normal if their distribution becomes normal as the sample size gets large. Formally, if for a sequence of random variables $Z_1, \ldots, Z_n$,

$$P(Z_n \le z) \rightarrow \Phi(z) \quad \text{as } n \rightarrow \infty,$$

then $Z_n$ has an asymptotic standard normal distribution, written $Z_n \stackrel{a}{\sim} N(0,1)$. We can also say that $Z_n$ converges in distribution to a standard normal.

The central limit theorem states that, regardless of the underlying distribution, a standardized sample mean has a standard normal distribution as the sample size gets large. This very important result comes in handy for conducting inference on means of random samples with non-normal or unknown distributions. Formally, the central limit theorem states that if $Y_1, Y_2, \ldots, Y_n$ are a random sample with mean $\mu$ and variance $\sigma^2$, then

$$Z_n = \frac{\bar{Y}_n - \mu}{\sigma/\sqrt{n}}$$

approaches a standard normal distribution as the sample size gets large.

A remark: the sample mean $\bar{Y}_n$ itself converges to the true mean $\mu$ in large samples, with variance going to zero. Similarly, $\bar{Y}_n - \mu$ converges to a single point (zero) in large samples. Dividing by $\sigma/\sqrt{n}$ keeps the variance of $Z_n$ constant (equal to 1) as the sample size gets large.

The central limit theorem also applies when $\sigma^2$ must be estimated (i.e., in most cases). In particular,

$$Z_n = \frac{\bar{Y}_n - \mu}{s_n/\sqrt{n}}$$

has an asymptotic standard normal distribution as well. This follows because $s_n$ is a consistent estimator of $\sigma$: as the sample size gets large, $s_n$ converges to $\sigma$.

Example: Monte Carlo simulation. The simulation described below illustrates the central limit theorem using samples drawn from a Bernoulli distribution. The Bernoulli distribution is a discrete distribution describing a random variable that takes the value 1 with probability $p$ and 0 with probability $1-p$. First, let's compute the variance of a Bernoulli random variable. Since $E(Y) = p$ and $E(Y^2) = p$,

$$Var(Y) = E(Y^2) - [E(Y)]^2 = p - p^2 = p(1-p).$$

Back to the simulation: I chose $p = 0.1$. I then estimated a distribution for $\bar{Y}_n$ for different values of $n$ by creating 500 samples of size $n$ from the Bernoulli distribution with mean 0.1 and plotting the resulting standardized means as histograms. That is, for each sample I computed the standardized random variable

$$Z_n = \frac{\bar{Y}_n - \mu}{\sigma/\sqrt{n}},$$

where $\mu = p = 0.1$ and $\sigma = \sqrt{0.1(1-0.1)}$. Note that I could have replaced $\sigma$ with $s = \sqrt{\frac{n}{n-1}\,\bar{Y}_n(1-\bar{Y}_n)}$. (A code sketch of this simulation appears at the end of these notes.)

3.3 Distribution of Difference in Means

One important implication of the central limit theorem is that it can be applied to differences in means. Suppose you have two i.i.d. samples $X_1, \ldots, X_{n_1}$ and $Y_1, \ldots, Y_{n_2}$ of sizes $n_1$ and $n_2$, respectively. Then

$$Z = \frac{\bar{X} - \bar{Y} - (\mu_x - \mu_y)}{\sqrt{Var(\bar{X} - \bar{Y})}} = \frac{\bar{X} - \bar{Y} - (\mu_x - \mu_y)}{\sqrt{\frac{s_x^2}{n_1} + \frac{s_y^2}{n_2}}}$$

is asymptotically standard normal. Note that to be able to use these asymptotic properties, we need both $n_1$ and $n_2$ to be at least 30.
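To make the two-sample statistic concrete, here is a minimal sketch in Python (assuming numpy is available). The simulated exponential data and the hypothesized difference of zero are illustrative assumptions, not part of the lecture; the function simply evaluates the $Z$ formula from section 3.3 using the sample variances.

```python
import numpy as np

def two_sample_z(x, y, mu_diff=0.0):
    """Asymptotic Z statistic for Xbar - Ybar, centered at the
    hypothesized difference mu_x - mu_y = mu_diff."""
    n1, n2 = len(x), len(y)
    xbar, ybar = x.mean(), y.mean()
    s2x, s2y = x.var(ddof=1), y.var(ddof=1)   # sample variances s_x^2, s_y^2
    se = np.sqrt(s2x / n1 + s2y / n2)          # estimated sd of Xbar - Ybar
    return (xbar - ybar - mu_diff) / se

# Illustrative use with simulated (non-normal) data; both samples exceed 30:
rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=200)   # population mean 2
y = rng.exponential(scale=2.0, size=150)   # same mean, so Z should be near 0
print(two_sample_z(x, y))
```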
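The histograms from the Monte Carlo example in section 3.2 are not reproduced here. Below is a minimal sketch of that exercise, assuming Python with numpy and matplotlib; the particular sample sizes in `sample_sizes` are illustrative choices, not the ones used in class.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

p = 0.1                        # Bernoulli success probability, as in the lecture
mu = p                         # true mean
sigma = np.sqrt(p * (1 - p))   # true standard deviation, sqrt(p(1-p))
reps = 500                     # number of Monte Carlo samples per n
sample_sizes = [5, 30, 100, 1000]   # illustrative values of n

fig, axes = plt.subplots(1, len(sample_sizes), figsize=(16, 3))
for ax, n in zip(axes, sample_sizes):
    # Draw 500 independent Bernoulli(p) samples of size n and take each sample mean.
    draws = rng.binomial(1, p, size=(reps, n))
    ybar = draws.mean(axis=1)
    # Standardize: Z_n = (Ybar_n - mu) / (sigma / sqrt(n)).
    z = (ybar - mu) / (sigma / np.sqrt(n))
    ax.hist(z, bins=30, density=True)
    ax.set_title(f"n = {n}")
    ax.set_xlabel("standardized sample mean")

plt.tight_layout()
plt.show()
```

As n grows, the histograms of the standardized means should look increasingly like a standard normal density, even though the underlying Bernoulli distribution is highly skewed when p = 0.1.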