Name: ______    ID: ______

Homework for 1/22 and 1/24

1. [§6-4] Let $X_1, X_2, \ldots, X_8$ be i.i.d. normal random variables with mean $\mu$ and standard deviation $\sigma$. Define
\[ T = \frac{\bar{X} - \mu}{\sqrt{S^2/n}}, \]
where $\bar{X}$ is the sample mean and $S^2$ is the sample variance.
(a) Find $\tau_1$ such that $P(|T| < \tau_1) = 0.9$; and
(b) find $\tau_2$ such that $P(T > \tau_2) = 0.05$.

(a) First, we notice that $T$ follows a $t_7$ distribution. Since the $t$ distribution is symmetric about 0, we have
\[ 0.9 = P(|T| < \tau_1) = P(T < \tau_1) - P(T < -\tau_1) = 2P(T < \tau_1) - 1. \]
Thus $P(T < \tau_1) = 0.95$ and $\tau_1 = 1.895$.

(b) Since $P(T > \tau_2) = 0.05$, we have $P(T < \tau_2) = 0.95$. Thus $\tau_2 = 1.895$.

2. [§8-4] Suppose that $X$ is a discrete random variable with
\[ P(X = 0) = \tfrac{2}{3}\theta, \quad P(X = 1) = \tfrac{1}{3}\theta, \quad P(X = 2) = \tfrac{2}{3}(1 - \theta), \quad P(X = 3) = \tfrac{1}{3}(1 - \theta), \]
where $0 \le \theta \le 1$ is a parameter. The following 10 independent observations were taken from such a distribution: $(3, 0, 2, 1, 3, 2, 1, 0, 2, 1)$.
(a) Find the method of moments estimate of $\theta$.
(b) What is the maximum likelihood estimate of $\theta$?

(a) In general, let $X_1, X_2, \ldots, X_n$ be a random sample drawn from this distribution. Since
\[ \mu_1 = E[X] = 0 \cdot \tfrac{2}{3}\theta + 1 \cdot \tfrac{1}{3}\theta + 2 \cdot \tfrac{2}{3}(1 - \theta) + 3 \cdot \tfrac{1}{3}(1 - \theta) = \tfrac{7}{3} - 2\theta, \]
we have
\[ \theta = \tfrac{7}{6} - \tfrac{1}{2}\mu_1, \]
and the MME for $\theta$ is
\[ \hat{\theta} = \tfrac{7}{6} - \tfrac{1}{2}\bar{X}, \quad \text{where } \bar{X} = \tfrac{1}{n}\sum_{i=1}^n X_i. \]
In this case, we have $\bar{X} = 3/2$ and
\[ \hat{\theta} = \tfrac{7}{6} - \tfrac{1}{2} \cdot \tfrac{3}{2} = \tfrac{5}{12}. \]

(b) The likelihood function of the sample $(3, 0, 2, 1, 3, 2, 1, 0, 2, 1)$ is
\[ \mathrm{lik}(\theta) = f(3, 0, 2, 1, 3, 2, 1, 0, 2, 1 \mid \theta) = \left(\tfrac{2}{3}\theta\right)^2 \left(\tfrac{1}{3}\theta\right)^3 \left(\tfrac{2}{3}(1 - \theta)\right)^3 \left(\tfrac{1}{3}(1 - \theta)\right)^2 = \tfrac{2^5}{3^{10}}\, \theta^5 (1 - \theta)^5. \]
Since
\[ l(\theta) = \log \mathrm{lik}(\theta) = 5\log\tfrac{2}{3} + 5\log\tfrac{1}{3} + 5\log\theta + 5\log(1 - \theta), \]
\[ l'(\theta) = \frac{5}{\theta} - \frac{5}{1 - \theta}, \qquad l''(\theta) = -\frac{5}{\theta^2} - \frac{5}{(1 - \theta)^2} < 0, \]
we have $l(\tfrac{1}{2}) = \max_{0 \le \theta \le 1} l(\theta)$. Thus in this case, the MLE is $\tilde{\theta} = \tfrac{1}{2}$.

3. [§8-7] Suppose that $X$ follows a geometric distribution,
\[ P(X = k) = p(1 - p)^{k-1}, \]
and assume an i.i.d. sample of size $n$.
(a) Find the method of moments estimate of $p$.
(b) Find the MLE of $p$.
(The moments of the geometric distribution can be found on p. 117.)

(a) In general, let $X_1, X_2, \ldots, X_n$ be a random sample drawn from this distribution. Since
\[ \mu_1 = E[X] = \frac{1}{p}, \]
we have $p = 1/\mu_1$, and the MME for $p$ is
\[ \hat{p} = \frac{1}{\bar{X}}, \quad \text{where } \bar{X} = \tfrac{1}{n}\sum_{i=1}^n X_i. \]

(b) The log-likelihood function of a sample $X_1, X_2, \ldots, X_n$ is
\[ l(p) = \log f(X_1, X_2, \ldots, X_n \mid p) = \log \prod_{i=1}^n p(1 - p)^{X_i - 1} = \sum_{i=1}^n \left( \log p + X_i \log(1 - p) - \log(1 - p) \right) = n\log p - n\log(1 - p) + n\bar{X}\log(1 - p). \]
Since
\[ \frac{d}{dp} l(p) = \frac{n}{p} + \frac{n}{1 - p} - \frac{n\bar{X}}{1 - p}, \]
we have $l'(1/\bar{X}) = 0$. Moreover, since
\[ \frac{d^2}{dp^2} l(p) = -\frac{n}{p^2} + \frac{n}{(1 - p)^2} - \frac{n\bar{X}}{(1 - p)^2}, \]
it follows that $l''(p) < 0$ for all $0 < p < 1$ (since $\bar{X} \ge 1$), and
\[ \max_{0 \le p \le 1} l(p) = l\!\left(\frac{1}{\bar{X}}\right). \]
Thus in this case, the MLE is $\tilde{p} = 1/\bar{X}$.

4. [§8-16] Consider an i.i.d. sample of random variables with density function
\[ f(x \mid \sigma) = \frac{1}{2\sigma} \exp\left( -\frac{|x|}{\sigma} \right), \quad -\infty < x < \infty, \ \sigma > 0. \]
(This is called the Laplace distribution, or double exponential distribution.)
(a) Find the method of moments estimate of $\sigma$.
(b) Find the maximum likelihood estimate of $\sigma$.

(a) In general, let $X_1, X_2, \ldots, X_n$ be a random sample drawn from this distribution. Since
\[ \mu_1 = E[X] = \int_{-\infty}^{\infty} x f(x)\,dx = \int_{-\infty}^{\infty} x \cdot \frac{1}{2\sigma} \exp\left( -\frac{|x|}{\sigma} \right) dx = 0 \]
(the integrand is an odd function), and
\[ \mu_2 = E[X^2] = \int_{-\infty}^{\infty} x^2 \cdot \frac{1}{2\sigma} \exp\left( -\frac{|x|}{\sigma} \right) dx = 2\int_0^{\infty} x^2 \cdot \frac{1}{2\sigma} \exp\left( -\frac{x}{\sigma} \right) dx \quad \text{(even integrand)} = \int_0^{\infty} x^2 \cdot \frac{1}{\sigma} \exp\left( -\frac{x}{\sigma} \right) dx = E[Y^2] = 2\sigma^2, \]
where $Y \sim \mathrm{Exp}(1/\sigma)$. Thus we have
\[ \sigma = \sqrt{\frac{\mu_2}{2}}, \]
and the MME for $\sigma$ is
\[ \hat{\sigma} = \sqrt{\frac{\hat{\mu}_2}{2}}, \quad \text{where } \hat{\mu}_2 = \tfrac{1}{n}\sum_{i=1}^n X_i^2. \]

(b) The log-likelihood function of a sample $X_1, X_2, \ldots, X_n$ is
\[ l(\sigma) = \log \prod_{i=1}^n \frac{1}{2\sigma} \exp\left( -\frac{|X_i|}{\sigma} \right) = \sum_{i=1}^n \left( -\log 2 - \log\sigma - \frac{|X_i|}{\sigma} \right) = -n\log 2 - n\log\sigma - \frac{\sum_{i=1}^n |X_i|}{\sigma}. \]
Since
\[ \frac{d}{d\sigma} l(\sigma) = -\frac{n}{\sigma} + \frac{\sum_{i=1}^n |X_i|}{\sigma^2}, \qquad \frac{d^2}{d\sigma^2} l(\sigma) = \frac{n}{\sigma^2} - \frac{2\sum_{i=1}^n |X_i|}{\sigma^3}, \]
we have $l'\!\left( \tfrac{1}{n}\sum_{i=1}^n |X_i| \right) = 0$ and $l''\!\left( \tfrac{1}{n}\sum_{i=1}^n |X_i| \right) < 0$. Therefore,
\[ \max_{\sigma > 0} l(\sigma) = l\!\left( \frac{1}{n}\sum_{i=1}^n |X_i| \right). \]
Thus in this case, the MLE is $\tilde{\sigma} = \tfrac{1}{n}\sum_{i=1}^n |X_i|$.

Remark: if we instead matched the moment $\nu = E[|X|] = \sigma$, the resulting moment estimate would be $\tfrac{1}{n}\sum_{i=1}^n |X_i|$, which coincides with the MLE.

5. Consider an i.i.d. sample of random variables with density function
\[ f(x \mid \theta) = \begin{cases} 1, & \theta \le x \le \theta + 1, \\ 0, & \text{otherwise}. \end{cases} \]
(a) Find the method of moments estimate of $\theta$.
(b) Find the maximum likelihood estimate of $\theta$.

(a) In general, let $X_1, X_2, \ldots, X_n$ be a random sample drawn from this distribution. Since
\[ \mu_1 = E[X] = \int_{\theta}^{\theta+1} x \cdot 1\,dx = \left[ \frac{x^2}{2} \right]_{\theta}^{\theta+1} = \frac{1}{2} + \theta, \]
we have $\theta = \mu_1 - \tfrac{1}{2}$, and the MME for $\theta$ is
\[ \hat{\theta} = \bar{X} - \frac{1}{2}, \quad \text{where } \bar{X} = \tfrac{1}{n}\sum_{i=1}^n X_i. \]

(b) The likelihood function of a sample $X_1, X_2, \ldots, X_n$ is
\[ \mathrm{lik}(\theta) = \prod_{i=1}^n \mathbf{1}\{\theta \le X_i \le \theta + 1\} = \mathbf{1}\left\{ \theta \le \min_{1 \le i \le n} X_i \text{ and } \theta + 1 \ge \max_{1 \le i \le n} X_i \right\}. \]
(The logic is that if $\theta > \min X_i$, then there exists some $i_0$ such that $\theta > X_{i_0}$, so $\mathbf{1}\{\theta \le X_{i_0} \le \theta + 1\} = 0$. A similar argument applies when $\theta + 1 < \max X_i$.) For any estimator $\tilde{\theta}$ satisfying
\[ \max_{1 \le i \le n} X_i - 1 \le \tilde{\theta} \le \min_{1 \le i \le n} X_i, \]
the likelihood function equals $\mathrm{lik}(\tilde{\theta}) = 1$, so every such $\tilde{\theta}$ is an MLE. In particular, $\min_{1 \le i \le n} X_i$ and $\max_{1 \le i \le n} X_i - 1$ are two MLEs for $\theta$.

6. [§8-21] Suppose that $X_1, X_2, \ldots, X_n$ are i.i.d. with density function
\[ f(x \mid \theta) = \begin{cases} e^{-(x - \theta)}, & x \ge \theta, \\ 0, & \text{otherwise}. \end{cases} \]
(a) Find the method of moments estimate of $\theta$.
(b) Find the MLE of $\theta$. (Hint: Be careful, and don't differentiate before thinking. For what values of $\theta$ is the likelihood positive?)

(a) In general, let $X_1, X_2, \ldots, X_n$ be a random sample drawn from this distribution. Since, by integration by parts,
\[ \mu_1 = E[X] = \int_{\theta}^{\infty} x\, e^{-(x-\theta)}\,dx = e^{\theta} \int_{\theta}^{\infty} x e^{-x}\,dx = e^{\theta} \left( \left[ -x e^{-x} \right]_{\theta}^{\infty} + \int_{\theta}^{\infty} e^{-x}\,dx \right) = e^{\theta} \left( \theta e^{-\theta} + e^{-\theta} \right) = \theta + 1, \]
we have $\theta = \mu_1 - 1$, and the MME for $\theta$ is
\[ \hat{\theta} = \bar{X} - 1, \quad \text{where } \bar{X} = \tfrac{1}{n}\sum_{i=1}^n X_i. \]

(b) The likelihood function of a sample $X_1, X_2, \ldots, X_n$ is
\[ \mathrm{lik}(\theta) = \prod_{i=1}^n \mathbf{1}\{\theta \le X_i\}\, e^{-(X_i - \theta)} = \mathbf{1}\left\{ \theta \le \min_{1 \le i \le n} X_i \right\} e^{-n\bar{X} + n\theta}. \]
Since $e^{-n\bar{X} + n\theta}$ is an increasing function of $\theta$ and $\mathbf{1}\{\theta \le \min_{1 \le i \le n} X_i\} = 0$ when $\theta > \min_{1 \le i \le n} X_i$, the MLE of $\theta$ is $\tilde{\theta} = \min_{1 \le i \le n} X_i$.
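The closed-form answers in Problems 1 and 2 can be spot-checked numerically. A minimal sketch, assuming `numpy` and `scipy` are available (the variable names are illustrative only):

```python
import numpy as np
from scipy.stats import t

# Problem 1: tau_1 = tau_2 is the 0.95 quantile of the t distribution with 7 df.
tau = t.ppf(0.95, df=7)
print(round(tau, 3))  # 1.895, matching the table value used above

# Problem 2: MME and MLE of theta from the observed sample.
data = np.array([3, 0, 2, 1, 3, 2, 1, 0, 2, 1])
theta_mme = 7/6 - data.mean()/2  # closed form: 7/6 - (1/2) * X-bar = 5/12

# Grid search over the log-likelihood 5*log(theta) + 5*log(1 - theta)
# (the constant term does not affect the argmax).
grid = np.linspace(0.001, 0.999, 9991)
loglik = 5*np.log(grid) + 5*np.log(1 - grid)
theta_mle = grid[np.argmax(loglik)]
print(round(theta_mme, 4), round(theta_mle, 3))  # 0.4167 0.5
```

The grid search confirms that the likelihood peaks at $\theta = 1/2$, in agreement with setting $l'(\theta) = 0$ analytically.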
View Full Document
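The two estimators derived in Problem 6 can also be compared by simulation. A sketch under assumed parameter values ($\theta = 2$, $n = 10{,}000$; the seed and names are arbitrary):

```python
import numpy as np

# Draw an i.i.d. sample from the shifted exponential
# f(x | theta) = exp(-(x - theta)) for x >= theta.
rng = np.random.default_rng(0)
theta, n = 2.0, 10_000
x = theta + rng.exponential(scale=1.0, size=n)

theta_mme = x.mean() - 1  # method of moments: X-bar - 1
theta_mle = x.min()       # maximum likelihood: min X_i
print(round(theta_mme, 2), round(theta_mle, 4))
```

Note that the MLE always satisfies $\tilde{\theta} \ge \theta$, and its excess over $\theta$ is the minimum of $n$ standard exponentials, which is of order $1/n$; the MME fluctuates around $\theta$ at the slower rate $1/\sqrt{n}$.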