Massachusetts Institute of Technology
Department of Electrical Engineering & Computer Science
6.041/6.431: Probabilistic Systems Analysis (Fall 2010)

Recitation 20: November 18, 2010

1. In your summer internship, you are working for the world's largest producer of lightbulbs. Your manager asks you to estimate the quality of production, that is, to estimate the probability p that a bulb produced by the factory is defectless. You are told to assume that all lightbulbs have the same probability of having a defect, and that defects in different lightbulbs are independent.

   (a) Suppose that you test n randomly picked bulbs. What is a good estimate Zn for p, such that Zn converges to p in probability?
   (b) If you test 50 lightbulbs, what is the probability that your estimate is in the range p ± 0.1?
   (c) Management asks that your estimate fall in the range p ± 0.1 with probability 0.95. How many lightbulbs do you need to test to meet this specification?

2. [Figure: PMFs pXn(x) and pYn(y). Xn takes the value 0 with probability 1 − 1/n and the value 1 with probability 1/n; Yn takes the value 0 with probability 1 − 1/n and the value n with probability 1/n.]

   Let Xn and Yn have the distributions shown above.

   (a) Find the expected value and variance of Xn and Yn.
   (b) What does the Chebyshev inequality tell us about the convergence of Xn? Of Yn?
   (c) Is Yn convergent in probability? If so, to what value?
   (d) If a sequence of random variables converges in probability to a, does the corresponding sequence of expected values converge to a? Prove this, or give a counterexample.

   A sequence of random variables Xn is said to converge to a number c in the mean square if lim_{n→∞} E[(Xn − c)^2] = 0.

   (e) Use Markov's inequality to show that convergence in the mean square implies convergence in probability.
   (f) Give an example showing that convergence in probability does not imply convergence in the mean square.

3. Random variable X is uniformly distributed between −1.0 and 1.0. Let X1, X2, . . . be independent, identically distributed random variables with the same distribution as X. Determine which, if any, of the following sequences (all with i = 1, 2, . . .) are convergent in probability. Give reasons for your answers, and include the limits if they exist.

   (a) Xi
   (b) Yi = Xi / i
   (c) Zi = (Xi)^i

MIT OpenCourseWare (http://ocw.mit.edu)
6.041 / 6.431 Probabilistic Systems Analysis and Applied Probability, Fall 2010
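
A minimal simulation sketch for Problem 1 (not from the recitation solutions), assuming Python with NumPy and a hypothetical true value p = 0.9, which the problem leaves unspecified. It takes Zn to be the fraction of defectless bulbs among the n tested (the sample mean) and compares the empirical probability that Z50 lands in p ± 0.1 with the Chebyshev bound obtained from the worst-case variance p(1 − p) ≤ 1/4.

import numpy as np

rng = np.random.default_rng(0)

p = 0.9            # hypothetical true probability of a defectless bulb (not given in the problem)
n = 50
trials = 100_000

# Z_n = (number of defectless bulbs among the n tested) / n, i.e. the sample mean.
z = rng.binomial(n, p, size=trials) / n

# Empirical probability that the estimate lands in p +/- 0.1.
print("empirical P(|Z_50 - p| <= 0.1):", np.mean(np.abs(z - p) <= 0.1))

# Chebyshev: P(|Z_n - p| >= 0.1) <= var(Z_n) / 0.1^2 = p(1-p) / (n * 0.01) <= 1 / (4 * n * 0.01).
print("Chebyshev bound on the miss probability:", 1 / (4 * n * 0.01))

And a sketch for Problem 2, using the PMF of Yn as read off the figure above (0 with probability 1 − 1/n, n with probability 1/n): as n grows, P(Yn ≠ 0) tends to 0, so Yn converges to 0 in probability, while E[Yn] stays at 1.

import numpy as np

rng = np.random.default_rng(1)

# Y_n takes the value n with probability 1/n and 0 otherwise (PMF from the figure above).
for n in (10, 100, 1_000, 10_000):
    y = np.where(rng.random(200_000) < 1 / n, n, 0)
    print(f"n={n:6d}  P(Y_n != 0) ~ {np.mean(y != 0):.5f}  E[Y_n] ~ {y.mean():.3f}")

For large n the empirical mean becomes noisy because the variance of Yn grows like n, which is exactly why convergence in probability does not force the expected values to converge to the same limit (parts (d) and (f)).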