3 PROBABILITY

In this section, we discuss elements of probability, as a prerequisite for studying random processes.

3.1 Events

Define an event space S that has in it a number of events Ai. If the set of possible events Ai covers the space completely, then we will always get one of the events when we take a sample. On the other hand, if some of the space S is not covered by an Ai, then it is possible that a sample is not classified as any of the events Ai. Events Ai may overlap in the event space, in which case they are composite events; a sample may then invoke multiple events. If the Ai do not overlap, they are simple events, and a sample brings only one event Ai, or none if the space S is not covered. In the drawing below, simple events cover the space on the left, and composite events cover the space on the right.

[Figure: the event space S drawn twice — non-overlapping simple events A1 through A5 on the left, overlapping composite events A1 through A5 on the right.]

Intuitively, the probability of an event is the fraction of positive outcomes out of the total number of outcomes. Assign to each event a probability, so that we have

    pi = p(Ai) ≥ 0,   p(S) = 1.

That is, each defined event Ai has a nonnegative probability of occurring, and the probability of getting a sample from the entire event space is one. Hence the probability has the interpretation of the area of the event Ai within S. It follows that the probability of Ai is exactly one minus the probability of Ai not occurring:

    p(Ai) = 1 − p(Āi).

Furthermore, if Ai and Aj are non-overlapping, then the probability of either Ai or Aj occurring is the sum of the separate probabilities:

    p(Ai ∪ Aj) = p(Ai) + p(Aj).

Similarly, if Ai and Aj do overlap, then the probability of either or both occurring is the sum of the separate probabilities minus the probability of both occurring:

    p(Ai ∪ Aj) = p(Ai) + p(Aj) − p(Ai ∩ Aj).

As a tangible example, consider a six-sided die. Here there are six events A1, A2, A3, A4, A5, A6, corresponding with the six possible values that can occur in a sample, and p(Ai) = 1/6 for all i. The event that the sample is an even number is M = A2 ∪ A4 ∪ A6, and this is a composite event.

3.2 Conditional Probability

If a composite event M is known to have occurred, a question arises as to the probability that one of the constituent simple events Aj occurred. This is written as p(Aj|M), read as "the probability of Aj, given M," and it is a conditional probability. The key concept here is that M replaces S as the event space, so that p(M) = 1. This has the natural effect of inflating the probabilities of events that are part of M, and in fact

    p(Aj|M) = p(Aj ∩ M) / p(M).

Referring to our die example above, if M is the event of an even result, then we have

    M = A2 ∪ A4 ∪ A6,
    p(M ∩ A2) = p(A2) = 1/6,
    p(M) = 1/2
    −→ p(A2|M) = (1/6) / (1/2) = 1/3.

Given that an even result was observed (composite event M), the probability that a two was rolled is 1/3.

Now if the Ai are mutually exclusive (simple) events that together cover S, and M is a composite event, then we can write a complementary rule, the total probability rule:

    p(M) = p(M|A1) p(A1) + ··· + p(M|An) p(An).

This relation collects conditional probabilities of M given each separate event Ai. Its logic is easily seen in a graph. Here is an example of how to use it in a practical problem. Box A has 2000 items in it, of which 5% are defective; box B has 500 items with 40% defective; boxes C and D each contain 1000 items with 10% defective. If a box is picked at random, and one item is taken from that box, what is the probability that it is defective? M is the composite event of a defective item, so we are after p(M). We apply the formula above to find

    p(M) = 0.05 × 0.25 + 0.40 × 0.25 + 0.10 × 0.25 + 0.10 × 0.25 = 0.1625.
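As a quick numerical check of these two calculations (a sketch added here for illustration, not part of the original notes; the Python code, the variable names, and the boxes dictionary layout are my own choices), the die result can be reproduced by direct counting over the event space, and the box result both analytically and by simulation:

```python
import random
from fractions import Fraction

# --- Die example: p(A2 | even) by direct counting over the event space S.
S = [1, 2, 3, 4, 5, 6]                    # six equally likely simple events
M = [x for x in S if x % 2 == 0]          # composite event: an even result
p_M = Fraction(len(M), len(S))            # 1/2
p_A2_and_M = Fraction(1, len(S))          # rolling a two lies inside M, so p(A2 ∩ M) = 1/6
print(p_A2_and_M / p_M)                   # 1/3

# --- Box example: total probability rule, analytic and simulated.
# Each entry is (number of items, fraction defective); each box is picked with probability 1/4.
boxes = {"A": (2000, 0.05), "B": (500, 0.40), "C": (1000, 0.10), "D": (1000, 0.10)}
p_defective = sum(0.25 * frac for _, frac in boxes.values())
print(p_defective)                        # 0.1625 (to within float rounding)

random.seed(0)
box_list = list(boxes.values())
N = 200_000
hits = 0
for _ in range(N):
    _, frac = random.choice(box_list)     # pick one of the four boxes uniformly
    hits += random.random() < frac        # draw one item; True if it is defective
print(hits / N)                           # close to 0.1625
```

Note that once a box has been chosen uniformly, the item counts do not affect the result; only the defect fractions enter the calculation.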
3.3 Bayes' Rule

Consider a composite event M and a simple event Ai. From the conditional probability above we have

    p(Ai|M) = p(Ai ∩ M) / p(M),
    p(M|Ai) = p(Ai ∩ M) / p(Ai),

and if we eliminate the common term p(Ai ∩ M) between these two, we find that

    p(M|Ai) = p(Ai|M) p(M) / p(Ai),
    p(Ai|M) = p(M|Ai) p(Ai) / p(M).

The second of these is most interesting: it gives the probability of a simple event, conditioned on the composite event, in terms of the composite event conditioned on the simple one! Recalling our formula above for p(M), we thus derive Bayes' rule:

    p(Ai|M) = p(M|Ai) p(Ai) / [ p(M|A1) p(A1) + ··· + p(M|An) p(An) ].

Here is an example of its use. Consider a medical test that is 99% accurate: it gives a negative result for people who do not have the disease 99% of the time, and it gives a positive result for people who do have the disease 99% of the time. Only one percent of the population has this disease. Joe just got a positive test result: what is the probability that he has the disease? Here M is the event that he has the disease, and the observed events are that he tested positive (+) or negative (−). We apply

    p(M|+) = p(+|M) p(M) / p(+)
           = p(+|M) p(M) / [ p(+|M) p(M) + p(+|M̄) p(M̄) ]
           = (0.99 × 0.01) / (0.99 × 0.01 + 0.01 × 0.99)
           = 1/2.

This example is not well appreciated by many healthcare consumers!

Here is another example, without so many symmetries. Box A has nine red pillows in it and one white; box B has six red pillows in it and nine white. Selecting a box at random and pulling out a pillow at random gives the result of a red pillow. What is the probability that it came from box A? M is the event that the pillow came from box A; the observed event is that a red pillow was drawn (R). We have

    p(M|R) = p(R|M) p(M) / p(R)
           = p(R|M) p(M) / [ p(R|M) p(M) + p(R|M̄) p(M̄) ]
           = (0.9 × 0.5) / (0.9 × 0.5 + 0.4 × 0.5)
           ≈ 0.692.

3.4 Random Variables

Now we assign to each event Ai in the sample space a numerical value: each Ai corresponds with an xi. For instance, a coin toss resulting in heads could be equated with a $1 reward, and each tails could trigger a $1 loss. Dollar figures could be assigned to each of the faces of a die. Hence we see that if each event Ai has a probability, then so will the numerical values xi. The average value of xi can of course be approximated by sampling the space N times, summing all the x's, and dividing by N. As N becomes bigger, this computation gives an increasingly accurate result. In terms of probabilities, the formula for the expected value is

    x̄ = E(x) = Σ_{i=1}^{n} p(Ai) xi.

The equivalence of this expected value with the numerical average is
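The two Bayes'-rule results above, and the agreement between E(x) and the sample average of repeated die rolls, can likewise be checked numerically. This is a sketch added for illustration; the helper name bayes_posterior and the use of Python's fractions and random modules are my own choices, not part of the notes.

```python
import random
from fractions import Fraction

def bayes_posterior(prior, likelihood, likelihood_given_not):
    """Two-hypothesis Bayes' rule: p(H|E) = p(E|H) p(H) / [p(E|H) p(H) + p(E|not H) p(not H)]."""
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Medical test: 1% prevalence, 99% true-positive rate, 1% false-positive rate.
print(bayes_posterior(Fraction(1, 100), Fraction(99, 100), Fraction(1, 100)))    # 1/2

# Pillows: boxes equally likely, p(red|A) = 9/10, p(red|B) = 6/15.
print(float(bayes_posterior(Fraction(1, 2), Fraction(9, 10), Fraction(6, 15))))  # ~0.692

# Expected value of a fair die, and the sample average for large N.
faces = [1, 2, 3, 4, 5, 6]
print(sum(Fraction(1, 6) * x for x in faces))            # 7/2, i.e. 3.5
random.seed(1)
N = 100_000
print(sum(random.choice(faces) for _ in range(N)) / N)   # close to 3.5
```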

