CS 70 Discrete Mathematics for CS        Spring 2008        David Wagner        Note 16

Conditional Probability

A pharmaceutical company is marketing a new test for a certain medical condition. According to clinical trials, the test has the following properties:

1. When applied to an affected person, the test comes up positive in 90% of cases, and negative in 10% (these are called "false negatives").

2. When applied to a healthy person, the test comes up negative in 80% of cases, and positive in 20% (these are called "false positives").

Suppose that the incidence of the condition in the US population is 5%. When a random person is tested and the test comes up positive, what is the probability that the person actually has the condition? (Note that this is presumably not the same as the simple probability that a random person has the condition, which is just 1/20.)

This is an example of a conditional probability: we are interested in the probability that a person has the condition (event A) given that he/she tests positive (event B). Let's write this as Pr[A|B].

How should we compute Pr[A|B]? Well, since event B is guaranteed to happen, we need to look not at the whole sample space Ω, but at the smaller sample space consisting only of the sample points in B. What should the probabilities of these sample points be? If they all simply inherit their probabilities from Ω, then the sum of these probabilities will be ∑_{ω∈B} Pr[ω] = Pr[B], which in general is less than 1. So we need to scale the probability of each sample point by 1/Pr[B].
In other words, for each sample point ω ∈ B, the new probability becomes

    Pr[ω|B] = Pr[ω]/Pr[B].

Now it is clear how to compute Pr[A|B]: namely, we just sum up these scaled probabilities over all sample points that lie in both A and B:

    Pr[A|B] = ∑_{ω∈A∩B} Pr[ω|B] = ∑_{ω∈A∩B} Pr[ω]/Pr[B] = Pr[A∩B]/Pr[B].

Definition (conditional probability): For events A, B in the same probability space, such that Pr[B] > 0, the conditional probability of A given B is

    Pr[A|B] = Pr[A∩B]/Pr[B].

Let's go back to our medical testing example. The sample space here consists of all people in the US. Let N denote the number of people in the US (so N ≈ 250 million). The population consists of four disjoint subsets:

    TP: the true positives (90% of N/20, i.e., 9N/200 of them);
    FP: the false positives (20% of 19N/20, i.e., 19N/100 of them);
    TN: the true negatives (80% of 19N/20, i.e., 76N/100 of them);
    FN: the false negatives (10% of N/20, i.e., N/200 of them).

We choose a person at random. Recall that A is the event that the person so chosen is affected, and B the event that he/she tests positive. Note that B is the union of the disjoint sets TP and FP, so

    |B| = |TP| + |FP| = 9N/200 + 19N/100 = 47N/200.

Thus we have Pr[A] = 1/20 and Pr[B] = 47/200.

Now when we condition on the event B, we focus in on the smaller sample space consisting only of those 47N/200 individuals who test positive. To compute Pr[A|B], we need to figure out Pr[A∩B] (the part of A that lies in B). But A∩B is just the set of people who are both affected and test positive, i.e., A∩B = TP. So we have

    Pr[A∩B] = |TP|/N = 9/200.

Finally, we conclude from the definition of conditional probability that

    Pr[A|B] = Pr[A∩B]/Pr[B] = (9/200)/(47/200) = 9/47 ≈ 0.19.

This seems bad: if a person tests positive, there's only about a 19% chance that he/she actually has the condition! This sounds worse than the original claims made by the pharmaceutical company, but in fact it's just another view of the same data.

[Incidentally, note that Pr[B|A] = (9/200)/(1/20) = 9/10; so Pr[A|B] and Pr[B|A] can be very different.
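As a sanity check, the posterior Pr[A|B] = 9/47 can be recomputed by direct counting. The sketch below uses a hypothetical population of N = 200 people (chosen purely for illustration, so that every subgroup count is a whole number); the test parameters are the ones from the problem statement.

```python
from fractions import Fraction

# Hypothetical population of N = 200 people with 5% incidence.
N = 200
affected = Fraction(1, 20) * N   # 10 people have the condition
healthy = N - affected           # 190 people do not

TP = Fraction(9, 10) * affected  # true positives: 90% of the affected
FP = Fraction(2, 10) * healthy   # false positives: 20% of the healthy

# Pr[A|B] = |TP| / |B|, where B = TP ∪ FP (a disjoint union).
posterior = TP / (TP + FP)
print(posterior)                 # prints 9/47
print(float(posterior))          # ≈ 0.19
```

Using exact rational arithmetic (rather than floats) makes the answer come out as the fraction 9/47 on the nose, matching the calculation above.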
Of course, Pr[B|A] is just the probability that a person tests positive given that he/she has the condition, which we knew from the start was 90%.]

To complete the picture, what's the (unconditional) probability that the test gives a correct result (positive or negative) when applied to a random person? Call this event C. Then

    Pr[C] = (|TP| + |TN|)/N = 9/200 + 76/100 = 161/200 ≈ 0.8.

So the test is about 80% effective overall, a more impressive statistic.

But how impressive is it? Suppose we ignore the test and just pronounce everybody to be healthy. Then we would be correct on 95% of the population (the healthy ones), and wrong on the affected 5%. In other words, this trivial test is 95% effective! So we might ask if it is worth running the test at all. What do you think?

Here are a couple more examples of conditional probabilities, based on some of our sample spaces from the previous lecture note.

1. Balls and bins. Suppose we toss m = 3 (labelled) balls into n = 3 bins; this is a uniform sample space with 3^3 = 27 points. We already know that the probability the first bin is empty is (1 − 1/3)^3 = (2/3)^3 = 8/27. What is the probability of this event given that the second bin is empty? Call these events A, B respectively. To compute Pr[A|B] we need to figure out Pr[A∩B]. But A∩B is the event that both the first two bins are empty, i.e., all three balls fall in the third bin. So Pr[A∩B] = 1/27 (why?). Therefore,

    Pr[A|B] = Pr[A∩B]/Pr[B] = (1/27)/(8/27) = 1/8.

Not surprisingly, 1/8 is quite a bit less than 8/27: knowing that bin 2 is empty makes it significantly less likely that bin 1 will be empty.

2. Dice. Roll two fair dice. Let A be the event that their sum is even, and B the event that the first die is even. By symmetry it's easy to see that Pr[A] = 1/2 and Pr[B] = 1/2. Moreover, a little counting gives us that Pr[A∩B] = 1/4. What is Pr[A|B]?
Well,

    Pr[A|B] = Pr[A∩B]/Pr[B] = (1/4)/(1/2) = 1/2.

In this case, Pr[A|B] = Pr[A], i.e., conditioning on B does not change the probability of A.

Independent events

Definition (independence): Two events A, B in the same probability space are independent if Pr[A∩B] = Pr[A] × Pr[B].

The intuition behind this definition is the following. Suppose that Pr[B] > 0 and A, B are independent. Then we have

    Pr[A|B] = Pr[A∩B]/Pr[B] = (Pr[A] × Pr[B])/Pr[B] = Pr[A].

Thus independence has the natural meaning that "the probability of A is not affected by whether or not B occurs." (By a symmetrical argument, we also have Pr[B|A] = Pr[B] provided Pr[A] > 0.) For events A, B such that Pr[B] > 0, the condition Pr[A|B] = Pr[A] is actually equivalent to the definition of independence.

Examples: In the balls and bins example above, events A, B are not independent. In the dice example, events A, B are independent.

The above definition generalizes to any finite set of events:

Definition (mutual independence): Events A_1, ..., A_n are mutually independent if for every subset I ⊆ {1, ..., n},

    Pr[⋂_{i∈I} A_i] = ∏_{i∈I} Pr[A_i].
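The dice example is small enough that independence can be verified by brute force: enumerate all 36 equally likely outcomes and compare Pr[A∩B] with Pr[A] × Pr[B]. A sketch (the helper `pr` and the lambda event definitions are ours, written to match the events in the example):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

def pr(event):
    """Probability of an event (a predicate on outcomes) under the uniform distribution."""
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

A = lambda w: (w[0] + w[1]) % 2 == 0   # the sum is even
B = lambda w: w[0] % 2 == 0            # the first die is even

print(pr(A), pr(B))                    # prints 1/2 1/2
print(pr(lambda w: A(w) and B(w)))     # prints 1/4
# Independence: Pr[A ∩ B] == Pr[A] × Pr[B]
print(pr(lambda w: A(w) and B(w)) == pr(A) * pr(B))  # prints True
```

The same `pr` helper applied to the balls-and-bins events (with 27 outcomes) would confirm that that pair is not independent, since 1/27 ≠ (8/27) × (8/27).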