
CS 441 Discrete Mathematics for CS
Lecture 20: Probabilities
Milos Hauskrecht
[email protected]
Sennott Square

Probabilities
Three axioms of probability theory:
(1) The probability of a discrete outcome s satisfies 0 <= P(s) <= 1.
(2) The probabilities of all (disjoint) outcomes sum to 1.
(3) For any two events E1 and E2:
    P(E1 ∪ E2) = P(E1) + P(E2) - P(E1 ∩ E2)

Probability distribution
Definition: A function p: S → [0,1] satisfying the three conditions is called a probability distribution.
Example: a biased coin
• Probability of a head: 0.6; probability of a tail: 0.4
• Probability distribution:
  - Head → 0.6
  - Tail → 0.4
  (the probabilities sum to 1)
Note: a uniform distribution is a special distribution that assigns equal probability to each outcome.

Probability of an event
Definition: The probability of the event E is the sum of the probabilities of the outcomes in E.
• Note that no assumption is made here about the distribution.
Complement:
• P(~E) = 1 - P(E)

Complements
Let E and F be two events. Then:
• P(F) = P(F ∩ E) + P(F ∩ ~E)
(Every outcome in F lies either in E or in ~E; the sample-space diagram on the slide illustrates this split.)

Conditional probability
Definition: Let E and F be two events such that P(F) > 0. The conditional probability of E given F is
• P(E|F) = P(E ∩ F) / P(F)
Corollary: Let E and F be two events such that P(F) > 0. Then:
• P(E ∩ F) = P(E|F) * P(F)
This result is also referred to as the product rule.

Conditional probability
Product rule:
• P(E ∩ F) = P(E|F) P(F)
Example:
• Assume the probability of getting the flu is 0.2.
• Assume the probability of having a high fever given the flu is 0.9.
• What is the probability of getting the flu with a fever?
  P(flu ∧ fever) = P(fever | flu) P(flu) = 0.9 * 0.2 = 0.18
• When is this useful? Sometimes conditional probabilities are easier to estimate.
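The product-rule calculation above can be checked numerically; a minimal Python sketch (the variable names are illustrative, not from the slides):

```python
# Product rule: P(E ∩ F) = P(E|F) * P(F), checked with the flu/fever
# numbers from the slide (variable names are illustrative).
p_flu = 0.2               # P(flu)
p_fever_given_flu = 0.9   # P(fever | flu)

# Joint probability of having both the flu and a fever
p_flu_and_fever = p_fever_given_flu * p_flu
print(round(p_flu_and_fever, 2))  # 0.18
```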
Bayes theorem
Definition: Let E and F be two events such that P(F) > 0. Then:
• P(E|F) = P(F|E) P(E) / P(F)
         = P(F|E) P(E) / (P(F|E) P(E) + P(F|~E) P(~E))
Proof:
  P(E|F) = P(E ∩ F) / P(F) = P(F|E) P(E) / P(F)
  P(F) = P(F ∩ E) + P(F ∩ ~E) = P(F|E) P(E) + P(F|~E) P(~E)
Hence:
  P(E|F) = P(F|E) P(E) / (P(F|E) P(E) + P(F|~E) P(~E))
Idea: simply switch the conditioning events.

Bayes theorem
Example:
• Assume the probability of getting the flu is 0.2.
• Assume the probability of getting a fever is 0.3.
• Assume the probability of having a fever given the flu is 0.9.
• What is the probability of having the flu given a fever?
• P(flu | fever) = P(fever | flu) P(flu) / P(fever) = 0.9 * 0.2 / 0.3 = 0.18 / 0.3 = 0.6

Bayes theorem
Example (same as above, but different probabilities are given):
• Assume the probability of getting the flu is 0.2.
• Assume the probability of having a fever given the flu is 0.9.
• Assume the probability of having a fever given no flu is 0.15.
• What is the probability of having the flu given a fever?
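The first Bayes calculation can be reproduced with a short sketch (the helper name `bayes` is mine, assuming P(fever) is given directly, as in the example):

```python
# Bayes' theorem: P(E|F) = P(F|E) P(E) / P(F).
# Numbers come from the flu/fever example; the function name is illustrative.
def bayes(p_f_given_e: float, p_e: float, p_f: float) -> float:
    """Posterior P(E|F) when P(F) is known directly."""
    return p_f_given_e * p_e / p_f

# P(flu | fever) with P(fever|flu) = 0.9, P(flu) = 0.2, P(fever) = 0.3
p_flu_given_fever = bayes(0.9, 0.2, 0.3)
print(round(p_flu_given_fever, 2))  # 0.6
```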
Bayes theorem
Definition: Let E and F be two events such that P(F) > 0. Then:
• P(E|F) = P(F|E) P(E) / P(F)
         = P(F|E) P(E) / (P(F|E) P(E) + P(F|~E) P(~E))
Example:
• Assume the probability of getting the flu is 0.2.
• Assume the probability of having a fever given the flu is 0.9.
• Assume the probability of having a fever given no flu is 0.15.
• What is the probability of having the flu given a fever?
• P(flu | fever) = P(fever | flu) P(flu) / P(fever)
• P(fever) = P(fever | flu) P(flu) + P(fever | ~flu) P(~flu)
           = 0.9 * 0.2 + 0.15 * 0.8 = 0.3
• P(flu | fever) = 0.9 * 0.2 / 0.3 = 0.18 / 0.3 = 0.6

Independence
Definition: The two events E and F are said to be independent if:
• P(E ∩ F) = P(E) P(F)
Example: Let E denote the event that a family with three children has children of both sexes, and let F denote the event that the family has at most one boy. Are E and F independent?
• All combinations = {BBB, BBG, BGB, GBB, BGG, GBG, GGB, GGG}; number of elements = 8
• Both sexes: E = {BBG, BGB, GBB, BGG, GBG, GGB}; |E| = 6
• At most one boy: F = {GGG, GGB, GBG, BGG}; |F| = 4
• E ∩ F = {GGB, GBG, BGG}; |E ∩ F| = 3
• P(E ∩ F) = 3/8 and P(E) P(F) = (6/8)(4/8) = 3/8
• The two probabilities are equal → E and F are independent.

Independence
Example:
• Assume the probability of getting the flu is 0.2.
• Assume the probability of getting a fever is 0.3.
• Assume the probability of having a fever given the flu is 0.9.
• Are flu and fever independent?
• P(flu ∩ fever) = P(fever | flu) * P(flu) = 0.9 * 0.2 = 0.18
• P(flu) * P(fever) = 0.2 * 0.3 = 0.06
• Independent or not? 0.18 ≠ 0.06, so flu and fever are not independent.

Bernoulli trial
Assume:
• p = 0.6 is the probability of seeing a head
• 1 - p = 0.4 is the probability of seeing a tail
Assume we see a sequence of independent coin flips:
• HHHTTHTHHT
• The probability of seeing this sequence: 0.6^6 * 0.4^4
• What is the probability of seeing a sequence with 6 heads and 4 tails?
• The probability of each such sequence is 0.6^6 * 0.4^4
• How many such sequences are there? C(10,4)
• P(6H and 4T) = C(10,4) * 0.6^6 * 0.4^4
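The binomial computation from the Bernoulli-trial slide can be reproduced directly with Python's standard library (`math.comb` gives C(n, k)):

```python
from math import comb

p_head, p_tail = 0.6, 0.4

# Probability of one particular sequence with 6 heads and 4 tails
p_sequence = p_head ** 6 * p_tail ** 4

# Number of length-10 flip sequences with exactly 4 tails: C(10, 4)
n_sequences = comb(10, 4)

# Total probability of seeing 6 heads and 4 tails in 10 flips
p_6h_4t = n_sequences * p_sequence

print(n_sequences)        # 210
print(round(p_6h_4t, 4))  # 0.2508
```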
Random variables
• Definition: A random variable is a function from the sample space of an experiment to the set of real numbers, f: S → R. A random variable assigns a number to each possible outcome.
• The distribution of a random variable X on the sample space S is the set of pairs (r, p(X = r)) for every value r that X can take, where p(X = r) is the probability that X takes the value r.

Random variables
Example:
Let S be the set of outcomes of a two-dice roll, and let the random variable X denote the sum of the outcomes:
• (1,1) → 2
• (1,2) and (2,1) → 3
• (1,3), (3,1) and (2,2) → 4
• ...
Distribution of X:
• 2 → 1/36
• 3 → 2/36
• 4 → 3/36
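The two-dice distribution above can be enumerated directly; a minimal sketch, assuming two fair dice (36 equally likely outcomes):

```python
from collections import Counter
from fractions import Fraction

# Sample space S: all 36 equally likely outcomes of rolling two dice.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]

# Random variable X maps each outcome to the sum of the two dice;
# Counter tallies how many outcomes map to each value r.
counts = Counter(i + j for i, j in outcomes)

# Distribution of X: pairs (r, P(X = r)); Fraction keeps exact values
# (note it reduces 2/36 to 1/18 and 3/36 to 1/12).
dist = {r: Fraction(c, 36) for r, c in sorted(counts.items())}
print(dist[2], dist[3], dist[4])  # 1/36 1/18 1/12
```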

