Pitt CS 2710 - Probability

Review
- Probability
- Random Variables
- Joint and Marginal Distributions
- Conditional Distribution
- Product Rule, Chain Rule, Bayes' Rule
- Inference
- Independence

Random Variables
- A random variable is some aspect of the world about which we (may) have uncertainty:
  - R = Is it raining?
  - D = How long will it take to drive to work?
  - L = Where am I?
- We denote random variables with capital letters.
- Like variables in a CSP, random variables have domains:
  - R in {true, false} (sometimes written as {+r, -r})
  - D in [0, ∞)
  - L in possible locations, maybe {(0,0), (0,1), ...}

Probability Distributions
- Unobserved random variables have distributions.
- A distribution is a TABLE of probabilities of values; a probability (a lower-case value) is a single number, e.g. P(W = rain) = 0.1.
- Must have: 0 <= P(x) <= 1 for every value x, and Σ_x P(x) = 1.

  P(T):
  T     P
  warm  0.5
  cold  0.5

  P(W):
  W       P
  sun     0.6
  rain    0.1
  fog     0.3
  meteor  0.0

Joint Distributions
- A joint distribution over a set of random variables X_1, ..., X_n specifies a real number for each assignment (or outcome): P(X_1 = x_1, ..., X_n = x_n).
- Must obey: P(x_1, ..., x_n) >= 0, and the probabilities over all assignments sum to 1.
- For all but the smallest distributions, the full table is impractical to write out.

  P(T, W):
  T     W     P
  hot   sun   0.4
  hot   rain  0.1
  cold  sun   0.2
  cold  rain  0.3

Probabilistic Models
- A probabilistic model is a joint distribution over a set of random variables.
- Probabilistic models:
  - (Random) variables with domains; assignments are called outcomes
  - Joint distributions: say whether assignments (outcomes) are likely
  - Normalized: sum to 1.0
  - Ideally: only certain variables directly interact
- Compare constraint satisfaction problems:
  - Variables with domains
  - Constraints: state whether assignments are possible
  - Ideally: only certain variables directly interact
- The joint P(T, W) above is a distribution over T, W; the corresponding constraint over T, W only marks assignments as possible or not:

  T     W     OK?
  hot   sun   T
  hot   rain  F
  cold  sun   F
  cold  rain  T

Events
- An event is a set E of outcomes, with P(E) = Σ_{(x_1, ..., x_n) in E} P(x_1, ..., x_n).
- From a joint distribution, we can calculate the probability of any event. Using P(T, W) above:
  - Probability that it's hot AND sunny? 0.4
  - Probability that it's hot? 0.4 + 0.1 = 0.5
  - Probability that it's hot OR sunny? 0.4 + 0.1 + 0.2 = 0.7
- Typically, the events we care about are partial assignments, like P(T = hot).

Marginal Distributions
- Marginal distributions are sub-tables which eliminate variables.
- Marginalization (summing out): combine collapsed rows by adding, e.g. P(t) = Σ_w P(t, w).
- Marginalizing the joint P(T, W) above:

  P(T):
  T     P
  hot   0.5
  cold  0.5

  P(W):
  W     P
  sun   0.6
  rain  0.4

Conditional Probabilities
- A simple relation between joint and conditional probabilities; in fact, this is taken as the definition of a conditional probability:
  P(a | b) = P(a, b) / P(b)
- Example, from the joint above: P(W = sun | T = cold) = P(W = sun, T = cold) / P(T = cold) = 0.2 / 0.5 = 0.4.

Conditional Distributions
- Conditional distributions are probability distributions over some variables, given fixed values of others. From the joint P(T, W) above:

  P(W | T = hot):
  W     P
  sun   0.8
  rain  0.2

  P(W | T = cold):
  W     P
  sun   0.4
  rain  0.6

Normalization Trick
- A trick to get a whole conditional distribution at once:
  - Select the joint probabilities matching the evidence.
  - Normalize the selection (make it sum to one).
- Example: P(T | W = rain) from the joint above (a code sketch of the same trick follows this section):

  Select (entries with W = rain):
  T     W     P
  hot   rain  0.1
  cold  rain  0.3

  Normalize:
  T     P
  hot   0.25
  cold  0.75
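As a concrete illustration of the select-and-normalize trick, here is a minimal Python sketch. The dict keyed by (t, w) tuples and the function name condition_on_rain are illustrative choices, not anything from the notes.

```python
# Joint P(T, W) from the table above, keyed by (t, w) value pairs.
joint = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def condition_on_rain(joint):
    """Return P(T | W = rain) by selecting matching entries and normalizing."""
    # Select: keep only the entries consistent with the evidence W = rain.
    selected = {t: p for (t, w), p in joint.items() if w == "rain"}
    # Normalize: divide by the total so the selection sums to one.
    total = sum(selected.values())
    return {t: p / total for t, p in selected.items()}

print(condition_on_rain(joint))  # ~{'hot': 0.25, 'cold': 0.75}
```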
Probabilistic Inference
- Probabilistic inference: compute a desired probability from other known probabilities (e.g. a conditional from the joint).
- We generally compute conditional probabilities:
  - P(on time | no reported accidents) = 0.90
  - These represent the agent's beliefs given the evidence.
- Probabilities change with new evidence:
  - P(on time | no accidents, 5 a.m.) = 0.95
  - P(on time | no accidents, 5 a.m., raining) = 0.80
  - Observing new evidence causes beliefs to be updated.

Inference by Enumeration
- General case (together, these make up all the variables):
  - Evidence variables: E_1, ..., E_k, with observed values e_1, ..., e_k
  - Query variable: Q
  - Hidden variables: H_1, ..., H_r
- We want: P(Q | e_1, ..., e_k).
- First, select the entries consistent with the evidence.
- Second, sum out the hidden variables to get the joint of the query and the evidence:
  P(Q, e_1, ..., e_k) = Σ_{h_1, ..., h_r} P(Q, h_1, ..., h_r, e_1, ..., e_k)
- Finally, normalize the remaining entries to conditionalize.
- Obvious problems, for n variables with domain size d:
  - Worst-case time complexity O(d^n)
  - Space complexity O(d^n) to store the joint distribution
- A code sketch of this procedure appears after the Independence section below.

The Product Rule
- Sometimes we have conditional distributions but want the joint:
  P(x, y) = P(x | y) P(y)
- Example: combining P(W) with P(D | W) yields the joint P(D, W):

  P(W):
  W     P
  sun   0.8
  rain  0.2

  P(D | W):
  D     W     P
  wet   sun   0.1
  dry   sun   0.9
  wet   rain  0.7
  dry   rain  0.3

  P(D, W):
  D     W     P
  wet   sun   0.08
  dry   sun   0.72
  wet   rain  0.14
  dry   rain  0.06

The Chain Rule
- More generally, any joint distribution can always be written as an incremental product of conditional distributions:
  P(x_1, x_2, ..., x_n) = P(x_1) P(x_2 | x_1) P(x_3 | x_1, x_2) ... = Π_i P(x_i | x_1, ..., x_{i-1})

Bayes' Rule
- Two ways to factor a joint distribution over two variables:
  P(x, y) = P(x | y) P(y) = P(y | x) P(x)
- Dividing, we get:
  P(x | y) = P(y | x) P(x) / P(y)
- Why is this at all helpful?
  - It lets us build one conditional from its reverse.
  - Often one conditional is tricky but the other one is simple.
  - It is the foundation of many systems.
- In the running for most important AI equation!

Independence
- Two variables are independent in a joint distribution if P(x, y) = P(x) P(y) for all x, y.
- This says the joint distribution factors into a product of two simpler distributions.
- Usually variables aren't independent! But independence can be used as a simplifying modeling assumption.
- Empirical joint distributions are at best "close" to independent.
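The promised sketch of inference by enumeration, in Python. The joint representation and the function name infer are illustrative assumptions; with only the two variables T and W there happen to be no hidden variables, but the same accumulation step sums them out whenever they are present.

```python
from collections import defaultdict

# Joint P(T, W) from the table above, keyed by (t, w) value pairs.
joint = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
         ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def infer(joint, variables, query, evidence):
    """P(query | evidence): select consistent entries, sum out the rest, normalize."""
    summed = defaultdict(float)
    for assignment, p in joint.items():
        row = dict(zip(variables, assignment))
        # Select: skip entries inconsistent with the evidence.
        if all(row[var] == val for var, val in evidence.items()):
            # Sum out hidden variables by accumulating per query value.
            summed[row[query]] += p
    total = sum(summed.values())  # the normalizer is P(evidence)
    return {val: p / total for val, p in summed.items()}

print(infer(joint, ("T", "W"), "T", {"W": "sun"}))  # ~{'hot': 0.667, 'cold': 0.333}
```

Note that the loop touches every entry of the joint, which is exactly the O(d^n) worst case the notes warn about.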

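A similar sketch for Bayes' Rule, reversing a conditional using the P(W) and P(D | W) tables from the Product Rule section. The variable names are mine; the arithmetic simply normalizes P(D = wet | W) P(W) by the evidence P(D = wet).

```python
p_w = {"sun": 0.8, "rain": 0.2}            # prior P(W)
p_wet_given_w = {"sun": 0.1, "rain": 0.7}  # likelihood P(D = wet | W)

# Bayes' Rule: P(W | D = wet) = P(D = wet | W) P(W) / P(D = wet).
unnormalized = {w: p_wet_given_w[w] * p_w[w] for w in p_w}
p_wet = sum(unnormalized.values())         # evidence P(D = wet) = 0.22
posterior = {w: p / p_wet for w, p in unnormalized.items()}

print(posterior)  # ~{'sun': 0.364, 'rain': 0.636}
```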

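Finally, a small checker for the independence definition above: it compares every joint entry against the product of its marginals. The function name and tolerance are illustrative assumptions.

```python
from itertools import product

# Joint P(T, W) from the notes; it is NOT independent, since
# P(hot, sun) = 0.4 but P(hot) * P(sun) = 0.5 * 0.6 = 0.3.
joint = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
         ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def is_independent(joint, tol=1e-9):
    """True iff the two-variable joint factors as P(x, y) = P(x) P(y)."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    p_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}  # marginal P(X)
    p_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}  # marginal P(Y)
    return all(abs(joint[(x, y)] - p_x[x] * p_y[y]) <= tol
               for x, y in product(xs, ys))

print(is_independent(joint))  # False
```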