UCI ICS 171 - Dealing With Uncertainty


Outline: History; Concerns; Basic Idea; Probability Models: Basic Questions; Discrete Probability Model; Random Variable; Cross-Product RV; Discrete Probability Distribution; From Model to Prediction: Use Math or Simulation; Learning Model: Hill Climbing; Mixture Model; Continuous Probability; Joint Probability: Full Knowledge; Marginalization; Marginalization Example; Conditional Probability; Formula; Conditional Example; Exact and Simulated; Note: Joint Yields Everything; Simulation; Consequences of Bayes Rule; Extensions of P(A) + P(~A) = 1; Bayes Rule Example; Bayes Rule: Multiple Symptoms; Notation: max arg; Idiot or Naïve Bayes: First Learning Algorithm; Chain Rule and Markov Models; Markov Models; Common DNA Application

Dealing With Uncertainty
P(X|E)
Probability theory: the foundation of statistics.
Chapter 13

History
•Games of chance: 300 BC
•1565: first formalizations
•1654: Fermat & Pascal, conditional probability
•Reverend Bayes: 1750s
•1933: Kolmogorov: axiomatic approach
•Objectivists vs. subjectivists (frequentists vs. Bayesians)
–Frequentists build one model.
–Bayesians use all possible models, with priors.

Concerns
•Future: what is the likelihood that a student will get a CS job, given his grades?
•Current: what is the likelihood that a person has cancer, given his symptoms?
•Past: what is the likelihood that Marilyn Monroe committed suicide?
•Combining evidence.
•Always: representation & inference.

Basic Idea
•Attach degrees of belief to propositions.
•Theorem: probability theory is the best way to do this.
–If someone does it differently, you can play a game with him and win his money.
•Unlike logic, probability theory is non-monotonic.
•Additional evidence can lower or raise belief in a proposition.

Probability Models: Basic Questions
•What are they?
–Analogous to constraint models, with probabilities on each table entry.
•How can we use them to make inferences?
–Probability theory.
•How does new evidence change inferences?
–The non-monotonicity problem is solved.
•How can we acquire them?
–Experts for model structure, hill climbing for parameters.

Discrete Probability Model
•Set of random variables V1, V2, ..., Vn.
•Each RV has a discrete set of values.
•Joint probability known or computable.
•For all vi in domain(Vi), Prob(V1=v1, V2=v2, ..., Vn=vn) is known, non-negative, and sums to 1.

Random Variable
•Intuition: a variable whose values belong to a known set of values, the domain.
•Math: a non-negative function on a domain (called the sample space) whose sum is 1.
•Boolean RV: John has a cavity.
–cavity domain = {true, false}
•Discrete RV: weather condition.
–wc domain = {snowy, rainy, cloudy, sunny}
•Continuous RV: John's height.
–john's height domain = {positive real numbers}

Cross-Product RV
•If X is an RV with values x1, ..., xn and Y is an RV with values y1, ..., ym, then Z = X x Y is an RV with n*m values <x1,y1>, ..., <xn,ym>.
•This will be very useful!
•This does not mean P(X,Y) = P(X)*P(Y).

Discrete Probability Distribution
•If a discrete RV X has values v1, ..., vn, then a probability distribution for X is a non-negative real-valued function p such that sum p(vi) = 1.
•This is just a (normalized) histogram.
•Example: a coin is flipped 10 times and heads occurs 6 times.
•What is the best probability model to predict this result?
•Biased coin model: prob(head) = .6, trials = 10.

From Model to Prediction: Use Math or Simulation
•Math: X = number of heads in 10 flips.
–P(X = 0) = .4^10
–P(X = 1) = 10 * .6 * .4^9
–P(X = 2) = Comb(10,2) * .6^2 * .4^8, etc.,
–where Comb(n,m) = n! / ((n-m)! * m!).
•Simulation: do many times: flip a coin (p = .6) 10 times, record heads.
•Math is exact, but sometimes too hard.
•Simulation is inexact and expensive, but doable.

p = .6:
Heads  Exact  10 trials  100 trials  1000 trials
0      .0001  .0         .0          .0
1      .001   .0         .0          .002
2      .010   .0         .01         .011
3      .042   .0         .04         .042
4      .111   .2         .05         .117
5      .200   .1         .24         .200
6      .250   .6         .22         .246
7      .214   .1         .16         .231
8      .120   .0         .18         .108
9      .040   .0         .09         .035
10     .005   .0         .01         .008

p = .5:
Heads  Exact  10 trials  100 trials  1000 trials
0      .0009  .0         .0          .002
1      .009   .0         .01         .011
2      .043   .0         .07         .044
3      .117   .1         .13         .101
4      .205   .2         .24         .231
5      .246   .0         .28         .218
6      .205   .3         .15         .224
7      .117   .3         .08         .118
8      .043   .1         .04         .046
9      .009   .0         .0          .009
10     .0009  .0         .0          .001

Learning Model: Hill Climbing
•Theoretically it can be shown that p = .6 is the best model.
•Without theory, pick a random p value and simulate. Then try a larger and a smaller p value.
•Maximize P(Data|Model): keep the model that gives the highest probability to the data.
•This approach extends to more complicated models (more variables, more parameters).

Another Data Set: What's Going On?
Heads  Frequency
0      .34
1      .38
2      .19
3      .05
4      .01
5      .02
6      .08
7      .20
8      .30
9      .26
10     .1

Mixture Model
•Data generated from two simple models:
–coin 1: prob of heads = .8
–coin 2: prob of heads = .1
•With prob .5, pick coin 1 or coin 2 and flip.
•The model has more parameters.
•Experts are supposed to supply the model structure.
•Use the data to estimate the parameters.

Continuous Probability
•If an RV X has values in R, then a probability distribution for X is a non-negative real-valued function p such that the integral of p over R is 1 (called a probability density function).
•Standard distributions are the uniform, normal (Gaussian), Poisson, etc.
•May resort to an empirical distribution if one can't be computed analytically, i.e. use a histogram.

Joint Probability: Full Knowledge
•If X and Y are discrete RVs, then the probability distribution for X x Y is called the joint probability distribution.
•Let x be in the domain of X, y in the domain of Y.
•If P(X=x, Y=y) = P(X=x) * P(Y=y) for every x and y, then X and Y are independent.
•Standard shorthand: P(X,Y) = P(X)*P(Y), which means exactly the statement above.

Marginalization
•Given the joint probability for X and Y, you can compute everything.
•Joint probability to individual probabilities:
–P(X=x) is the sum of P(X=x, Y=y) over all y.
•Conditioning is similar:
–P(X=x) = sum of P(X=x | Y=y) * P(Y=y) over all y.

Marginalization Example
•Compute Prob(X is healthy) from:
–P(X healthy & X tests positive) = .1
–P(X healthy & X tests negative) = .8
•P(X healthy) = .1 + .8 = .9
•P(flush) = P(heart flush) + P(spade flush) + P(diamond flush) + P(club flush)

Conditional Probability
•P(X=x | Y=y) = P(X=x, Y=y) / P(Y=y)
•Intuition: use simple examples.
•One-card hand: X = value of the card, Y = suit of the card.
–P(X=ace | Y=heart) = 1/13
–Also: P(X=ace, Y=heart) = 1/52 and P(Y=heart) = 1/4,
–so P(X=ace, Y=heart) / P(Y=heart) = (1/52) / (1/4) = 1/13.

Formula
•Shorthand: P(X|Y) = P(X,Y) / P(Y)
•Product rule: P(X,Y) = P(X|Y) * P(Y)
•Bayes rule: P(X|Y) = P(Y|X) * P(X) / P(Y)
•Remember the abbreviations.

Conditional Example
•P(A = 0) = .7
•P(A = 1) = .3
P(A,B) =
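The math-versus-simulation comparison for the biased coin can be sketched in code. A minimal Python sketch (function names are my own, not from the slides) that computes the exact binomial probabilities and also estimates them by repeated simulated flips, as in the tables above:

```python
import math
import random

def exact_binomial(n, p):
    # Exact P(X = k) for k = 0..n heads in n flips, P(heads) = p:
    # Comb(n, k) * p^k * (1-p)^(n-k)
    return [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def simulate_binomial(n, p, trials, rng=random.Random(0)):
    # Estimate the same distribution: flip the coin n times per trial,
    # record the number of heads, and normalize the counts.
    counts = [0] * (n + 1)
    for _ in range(trials):
        heads = sum(rng.random() < p for _ in range(n))
        counts[heads] += 1
    return [c / trials for c in counts]

exact = exact_binomial(10, 0.6)     # exact column for p = .6
approx = simulate_binomial(10, 0.6, 1000)   # like the 1000-trials column
```

With more trials the simulated histogram approaches the exact column, which is the point of the slide: math is exact but sometimes too hard, simulation is approximate but always available.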
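The hill-climbing idea (pick a p value, try a larger and a smaller one, keep whichever gives the data the highest probability) can be sketched as follows. This is an illustrative sketch, not the slides' code; it maximizes the log of P(Data|Model) for numerical stability, which maximizes the same quantity:

```python
import math

def log_likelihood(p, heads, flips):
    # log P(data | p) for `heads` successes in `flips` coin flips
    return heads * math.log(p) + (flips - heads) * math.log(1 - p)

def hill_climb(heads, flips, p=0.5, step=0.1):
    # Greedy hill climbing on p: move to a better neighbor if one exists,
    # otherwise shrink the step size and retry.
    def score(q):
        return log_likelihood(q, heads, flips) if 0 < q < 1 else float("-inf")
    while step > 1e-6:
        best = max((p, p + step, p - step), key=score)
        if best == p:
            step /= 2
        else:
            p = best
    return p

p_hat = hill_climb(heads=6, flips=10)   # converges near .6, the best model
</imports>
```

For 6 heads in 10 flips this settles at p = .6, matching the theoretical claim on the slide; the same loop works when the model has more parameters, which is why the slides propose it as the generic fitting method.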
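The mixture-model generation process described above (with prob .5 pick coin 1 with P(heads) = .8 or coin 2 with P(heads) = .1, then flip 10 times) can be sketched directly; the bimodal histogram it produces is exactly the "what's going on?" shape of the second data set:

```python
import random

def sample_mixture(flips=10, runs=10000, rng=random.Random(1)):
    # With prob .5 pick coin 1 (P(heads) = .8) or coin 2 (P(heads) = .1),
    # flip it `flips` times, and histogram the head counts.
    hist = [0] * (flips + 1)
    for _ in range(runs):
        p = 0.8 if rng.random() < 0.5 else 0.1
        heads = sum(rng.random() < p for _ in range(flips))
        hist[heads] += 1
    return [h / runs for h in hist]

freqs = sample_mixture()   # two peaks: near 1 head and near 8 heads
```

No single biased coin can produce two peaks, which is why the slide says experts supply the model structure (here: a two-component mixture) and the data is used only to estimate the parameters.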
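The marginalization and conditioning formulas can be checked against the health-test example. A small Python sketch, assuming a hypothetical completion of the joint table (the slides give only the healthy row, .1 and .8; the sick-row values .08 and .02 below are made up so the joint sums to 1):

```python
# Hypothetical joint distribution over X (health) and Y (test result).
# The "sick" row entries are assumptions, not from the slides.
joint = {
    ("healthy", "positive"): 0.10,
    ("healthy", "negative"): 0.80,
    ("sick",    "positive"): 0.08,
    ("sick",    "negative"): 0.02,
}

def marginal_x(x):
    # Marginalization: P(X = x) = sum over y of P(X = x, Y = y)
    return sum(p for (xv, _), p in joint.items() if xv == x)

def conditional(x, y):
    # Conditional probability: P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)
    p_y = sum(p for (_, yv), p in joint.items() if yv == y)
    return joint[(x, y)] / p_y

p_healthy = marginal_x("healthy")   # .1 + .8 = .9, as in the example
```

This also illustrates the "joint yields everything" point: once the full joint table is known, every marginal and every conditional follows by summation and division.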

