Contents
  Nutshell
  A Simple Discrete Example
    Likelihood; Prior; Joint; Posterior; Inference; Prediction
  A Fairly Simple Continuous Example
    Likelihood; Prior; Posterior; The Qualitative Nature of Bayes; Prediction; Summary
  Bayes Computation
    Computation for Non-Specialists; WinBUGS Examples 1 through 5

Stat 511 Bayes Overview
(Why You Must Take Stat 544 Before You Are Done)
Prof. Stephen Vardeman, ISU Statistics and IMSE
April 2008

In a Nutshell

Bayesian statistics:
- Replaces a family of models F_θ with a single probability model, by treating the F_θ as conditional distributions of the data given θ and placing a "prior" probability distribution on θ (leading to a joint distribution for data and parameter).
- Replaces inference based on a likelihood with inference based on a "posterior" (the conditional distribution of the parameter given the data); ultimately L(θ) is replaced by L(θ)g(θ).
- In its modern incarnation, is implemented primarily through simulation from the posterior instead of through pencil-and-paper Stat 542 calculus with joint and conditional distributions.

A Simple Discrete Example (Likelihood/Data Model)

Suppose that a random quantity X with possible values 1, 2, 3, and 4 has a distribution that depends upon some parameter θ with possible values 1, 2, and 3. We'll suppose the three possible distributions of X (appropriate under the three different values of θ) are specified by the probability mass functions f(x|θ) given in Table 1.

Table 1  Three Possible Distributions of X

  x   f(x|1)   f(x|2)   f(x|3)
  1    .4       .25      .1
  2    .3       .25      .2
  3    .2       .25      .3
  4    .1       .25      .4

("Big" parameter values will tend to produce "big" values of X.)

A Simple Discrete Example (Prior)

A "Bayes" approach to making inferences about θ based on an observation X = x requires specification of a "prior" probability distribution for the unknown parameter θ. For the sake of illustration, consider the probability mass function g(θ) given in Table 2.

Table 2  A "Prior" Distribution for θ

  θ   g(θ)
  1    .5
  2    .3
  3    .2
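As a quick numerical companion to Tables 1 and 2 (not part of the original slides), the following short Python/NumPy sketch encodes the likelihood and prior as arrays; the names f, g, and x_values are purely illustrative.

import numpy as np

# Table 1: likelihood f(x | theta); rows index x = 1,...,4, columns index theta = 1, 2, 3
f = np.array([[0.40, 0.25, 0.10],
              [0.30, 0.25, 0.20],
              [0.20, 0.25, 0.30],
              [0.10, 0.25, 0.40]])

# Table 2: prior g(theta) on theta = 1, 2, 3
g = np.array([0.5, 0.3, 0.2])

# sanity checks: each column of f, and the vector g, is a probability mass function
assert np.allclose(f.sum(axis=0), 1.0)
assert np.isclose(g.sum(), 1.0)

# "big" theta tends to produce "big" X: the conditional means E[X | theta] increase in theta
x_values = np.arange(1, 5)
print(x_values @ f)          # -> 2.0, 2.5, 3.0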
A Simple Discrete Example (Joint)

A given prior distribution, together with the forms specified for the distribution of X given the value of θ (the "likelihood"), leads to a joint distribution for both X and θ with probability mass function g(θ, x) that can be represented in a two-way table, where any entry is obtained as

  g(θ, x) = f(x|θ) g(θ) = likelihood × prior

For example, using the prior distribution g(θ) in Table 2, one obtains the joint distribution g(θ, x) specified in Table 3.

A Simple Discrete Example (Joint cont.)

Table 3  Joint Distribution for X and θ Corresponding to g(θ)

          θ = 1                     θ = 2                      θ = 3
  x = 1   g(1, 1) = .4(.5) = .2     g(2, 1) = .25(.3) = .075   g(3, 1) = .1(.2) = .02    f(1) = .295
  x = 2   g(1, 2) = .3(.5) = .15    g(2, 2) = .25(.3) = .075   g(3, 2) = .2(.2) = .04    f(2) = .265
  x = 3   g(1, 3) = .2(.5) = .1     g(2, 3) = .25(.3) = .075   g(3, 3) = .3(.2) = .06    f(3) = .235
  x = 4   g(1, 4) = .1(.5) = .05    g(2, 4) = .25(.3) = .075   g(3, 4) = .4(.2) = .08    f(4) = .205
          g(1) = .5                 g(2) = .3                  g(3) = .2

A Simple Discrete Example (Posterior)

The crux of the Bayes paradigm is that a joint probability distribution for X and θ can be used not only to recover f(x|θ) and the marginal distribution of θ (the "likelihood" and the "prior distribution" that are multiplied together to get the joint distribution in the first place), but also to find conditional distributions for θ given possible values of X. In the context of Bayes analysis, these are called the posterior distributions of θ. For tables laid out as above, they are found by "dividing rows by row totals." We might use the notation g(θ|x) for a posterior distribution and note that, for a given x, its values are proportional to joint probabilities, i.e.

  g(θ|x) ∝ f(x|θ) g(θ)

i.e.

  posterior ∝ likelihood × prior

Take, for example, the situation of the simple discrete example.

A Simple Discrete Example (Posterior cont.)

The four possible posterior distributions of θ (given the observed value of X = x) are as in Table 4.

Table 4  Posterior Distributions

            θ = 1               θ = 2                θ = 3
  g(θ|1)    .2/.295 = .6780     .075/.295 = .2542    .02/.295 = .0678
  g(θ|2)    .15/.265 = .5660    .075/.265 = .2830    .04/.265 = .1509
  g(θ|3)    .4255               .3191                .2553
  g(θ|4)    .2439               .3659                .3902

A Simple Discrete Example (Inference)

The "Bayes paradigm" of inference is to base all formal inferences (plausibility statements) about θ on a posterior distribution of θ. For example, an analyst who adopts prior g and observes X = 3 may correctly say that there is (posterior) probability .2553 that θ = 3. Notice that this is a different concept than the non-Bayesian concept of "confidence."
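The "multiply, then divide rows by row totals" arithmetic behind Tables 3 and 4, and the posterior probability quoted above, can be reproduced in a few lines. A minimal Python/NumPy sketch (the arrays f and g are the illustrative encodings of Tables 1 and 2 used earlier):

import numpy as np

# Table 1 (likelihood) and Table 2 (prior), as in the earlier sketch
f = np.array([[0.40, 0.25, 0.10],
              [0.30, 0.25, 0.20],
              [0.20, 0.25, 0.30],
              [0.10, 0.25, 0.40]])
g = np.array([0.5, 0.3, 0.2])

# Table 3: joint pmf g(theta, x) = f(x | theta) * g(theta); rows index x, columns index theta
joint = f * g                       # broadcasting multiplies each column of f by the matching g(theta)
f_marginal = joint.sum(axis=1)      # row totals, i.e. the marginal f(x): [.295, .265, .235, .205]

# Table 4: posterior g(theta | x), obtained by dividing each row by its row total
posterior = joint / f_marginal[:, None]
print(np.round(posterior, 4))

# the inference statement above: P(theta = 3 | X = 3) = .2553
print(round(posterior[2, 2], 4))    # arrays are 0-indexed, so entry [2, 2] is x = 3, theta = 3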
A Simple Discrete Example (Prediction)

In many real contexts, one is doing inference based on data X = x for the purpose of predicting the value of an as yet unobserved variable, X_new. If (given the value of θ) one is willing to model X and X_new as independent variables, one can extend the Bayes paradigm beyond inference for θ to the prediction problem. That is, conditioned on having observed X = x, one has a posterior distribution for both θ and X_new with a (joint) probability mass function that can be represented in a two-way table, where each entry has the form

  g(θ, x_new | x) = f(x_new|θ) g(θ|x)

This can be added across values of θ to produce a posterior predictive distribution for X_new as

  g(x_new | x) = ∑_θ g(θ, x_new | x) = ∑_θ f(x_new|θ) g(θ|x)

A Simple Discrete Example (Prediction cont.)

That is, the posterior predictive distribution of X_new is what one gets upon weighting the three possible distributions for X_new in Table 1 according to the posterior probabilities in Table 4. Considering first the possibility that X = 1, note that the conditional distribution for X_new given this outcome is

  x_new   g(x_new | 1)
  1       .4(.6780) + .25(.2542) + .1(.0678) = .34153
  2       .3(.6780) + .25(.2542) + .2(.0678) = .28051
  3       .2(.6780) + .25(.2542) + .3(.0678) = .21949
  4       .1(.6780) + .25(.2542) + .4(.0678) = .15847
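As a last numerical check (again not from the original slides), the posterior predictive weights above follow from a single matrix product. A minimal Python/NumPy sketch with the same illustrative arrays:

import numpy as np

# likelihood (Table 1) and prior (Table 2), as before
f = np.array([[0.40, 0.25, 0.10],
              [0.30, 0.25, 0.20],
              [0.20, 0.25, 0.30],
              [0.10, 0.25, 0.40]])
g = np.array([0.5, 0.3, 0.2])

joint = f * g
posterior = joint / joint.sum(axis=1, keepdims=True)   # g(theta | x), one row per observed x

# posterior predictive: g(x_new | x) = sum over theta of f(x_new | theta) * g(theta | x);
# entry [x, x_new] of the matrix below
predictive = posterior @ f.T
print(np.round(predictive[0], 5))    # given X = 1: [0.34153 0.28051 0.21949 0.15847]
print(predictive.sum(axis=1))        # each posterior predictive distribution sums to 1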

