Epistemology Probabilized

Richard Jeffrey

April 2, 2002

Here is a framework for judgment in terms of a continuum of "subjective" probabilities, a framework in which probabilistic judgments need not stand on a foundation of certainties. In place of propositional data bases, this "radical" probabilism ("probabilities all the way down to the roots") envisages full or partial probability assignments to probability spaces, together with protocols for revising those assignments and their interconnections in the light of fresh empirical or logico-mathematical input. This input need not be of the limiting 0-or-1 sort. Updating by ordinary conditioning is generalized (sec. 2.2) to probability kinematics, where an observation on a random variable X need not single out one value, but may prompt a new probability distribution over all values of X.

The effect of an observation itself, apart from the influence of prior probabilities (sec. 3), is given by the ("Bayes") factors

    new odds / old odds

by which the observer's odds between hypotheses are updated. We are not generally interested in adopting an observer's new odds as our own, for those are influenced by the observer's old odds, not ours. It is rather the observer's Bayes factors that we need in order to use that observation in our own judgments. An account of collaborative updating is presented in these terms.

Jon Dorling's Bayesian solution of the Duhem-Quine "holism" problem is sketched in sec. 4.

We finish with a brief look at the historical setting of radical probabilism (sec. 5), and an indication of how "real" probabilities can be accommodated in subjectivistic terms (sec. 6).

1 Judgmental ("Subjective") Probability

Your "subjective" probability is not something fetched out of the sky on a whim; it is your actual judgment, normally representing what you think your judgment should be, even if you do not regard it as a judgment that everyone must share on pain of being wrong in one sense or another.

1.1 Probabilities from statistics: Minimalism

Where do probabilistic judgments come from? Statistical data are a prime source; that is the truth in frequentism. But that truth must be understood in the light of certain features of judgmental probabilizing, e.g., the persistence, as you learn the relative frequency of truths in a sequence of propositions, of your judgment that they all have the same probability. That is an application of the following theorem of the probability calculus.[1]

Law of Little Numbers. In a finite sequence of propositions that you view as equiprobable, if you are sure that the relative frequency of truths is p, then your probability for each is p.

Then if, judging a sequence of propositions to be equiprobable, you learn the relative frequency of truths in a way that does not change your judgment of equiprobability, your probability for each proposition will agree with the relative frequency.[2]

The Law of Little Numbers can be generalized to random variables:

Law of Short Run Averages. In a finite sequence of random variables for which your expectations are equal, if

---
[1] See Jeffrey (1992), pp. 59-64. The name "Law of Little Numbers" is a joke, but I know of no generally understood name for the theorem. That theorem, like the next (the "Law of Short Run Averages", another joke) is quite trivial; both are immediate consequences of the linearity of the expectation operator. Chapter 2 of de Finetti (1937) is devoted to them. In chapter 3 he goes on to a mathematically deeper way of understanding the truth in frequentism, in terms of "exchangeability" of random variables (sec.
1.2, below).

[2] To appreciate the importance of the italicized caveat, note that if you learn the relative frequency of truths by learning which propositions in the sequence are true, and which false, then those probabilities will be zeros and ones instead of averages of those zeros and ones.
---

you know only their arithmetical mean, then that is your expectation of each.

Then if, while requiring your final expectations for a sequence of magnitudes to be equal, you learn their mean value in a way that does not lead you to change that requirement, your expectation of each will agree with that mean.[3]

Example: Guessing Weight. Needing to estimate the weight of someone on the other side of a chain link fence, you select ten people on your side whom you estimate to have the same weight as that eleventh, persuade them to congregate on a platform scale, and read their total weight. If the scale reads 1080 lb., your estimate of the eleventh person's weight will be 108 lb., if nothing in that process has made you revise your judgment that the eleven weights are equal.[4]

This is a frequentism in which judgmental probabilities are seen as judgmental expectations of frequencies, and in which the Law of Little Numbers guides the recycling of observed frequencies as probabilities of unobserved instances.
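Footnote 1 observes that both laws are immediate consequences of the linearity of the expectation operator: if n quantities are judged to have equal expectations and their sum is known, each expectation must be sum/n. The recycling of statistics into judgments can be sketched in a few lines of Python (the function names are illustrative, not the paper's; only the 1080 lb. scale reading comes from the text):

```python
# Sketch of the two "laws" above. Each is the same arithmetic:
# equal judged expectations + a known total => each expectation is total/n.

def probability_from_frequency(n: int, truths: int) -> float:
    """Law of Little Numbers: n propositions judged equiprobable, with
    exactly `truths` of them known true, each get probability truths/n."""
    return truths / n

def expectation_from_total(total: float, n: int) -> float:
    """Law of Short Run Averages: n magnitudes judged to have equal
    expectations, with known total, each get expectation total/n."""
    return total / n

# Guessing Weight: ten people judged equal in weight to the hidden
# eleventh person register 1080 lb. in total on the platform scale.
estimate = expectation_from_total(1080.0, 10)
print(estimate)  # 108.0
```

The italicized caveat carries over unchanged: if the way you learn the frequency or mean also overturns your equality judgment, the division step is no longer licensed.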
It is to be distinguished both from the intelligible but untenable finite frequentism that simply identifies probabilities with actual frequencies (generally, unknown) when there are only finitely many instances overall, and from the unintelligible long-run frequentism that would see the observed instances as a finite fragment of an infinite sequence in which the infinitely long run inflates expectations into certainties that sweep judgmental probabilities under the endless carpet.[5]

1.2 Probabilities from statistics: Exchangeability[6]

On the hypotheses of (a) equiprobability and (b) certainty that the relative frequency of truths is r, the Law of Little Numbers identified the probability as r. Stronger conclusions follow from the stronger hypothesis of exchangeability: You regard propositions H1, ..., Hn as exchangeable when, for any particular t of them, your probability that they are all true and the other f

---
[3] If you learn the individual values and calculate the mean as their average without forgetting the various values, you have violated the caveat (unless it happens that all the values were the same), for what you learned will have shown you that they are not equal.

[4] Note that turning statistics into probabilities or expectations in this way requires neither conditioning nor Bayes's theorem, nor does it require you to have formed particular judgmental probabilities for the propositions or particular estimates for the random variables prior to learning the relative frequency or mean.

[5] See Jeffrey (1992), chapter 11.

[6] See chapters 3 and 5 of de Finetti (1937), (1980).

