Economics 520, Fall 2010
Lecture Note 1: Elementary Probability Theory and Combinatorics [1]

[1] This note and many of the later lecture notes are based on notes written by Guido Imbens. I thank him for permission to use his material in this course.

You should already be familiar with basic set-theoretic notation and operations, such as unions, intersections, complementation, and the empty set.

Definition 1 The sample space (denoted by Ω) is the set of all possible outcomes of an experiment.

Example 1: Tossing a die. There are six outcomes in the sample space, corresponding to the number on top of the die, so we can take Ω = {1, 2, 3, 4, 5, 6}.

Definition 2 An event (denoted by E) is a collection of possible outcomes of an experiment, that is, a subset of the sample space.

Example 1 continued: Possible events include "an odd number", E_1 = {1, 3, 5}, "an even number", E_2 = {2, 4, 6}, or "a number less than 3", E_3 = {1, 2}.

Definition 3 Two events E_1 and E_2 are disjoint if their intersection E_1 ∩ E_2 is equal to the empty set ∅.

Example 1 continued: E_1 and E_2 are disjoint because their intersection is empty, but E_1 and E_3 are not disjoint because their intersection is {1}.

Definition 4 If the sets E_1, E_2, . . . are pairwise disjoint and their union ∪_i E_i is equal to the sample space, the collection E_1, E_2, . . . forms a partition of the sample space.

Example 1 continued: E_1 and E_2 form a partition: they are disjoint and their union is the entire sample space Ω.

Example 2: Relative humidity on a randomly selected day. In this case we might take Ω = [0, 1], the unit interval. (In practice, the relevant quantity might only be measured with finite precision, but it is often a convenient fiction to suppose that the quantity can take on a continuum of values.) The events

E_1 = [0, .1),  E_2 = [.1, .7),  E_3 = [.7, 1]

form a partition of the sample space. Of course there are many other possible ways to partition the unit interval.

Loosely speaking, a probability distribution assigns probabilities between 0 and 1 to different possible events. In Example 1, a fair die would assign probability 1/6 to the event {1}, which corresponds to rolling a "1." So we want to define a set of events over which probabilities will be assigned:

Definition 5 A collection B of subsets of Ω is a sigma-algebra (Note: CB uses "Borel field" instead of "sigma-algebra") if it satisfies the following three conditions:

1. The empty set ∅ is contained in B.
2. If E ∈ B, then its complement E^c = {ω ∈ Ω | ω ∉ E} is also in B.
3. B is closed under countable unions, that is, if E_1, E_2, . . . are all in B, then so is ∪_i E_i.

One possible sigma-algebra (in fact the smallest possible one) is B_1 = {∅, Ω}. This works for any Ω, but it is not a very interesting sigma-algebra.

For Example 1, a possible sigma-algebra is B_2 = {∅, Ω, {1}, {2, 3, 4, 5, 6}}. You should verify that this satisfies the three conditions of the definition.

Another possible sigma-algebra for Example 1 is the power set, the set of all subsets of Ω (sometimes written as 2^Ω).

For a given sample space, we'd like the probability distribution to assign probabilities to as many different events as possible. In Example 1, it is possible to assign probabilities consistently to every member of the power set. For example, with a fair die the probability of E = {1, 2} is equal to the probability of {1} plus the probability of {2}, i.e. 1/6 + 1/6 = 1/3.
However, in Example 2, when Ω = [0, 1], it turns out that there is no way to do this for every possible subset of Ω. In this case, the power set is so large that we have to limit our attention to a large, but technically more manageable, collection of sets. (This is explained in Billingsley, Probability and Measure, p. 45.)

Definition 6 (Kolmogorov Axioms) Given a sample space Ω and an associated sigma-algebra B, a probability function is a function P from B to the real line satisfying:

1. (Nonnegativity) For all E ∈ B, P(E) ≥ 0.
2. (Unit probability for the sample space) P(Ω) = 1.
3. (Additivity of probability of disjoint sets) If E_1, E_2, . . . are pairwise disjoint, then P(∪_i E_i) = Σ_i P(E_i).

Remarks:

1. The empty set ∅ is contained in B and therefore its complement Ω = ∅^c is also contained in B. Hence the second condition is well defined.
2. If the Borel field is finite, Condition 3 need only hold for finite unions.

Example 1 continued: For a fair die,

P(E) = (number of outcomes in E) / (total number of outcomes in Ω).

This assigns probability 1/6 to each of the six outcomes. Alternatively we can assign any other nonnegative number to each of the six outcomes provided they add up to one.

An immediate implication of the Kolmogorov axioms is that

P(E^c) = 1 − P(E),

because

1 = P(Ω) = P(E) + P(E^c).

Therefore:

P(∅) = P(Ω^c) = 1 − P(Ω) = 0.

Another useful result: for any events E_1 and E_2,

P(E_1 ∪ E_2) = P(E_1) + P(E_2) − P(E_1 ∩ E_2).

The proof, typical for this type of result, relies on creating pairwise disjoint sets for which one can add up the probabilities by the third axiom:

P(E_1 ∪ E_2) = P((E_1 ∩ E_2^c) ∪ (E_1^c ∩ E_2) ∪ (E_1 ∩ E_2))
             = P(E_1 ∩ E_2^c) + P(E_1^c ∩ E_2) + P(E_1 ∩ E_2).     (1)

Also:

P(E_1) = P(E_1 ∩ E_2) + P(E_1 ∩ E_2^c),

which, after rearranging, gives

P(E_1 ∩ E_2^c) = P(E_1) − P(E_1 ∩ E_2),

which after substituting in (1) gives the desired result.

You should read CB 1.2.2 for further results of this type, which help in calculating probabilities for complicated events.

Counting and Probability

Early problems in the history of probability often involved games of chance where the probabilities for basic outcomes were clear but the probabilities of interesting events were difficult to calculate because of the large number of basic outcomes for events of interest. A number of these problems can be formulated as problems of drawing k objects, with or without replacement, out of a set of n, while being or not being concerned with the ordering. Solving them requires counting the ways in which you can do this. As an example we consider the case where we have n = 4 objects, labelled A, B, C, and D, and wish to draw k = 2. Recall that n! = n × (n − 1) × (n − 2) × · · · × 1 is called n factorial.

Result 1 (ordered, with replacement) The total number of ways k objects can be drawn out of a set of n with replacement is n^k.

For the first draw there are n choices, for the second one there are again n choices, and so on. In the example, the set of outcomes is

{AA, AB, AC, AD, BA, BB, BC, BD, CA, CB, CC, CD, DA, DB, DC, DD},

with sixteen elements (these can also be enumerated mechanically; see the sketch at the end of this note).

Result 2
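The calculations above can be checked numerically. The following Python sketch is a supplement, not part of the original note; it uses only the standard itertools and fractions modules. It verifies the inclusion-exclusion formula for the fair die of Example 1 (using the classical "equally likely outcomes" probability from the fair-die example) and reproduces the n^k count of Result 1 by enumerating the ordered draws with replacement.

```python
from itertools import product
from fractions import Fraction

# Inclusion-exclusion check for the fair die of Example 1:
# P(E1 ∪ E3) = P(E1) + P(E3) − P(E1 ∩ E3).
omega = {1, 2, 3, 4, 5, 6}

def P(E):
    """Classical 'equally likely outcomes' probability on subsets of omega."""
    return Fraction(len(E & omega), len(omega))

E1, E3 = {1, 3, 5}, {1, 2}              # "an odd number" and "a number less than 3"
print(P(E1 | E3))                       # 2/3
print(P(E1) + P(E3) - P(E1 & E3))       # 2/3, as the formula predicts

# Result 1 (ordered, with replacement): enumerate the draws directly.
objects = ["A", "B", "C", "D"]          # n = 4
k = 2
draws = ["".join(d) for d in product(objects, repeat=k)]
print(draws)                            # ['AA', 'AB', 'AC', ..., 'DD']
print(len(draws) == len(objects) ** k)  # True: n^k = 4^2 = 16 outcomes
```

Enumerating the draws with product(objects, repeat=k) reproduces the sixteen ordered pairs listed under Result 1, in the same order.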