UA ECON 520 - Elementary Probability Theory and Combinatorics

Economics 520, Fall 2008
Lecture Note 1: Elementary Probability Theory and Combinatorics

You should already be familiar with basic set-theoretic notation and operations, such as unions, intersections, complementation, and the empty set.

Definition 1. The sample space (denoted by Ω) is the set of all possible outcomes of an experiment.

Example 1: Tossing a die. There are six outcomes in the sample space, corresponding to the number on top of the die, so we can take Ω = {1, 2, 3, 4, 5, 6}.

Definition 2. An event (denoted by E) is a collection of possible outcomes of an experiment, that is, a subset of the sample space.

Example 1 continued: Possible events include "an odd number", E1 = {1, 3, 5}, "an even number", E2 = {2, 4, 6}, or "a number less than 3", E3 = {1, 2}.

Definition 3. Two events E1 and E2 are disjoint if their intersection E1 ∩ E2 is equal to the empty set ∅.

Example 1 continued: E1 and E2 are disjoint because their intersection is empty, but E1 and E3 are not disjoint because their intersection is {1}.

Definition 4. If the sets E1, E2, . . . are pairwise disjoint and their union ∪i Ei is equal to the sample space, the collection E1, E2, . . . forms a partition of the sample space.

Example 1 continued: E1 and E2 form a partition: they are disjoint and their union is the entire sample space Ω.

Example 2: Relative humidity on a randomly selected day. In this case we might take Ω = [0, 1], the unit interval. (In practice, the relevant quantity might only be measured with finite precision, but it is often a convenient fiction to suppose that the quantity can take on a continuum of values.) The events

    E1 = [0, .1),  E2 = [.1, .7),  E3 = [.7, 1]

form a partition of the sample space. Of course there are many other possible ways to partition the unit interval.

Loosely speaking, a probability distribution assigns probabilities between 0 and 1 to different possible events.
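For a finite sample space like the die's, the disjointness and partition claims in Example 1 can be checked mechanically with ordinary set operations. A minimal Python sketch (the helper `is_partition` is my own name, not notation from the course):

```python
omega = {1, 2, 3, 4, 5, 6}  # sample space for the die (Example 1)

E1 = {1, 3, 5}  # "an odd number"
E2 = {2, 4, 6}  # "an even number"
E3 = {1, 2}     # "a number less than 3"

def is_partition(events, omega):
    """True if the events are pairwise disjoint and their union is omega (Definition 4)."""
    pairwise_disjoint = all(
        a.isdisjoint(b)
        for i, a in enumerate(events)
        for b in events[i + 1:]
    )
    covers = set().union(*events) == omega
    return pairwise_disjoint and covers

print(E1.isdisjoint(E2))                  # True: E1 and E2 are disjoint
print(E1 & E3)                            # {1}: E1 and E3 are not disjoint
print(is_partition([E1, E2], omega))      # True: Definition 4 holds
print(is_partition([E1, E2, E3], omega))  # False: E3 overlaps E1 and E2
```

The same check works for any finite collection of events, though for a continuum such as Example 2 one has to argue directly from the interval endpoints.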
In Example 1, a fair die would assign probability 1/6 to the event {1}, which corresponds to rolling a "1." So we want to define a set of events over which probabilities will be assigned.

(This note and many of the later lecture notes are based on notes written by Guido Imbens. I thank him for permission to use his material in this course.)

Definition 5. A collection B of subsets of Ω is a sigma-algebra if it satisfies the following three conditions:

1. The empty set ∅ is contained in B.
2. If E ∈ B, then its complement E^c = {ω ∈ Ω | ω ∉ E} is also in B.
3. B is closed under countable unions, that is, if E1, E2, . . . are all in B, then so is ∪i Ei.

One possible sigma-algebra (in fact the smallest possible one) is B1 = {∅, Ω}. This works for any Ω, but it is not a very interesting sigma-algebra.

For Example 1, a possible sigma-algebra is B2 = {∅, Ω, {1}, {2, 3, 4, 5, 6}}. You should verify that this satisfies the three conditions of the definition.

Another possible sigma-algebra for Example 1 is the power set, the set of all subsets of Ω.

For a given sample space, we would like the probability distribution to assign probabilities to as many different events as possible. In Example 1, it is possible to assign probabilities consistently to every member of the power set. For example, with a fair die the probability of E = {1, 2} is equal to the probability of {1} plus the probability of {2}, i.e. 1/6 + 1/6 = 1/3.

However, in Example 2, when Ω = [0, 1], it turns out that there is no way to do this for every possible subset of Ω. In this case, the power set is so large that we have to limit our attention to a large, but technically more manageable, collection of sets. (For more on this, see Billingsley, P., Probability and Measure, p. 45.)

Definition 6 (Kolmogorov Axioms). Given a sample space Ω and an associated sigma-algebra B, a probability function is a function P from B to the real line satisfying:

1. (Nonnegativity) For all E ∈ B, P(E) ≥ 0.
2. (Unit probability for the sample space) P(Ω) = 1.
3. (Additivity of probability of disjoint sets) If E1, E2, . . . are pairwise disjoint, then P(∪i Ei) = Σi P(Ei).

Remarks:

1. The empty set ∅ is contained in B, and therefore its complement Ω = ∅^c is also contained in B. Hence the second condition is well defined.
2. If the Borel field is finite, Condition 3 need only hold for finite unions. (Note: CB uses the term Borel field interchangeably with sigma-algebra. However, some mathematics texts use Borel field more narrowly, to refer to a certain type of sigma-algebra that is generated by the Borel topology.)

Example 1 continued: For a fair die,

    P(E) = (number of outcomes in E) / (total number of outcomes in Ω).

This assigns probability 1/6 to each of the six outcomes. Alternatively, we can assign any other nonnegative number to each of the six outcomes provided they add up to one.

An immediate implication of the Kolmogorov axioms is that

    P(E^c) = 1 − P(E),

because

    1 = P(Ω) = P(E) + P(E^c).

Therefore:

    P(∅) = P(Ω^c) = 1 − P(Ω) = 0.

Another useful result: for any events E1 and E2,

    P(E1 ∪ E2) = P(E1) + P(E2) − P(E1 ∩ E2).

The proof, typical for this type of result, relies on creating pairwise disjoint sets for which one can add up the probabilities by the third axiom:

    P(E1 ∪ E2) = P((E1 ∩ E2^c) ∪ (E1^c ∩ E2) ∪ (E1 ∩ E2))
               = P(E1 ∩ E2^c) + P(E1^c ∩ E2) + P(E1 ∩ E2).    (1)

Also:

    P(E1) = P(E1 ∩ E2) + P(E1 ∩ E2^c),

which, after rearranging, gives

    P(E1 ∩ E2^c) = P(E1) − P(E1 ∩ E2),

which after substituting in (1) gives the desired result.

You should read CB 1.2.2 for further results of this type, which help in calculating probabilities for complicated events.

Counting and Probability

Early problems in the history of probability often involved games of chance where the probabilities for basic outcomes were clear but the probabilities of interesting events were difficult to calculate because of the large number of basic outcomes for the events of interest. A number of these problems can be formulated as problems of drawing k objects with or without replacement out of a set of n, while being or not being concerned with the ordering. Solving them requires counting the ways in which you can do this. As an example we consider the case where we have n = 4 objects, labelled A, B, C, and D, and wish to draw k = 2. Recall that n! = n × (n − 1) × (n − 2) × · · · × 1 is called n factorial.

Result 1 (ordered, with replacement). The total number of ways k objects can be drawn out of a set of n with replacement is n^k. For the first draw there are n choices, for the second one there are again n choices, and so on, so that after k draws there are n × n × · · · × n = n^k possibilities.
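For small n and k, Result 1 can be confirmed by enumerating the draws directly, and Python's itertools module happens to cover all four sampling schemes mentioned above (ordered or not, with or without replacement). A sketch using the n = 4, k = 2 example; the counts in the comments are the standard formulas, not results stated in this note:

```python
from itertools import (product, permutations, combinations,
                       combinations_with_replacement)

objects = ["A", "B", "C", "D"]  # n = 4
k = 2

# Ordered, with replacement: n^k = 4^2 = 16 (Result 1).
ordered_with = list(product(objects, repeat=k))
print(len(ordered_with))  # 16

# Ordered, without replacement: n!/(n-k)! = 4 * 3 = 12.
ordered_without = list(permutations(objects, k))
print(len(ordered_without))  # 12

# Unordered, without replacement: n!/(k!(n-k)!) = 6.
unordered_without = list(combinations(objects, k))
print(len(unordered_without))  # 6

# Unordered, with replacement: (n+k-1 choose k) = 10.
unordered_with = list(combinations_with_replacement(objects, k))
print(len(unordered_with))  # 10
```

Listing the 16 ordered draws with replacement (AA, AB, . . . , DD) makes the n-choices-per-draw argument in Result 1 concrete.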

