Duke CPS 102 - Today’s topics


CompSci 102, © Michael Frank

Today’s topics (outline)
• Why Probability?
• Random Variables
• Information Capacity
• Experiments & Sample Spaces
• Events
• Probability
• Four Definitions of Probability
• Probability: Frequentist Definition
• Probability: Bayesian Definition
• Probability: Laplacian Definition
• Probability: Axiomatic Definition
• Probabilities of Mutually Complementary Events
• Example 1: Balls-and-Urn
• Example 2: Seven on Two Dice
• Probability of Unions of Events
• Mutually Exclusive Events
• Exhaustive Sets of Events
• Independent Events
• Conditional Probability
• Prior and Posterior Probability
• Visualizing Conditional Probability
• Conditional Probability Example
• Bayes’ Rule

12.1 Today’s topics
• Probability
  – Definitions
  – Events
  – Conditional probability
• Reading: Sections 5.1–5.3
• Upcoming
  – Expected value

12.2 Why Probability?
• In the real world, we often don’t know whether a given proposition is true or false.
• Probability theory gives us a way to reason about propositions whose truth is uncertain.
• It is useful in weighing evidence, diagnosing problems, and analyzing situations whose exact details are unknown.

12.3 Random Variables
• A “random variable” V is any variable whose value is unknown, or whose value depends on the precise situation.
  – E.g., the number of students in class today
  – Whether it will rain tonight (a Boolean variable)
• Let the domain of V be dom[V] ≡ {v₁, …, vₙ}.
  – Infinite domains can also be dealt with if needed.
• The proposition V = vᵢ may have an uncertain truth value, and may be assigned a probability.

12.4 Information Capacity
• The information capacity I[V] of a random variable V with a finite domain can be defined as the logarithm (with indeterminate base) of the size of the domain of V: I[V] :≡ log |dom[V]|.
  – The log’s base determines the associated information unit!
• Taking the log base 2 yields an information unit of 1 bit: b = log 2.
  – Related units include the nybble, N = 4 b = log 16 (1 hexadecimal digit),
  – and, more famously, the byte, B = 8 b = log 256.
• Other common logarithmic units that can be used as units of information:
  – the nat or e-fold, n = log e,
    » widely known in thermodynamics as Boltzmann’s constant k;
  – the bel, decade, or order of magnitude, D = log 10;
  – and the decibel, dB = D/10 = (log 10)/10 ≈ log 1.2589.
• Example: An 8-bit register has 2⁸ = 256 possible values.
  – Its information capacity is thus log 256 = 8 log 2 = 8 b!
    » Equivalently 2 N, or 1 B, or ln 256 ≈ 5.545 n, or log₁₀ 256 ≈ 2.408 D, or 24.08 dB.

12.5 Experiments & Sample Spaces
• A (stochastic) experiment is any process by which a given random variable V gets assigned some particular value, where this value is not necessarily known in advance.
  – We call it the “actual” value of the variable, as determined by that particular experiment.
• The sample space S of the experiment is just the domain of the random variable: S = dom[V].
• The outcome of the experiment is the specific value vᵢ of the random variable that is selected.
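The unit conversions in the Information Capacity example are easy to check numerically. A minimal sketch, in which the `capacity` helper and its names are mine rather than anything from the slides:

```python
import math

def capacity(domain_size, base=2):
    """Information capacity I[V] = log |dom[V]|, in the unit set by the log base."""
    return math.log(domain_size, base)

# An 8-bit register: 2**8 = 256 possible values.
n = 2 ** 8

bits = capacity(n, 2)           # 8 b
nats = capacity(n, math.e)      # ln 256 ≈ 5.545 n
decades = capacity(n, 10)       # log10 256 ≈ 2.408 D
decibels = 10 * decades         # ≈ 24.08 dB

print(f"{bits:.3f} b = {nats:.3f} n = {decades:.3f} D = {decibels:.2f} dB")
```

The base argument is the only thing that changes between units, which is exactly the slide’s point that the log’s base determines the information unit.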
12.6 Events
• An event E is any set of possible outcomes in S…
  – That is, E ⊆ S = dom[V].
• E.g., the event that “fewer than 50 people show up for our next class” is represented as the set {1, 2, …, 49} of values of the variable V = (# of people here next class).
• We say that event E occurs when the actual value of V is in E, which may be written V ∈ E.
  – Note that V ∈ E denotes the proposition (of uncertain truth) asserting that the actual outcome (value of V) will be one of the outcomes in the set E.

12.7 Probability
• The probability p = Pr[E] ∈ [0,1] of an event E is a real number representing our degree of certainty that E will occur.
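The definitions of sample space, event, and occurrence map directly onto sets. A small sketch using a die roll (the example is mine; the slides use class attendance), with Pr[E] computed under the equally-likely-outcomes Laplacian definition that the outline lists:

```python
import random

# Sample space: S = dom[V], for V = result of rolling one fair die.
S = {1, 2, 3, 4, 5, 6}

# An event E is any subset of S -- here, "the roll is even".
E = {2, 4, 6}
assert E <= S  # E ⊆ S = dom[V]

# Laplacian definition (all outcomes equally likely): Pr[E] = |E| / |S|.
pr_E = len(E) / len(S)  # 0.5

# One stochastic experiment assigns V some actual value (the outcome);
# the event E "occurs" iff the proposition V ∈ E turns out to be true.
v = random.choice(sorted(S))
event_occurred = v in E
print(f"outcome={v}, E occurred: {event_occurred}, Pr[E]={pr_E}")
```

Representing events as Python sets keeps the subset relation E ⊆ S and the membership test V ∈ E literal, rather than metaphorical.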

