Duke CPS 102 - Lecture

Today’s topics
• Expectation Values
• Derived Random Variables
• Linearity of Expectation Values
• Variance & Standard Deviation
• Entropy
• Visualizing Entropy

CompSci 102 © Michael Frank 13.1

Today’s topics
• Probability
  – Expected value
• Reading: Section 5.3
• Upcoming
  – Probabilistic inference

CompSci 102 © Michael Frank 13.2

Expectation Values
• For any random variable V having a numeric domain, its expectation value (also called expected value, weighted average value, or (arithmetic) mean value) Ex[V], under the probability distribution Pr[v] = p(v), is defined as

    V̂ :≡ Ex[V] :≡ Σ_{v ∈ dom[V]} v · p(v)

• The term “expected value” is very widely used for this.
  – But this term is somewhat misleading, since the “expected” value might itself be totally unexpected, or even impossible!
    • E.g., if p(0) = 0.5 and p(2) = 0.5, then Ex[V] = 1, even though p(1) = 0 and so we know that V ≠ 1!
    • Or, if p(0) = 0.5 and p(1) = 0.5, then Ex[V] = 0.5 even if V is an integer variable!

CompSci 102 © Michael Frank 13.3

Derived Random Variables
• Let S be a sample space over values of a random variable V (representing possible outcomes).
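The definition above can be sketched in a few lines of Python. This is an illustrative helper (the function name and dict representation of the distribution are my own, not from the lecture); it also reproduces the slide’s “impossible expected value” example.

```python
# Expected value of a discrete random variable: Ex[V] = sum over v of v * p(v).
# The distribution is represented as a dict mapping each value v to its probability p(v).

def expectation(dist):
    """Weighted average of the values under the distribution (illustrative helper)."""
    return sum(v * p for v, p in dist.items())

# The slide's example: p(0) = 0.5 and p(2) = 0.5 gives Ex[V] = 1,
# even though V = 1 is impossible, since p(1) = 0.
print(expectation({0: 0.5, 2: 0.5}))   # 1.0

# And p(0) = 0.5, p(1) = 0.5 gives a non-integer mean for an integer variable.
print(expectation({0: 0.5, 1: 0.5}))   # 0.5
```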
• Then, any function f over S can also be considered to be a random variable (whose actual value f(V) is derived from the actual value of V).
• If the range R = range[f] of f is numeric, then the mean value Ex[f] of f can still be defined, as

    f̂ = Ex[f] = Σ_{s ∈ S} f(s) · p(s)

CompSci 102 © Michael Frank 13.4

Linearity of Expectation Values
• Let X₁, X₂ be any two random variables derived from the same sample space S, and subject to the same underlying distribution.
• Then we have the following theorems:

    Ex[X₁ + X₂] = Ex[X₁] + Ex[X₂]
    Ex[aX₁ + b] = a·Ex[X₁] + b

• You should be able to easily prove these for yourself at home.

CompSci 102 © Michael Frank 13.5

Variance & Standard Deviation
• The variance Var[X] = σ²(X) of a random variable X is the expected value of the square of the difference between the value of X and its expectation value Ex[X]:

    Var[X] :≡ Σ_{s ∈ S} (X(s) − Ex[X])² · p(s)

• The standard deviation or root-mean-square (RMS) difference of X is σ(X) :≡ Var[X]^{1/2}.

CompSci 102 © Michael Frank 13.6

Entropy
• The entropy H of a probability distribution p over a sample space S of outcomes is a measure of our degree of uncertainty about the actual outcome.
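The two linearity theorems can be checked numerically on a toy sample space. This is a sketch under my own assumptions (two fair coin flips, with X₁ and X₂ as arbitrary illustrative functions of the outcome); it is not a proof, just a spot check of both identities.

```python
# Check Ex[X1 + X2] = Ex[X1] + Ex[X2] and Ex[a*X1 + b] = a*Ex[X1] + b
# on a small sample space: two fair coin flips (illustrative choice).

from itertools import product

# Sample space: ordered pairs of flips, each of the 4 outcomes equally likely.
S = list(product([0, 1], repeat=2))
p = {s: 0.25 for s in S}

X1 = lambda s: s[0]           # value of the first flip
X2 = lambda s: 3 * s[1] + 1   # some derived variable of the second flip

def Ex(f):
    """Ex[f] = sum over s of f(s) * p(s), as on slide 13.4."""
    return sum(f(s) * p[s] for s in S)

a, b = 2.0, 5.0
assert Ex(lambda s: X1(s) + X2(s)) == Ex(X1) + Ex(X2)   # additivity
assert Ex(lambda s: a * X1(s) + b) == a * Ex(X1) + b     # affine scaling
print(Ex(X1), Ex(X2))   # 0.5 2.5
```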
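The variance definition translates directly into code as well. Again an illustrative helper of my own (distribution as a {value: probability} dict), applied to a fair 0/1 coin, whose variance works out to 0.25 and standard deviation to 0.5.

```python
# Variance as defined on the slide: Var[X] = sum over s of (X(s) - Ex[X])^2 * p(s),
# with standard deviation sigma(X) = Var[X] ** 0.5.

def variance(dist):
    """Var[X] for a distribution given as {value: probability} (illustrative helper)."""
    mean = sum(v * p for v, p in dist.items())
    return sum((v - mean) ** 2 * p for v, p in dist.items())

# A fair coin valued 0/1: mean 0.5, so each outcome deviates by 0.5.
coin = {0: 0.5, 1: 0.5}
print(variance(coin))          # 0.25
print(variance(coin) ** 0.5)   # 0.5  (the standard deviation)
```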
  – It measures the expected amount of increase in our known information that would result from learning the outcome.
• The base of the logarithm gives the corresponding unit of entropy; base 2 → 1 bit, base e → 1 nat (as before).
  – 1 nat is also known as “Boltzmann’s constant” k_B and as the “ideal gas constant” R, and was first discovered physically.

    H(p) :≡ Ex_p[log p⁻¹] = −Σ_{s ∈ S} p(s) log p(s)

CompSci 102 © Michael Frank 13.7

Visualizing Entropy
[Figure: four bar charts over state indices 1-10 comparing a sample nonuniform vs. uniform probability distribution: (1) Probability; (2) Improbability (1 out of N, the inverse probability); (3) Log base 2 of improbability (information of discovery); (4) Boltzmann-Gibbs-Shannon entropy in bits (expected log improbability).]
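The entropy formula, in base 2 so the unit is bits, can be sketched as follows (helper name and examples are my own; the convention that p(s) = 0 terms contribute nothing follows from the limit p·log p → 0):

```python
# Entropy H(p) = -sum over s of p(s) * log2(p(s)), measured in bits.
# Outcomes with p(s) = 0 are skipped: lim p->0 of p * log p = 0.

from math import log2

def entropy_bits(dist):
    """Shannon entropy in bits of a {outcome: probability} dict (illustrative helper)."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# A uniform distribution over 8 outcomes: maximum uncertainty, log2(8) = 3 bits.
uniform8 = {i: 1 / 8 for i in range(8)}
print(entropy_bits(uniform8))                       # 3.0

# A skewed distribution carries less uncertainty than the uniform one over 3 outcomes.
print(entropy_bits({0: 0.5, 1: 0.25, 2: 0.25}))     # 1.5
```

This mirrors the figure’s point: the more nonuniform the distribution, the lower the expected log improbability.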

