Berkeley STAT 134 - Lecture Notes

Events A and B are independent if: knowing whether A occurred does not change the probability of B. Mathematically, we can say this in two equivalent ways:

P(B|A) = P(B)
P(A and B) = P(A ∩ B) = P(A) × P(B).

It is important to distinguish independence from mutually exclusive, which would say A ∩ B is empty (cannot happen).

Example. Deal 2 cards from a deck.
A = first card is an Ace
C = second card is an Ace
P(C|A) = 3/51, P(C) = 4/52 (last class).
So A and C are dependent.

Example. Throw 2 dice.
A = first die lands 1
B = second die shows a larger number than the first die
C = both dice show the same number
P(B|A) = 5/6, P(B) = 15/36 by counting, so A and B are dependent.
P(C|A) = 1/6, P(C) = 6/36 = 1/6, so A and C are independent.
Note 1: here B and C are mutually exclusive.
Note 2: writing B' = "second die shows a smaller number than the first die", we have
P(B') = P(B) by symmetry, and
P(B ∪ B') = P(Cc) = 1 − P(C) = 5/6,
giving a "non-counting" argument that P(B) = 5/12.

Example. Deal 1 card from a deck.
A = card is an Ace
S = card is a Spade
P(A) = 4/52, P(S) = 13/52, P(A ∩ S) = 1/52.
Here P(A ∩ S) = P(A) × P(S), so A and S are independent.

Conceptual point.
(a) In a fully-specified math model, two events are either dependent or independent; this can be checked by calculation.
(b) Often we use independence as an assumption in making a model. For instance, we assume that different die throws give independent results. Most probability models one encounters in engineering or science have some assumption of "bottom level" independence, but one needs to be careful about which other events within the model are independent.

(silly) Example. Throw 2 dice. If the sum is at least 7 I show you the dice; if not, I don't.
A: I show you the first die lands 1
B: I show you the second die lands 1
P(A) = 1/36, P(B) = 1/36, P(A ∩ B) = 0, so A and B are dependent.

Conceptual point. This illustrates a subtle point: being told by a truthful person that "A happened" is not (for probability/statistics purposes) exactly the same as "knowing A happened".
[car accident example]

Systems of components.
We will show logic diagrams: the system works if there is some path from left to right which passes only through working components. Assume components work/fail independently, with
P(Ci works) = pi, P(Ci fails) = 1 − pi.
Note that in practice the independence assumption is usually unrealistic.
Math question: calculate P(system works) in terms of the numbers pi and the network structure.

Example: "in series". [picture on board]
P(system works) = p1 p2 p3.

Example: "in parallel". [picture on board]
P(system fails) = (1 − p1)(1 − p2)(1 − p3)
P(system works) = 1 − (1 − p1)(1 − p2)(1 − p3).

More complicated example: [picture on board]
We could write out all 16 combinations; instead let's condition on whether or not C1 works:
P(system works) = P(system works | C1 works) P(C1 works) + P(system works | C1 fails) P(C1 fails).
[continue on board]

Example: Deal 4 cards. What is the chance we get exactly one Spade?

event   1st  2nd  3rd  4th
F1      S    N    N    N
F2      N    S    N    N
F3      N    N    S    N
F4      N    N    N    S

[board: repeated conditioning]
P(F1) = 13/52 × 39/51 × 38/50 × 37/49
P(F1) = P(F2) = P(F3) = P(F4)
P(exactly one Spade) = P(F1 or F2 or F3 or F4)
                     = P(F1) + P(F2) + P(F3) + P(F4)
                     = 4 × P(F1) ≈ 44%.

Example: Deal 4 cards. What is the chance we get one card of each suit?

event   1st  2nd  3rd  4th
A1      C    D    H    S
A2      C    D    S    H
...     ...  ...  ...  ...

P(A1) = 13/52 × 13/51 × 13/50 × 13/49
P(A1) = P(A2) = ...
Number of possible orders = 4 × 3 × 2 × 1 = 24 = 4!
P(one card of each suit) = 24 × P(A1) ≈ 10.5%.
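
As a quick sanity check on the two card-dealing answers above, here is a short Python sketch (not part of the original notes; variable names are mine) that evaluates the same products with exact fractions:

from fractions import Fraction

# Exact check of the two card-dealing answers (illustrative, not from the notes).

# P(F1) = P(Spade, then three non-Spades), by repeated conditioning.
p_F1 = Fraction(13, 52) * Fraction(39, 51) * Fraction(38, 50) * Fraction(37, 49)
p_exactly_one_spade = 4 * p_F1              # F1, ..., F4 are mutually exclusive
print(float(p_exactly_one_spade))           # ≈ 0.4388, i.e. about 44%

# P(A1) = P(Club, Diamond, Heart, Spade in that order); there are 4! = 24 orders.
p_A1 = Fraction(13, 52) * Fraction(13, 51) * Fraction(13, 50) * Fraction(13, 49)
p_one_of_each_suit = 24 * p_A1
print(float(p_one_of_each_suit))            # ≈ 0.1055, i.e. about 10.5%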
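
Going back to the systems-of-components section, the series and parallel formulas can be written as one-liners. This is a small sketch under the independence assumption stated above; the function names are mine, not from the notes.

from math import prod

def p_series_works(ps):
    # Series system works only if every component works: p1 * p2 * ... * pn.
    return prod(ps)

def p_parallel_works(ps):
    # Parallel system fails only if every component fails: 1 - (1-p1)...(1-pn).
    return 1 - prod(1 - p for p in ps)

ps = [0.9, 0.8, 0.7]
print(p_series_works(ps))     # ≈ 0.504
print(p_parallel_works(ps))   # 1 - (0.1)(0.2)(0.3) ≈ 0.994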
Bayes rule: updating probabilities as new information is acquired.

(silly) Example. There are 2 coins:
one is fair: P(Heads) = 1/2; one is biased: P(Heads) = 9/10.
Pick one coin at random. Toss it 3 times. Suppose we get 3 Heads. What then is the chance that the coin we picked is the biased coin?

Abstract set-up: a partition (B1, B2, ...) of "alternate possibilities". We know the prior probabilities P(Bi). Then we observe that some event A happens (the "new information") for which we know P(A|Bi). We want to calculate the posterior probabilities P(Bi|A).

Bayes formula:
P(Bi|A) = P(A|Bi)P(Bi) / [P(A|B1)P(B1) + P(A|B2)P(B2) + ...]

[example above on board]
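
For the two-coin example, here is a short Python sketch of Bayes formula (my own illustration; the notes finish this example on the board) that computes the posterior directly:

from fractions import Fraction

# Bayes formula for the two-coin example above (illustrative computation).

priors = {"fair": Fraction(1, 2), "biased": Fraction(1, 2)}
p_heads = {"fair": Fraction(1, 2), "biased": Fraction(9, 10)}

# Likelihood of the observed data A = "3 Heads in 3 tosses" under each coin.
likelihood = {coin: p_heads[coin] ** 3 for coin in priors}

# P(Bi | A) = P(A | Bi) P(Bi) / [ sum over j of P(A | Bj) P(Bj) ]
evidence = sum(likelihood[c] * priors[c] for c in priors)
posterior = {c: likelihood[c] * priors[c] / evidence for c in priors}

print(posterior["biased"])          # 729/854
print(float(posterior["biased"]))   # ≈ 0.854

So after seeing 3 Heads, the chance that we picked the biased coin rises from the prior 1/2 to 729/854 ≈ 0.85, since 3 Heads in a row are far more likely under the biased coin.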