Duke CPS 111 - Markov Chains

Tutorial: Markov Chains
Steve Gu
Feb 28, 2008

Outline
• Markov chain
• Applications
– Weather forecasting
– Enrollment assessment
– Sequence generation
– Rank the web page
– Life cycle analysis
• Summary

History
• The origin of Markov chains is due to Markov, a Russian mathematician who in 1907 published, in the Imperial Academy of Sciences in St. Petersburg, a paper studying the statistical behavior of the letters in Onegin, a well-known poem by Pushkin.

A Markov Chain
[Figure: a two-state chain with states "0" and "1", self-loop probabilities P00 and P11, and cross-transition probabilities P01 and P10.]

Transition Probability Table

$$P = \begin{pmatrix} P_{11} & P_{12} & P_{13} \\ P_{21} & P_{22} & P_{23} \\ P_{31} & P_{32} & P_{33} \end{pmatrix} = \begin{pmatrix} 0.7 & 0.2 & 0.1 \\ 0 & 0.6 & 0.4 \\ 0.3 & 0.5 & 0.2 \end{pmatrix}$$

$$P_{ij} \ge 0, \quad i = 1, \ldots, n; \; j = 1, \ldots, n; \qquad \sum_{j=1}^{n} P_{ij} = 1$$

Example 1: Weather Forecasting [1]

Weather Forecasting
• Suppose tomorrow's weather depends on today's weather only. We call this an order-1 Markov chain, since the transition function depends on the current state only.
• The states are sunny, rainy, and cloudy; with rows and columns ordered (sunny, rainy, cloudy), the transition matrix is

$$P = \begin{pmatrix} 0.5 & 0.4 & 0.1 \\ 0.3 & 0.4 & 0.3 \\ 0.2 & 0.3 & 0.5 \end{pmatrix}$$

• Given that today is sunny, what is the probability that the coming days are sunny, rainy, cloudy, cloudy, sunny?
• The answer is the product of the transition probabilities along that path: (0.5)(0.4)(0.3)(0.5)(0.2) = 0.006.

Weather Forecasting
• Given that today is sunny, what is the probability that it will be rainy 4 days later?
• Now we know only the start state, the final state, and the number of steps (4); there are many possible combinations of states in between.
• The Chapman-Kolmogorov equation sums over all of them:

$$P^{(n+m)}_{ij} = \sum_{k} P^{(n)}_{ik} P^{(m)}_{kj}$$

so the n-step transition probabilities are exactly the entries of the n-th power of the transition matrix.

Weather Forecasting
• Two days:

$$P^{(2)} = \begin{pmatrix} 0.5 & 0.4 & 0.1 \\ 0.3 & 0.4 & 0.3 \\ 0.2 & 0.3 & 0.5 \end{pmatrix}^{2} = \begin{pmatrix} 0.39 & 0.39 & 0.22 \\ 0.33 & 0.37 & 0.30 \\ 0.29 & 0.35 & 0.36 \end{pmatrix}$$

For example, the (sunny, rainy) entry is $P^{(2)}_{01} = P_{00}P_{01} + P_{01}P_{11} + P_{02}P_{21} = (0.5)(0.4) + (0.4)(0.4) + (0.1)(0.3) = 0.39$.

• Four days:

$$P^{(4)} = \left(P^{(2)}\right)^{2} = \begin{pmatrix} 0.3446 & 0.3734 & 0.2820 \\ 0.3378 & 0.3706 & 0.2916 \\ 0.3330 & 0.3686 & 0.2984 \end{pmatrix}$$

so the probability that it is rainy 4 days after a sunny day is $P^{(4)}_{01} = 0.3734$. (A short code sketch of this computation follows.)
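As a check on the two-day and four-day matrices above, here is a minimal Python sketch (NumPy-based; nothing in the slides prescribes this code) that computes n-step transition probabilities as matrix powers:

```python
import numpy as np

# Transition matrix from the weather example; rows/columns are (sunny, rainy, cloudy).
P = np.array([
    [0.5, 0.4, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
])

# The n-step transition probabilities are the entries of P**n (matrix power),
# i.e. the Chapman-Kolmogorov equation applied repeatedly.
P2 = np.linalg.matrix_power(P, 2)
P4 = np.linalg.matrix_power(P, 4)

print(P2)        # [[0.39 0.39 0.22] [0.33 0.37 0.30] [0.29 0.35 0.36]]
print(P4[0, 1])  # P(rainy 4 days after sunny) = 0.3734
```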
Weather Forecasting
• What is the probability that today is cloudy?
• There are infinitely many days before today, so this is equivalent to asking for the probability after an infinite number of steps.
• We do not care about the start state, since its effect fades away over infinitely many steps.
• We call the resulting distribution the "limiting probability": it is what the chain settles into once it becomes steady.

Weather Forecasting
• Since the start state does not matter, instead of forming a 2-D matrix we express the limiting probability as a single row vector $\pi = (\pi_0, \pi_1, \pi_2)$.
• Since the machine is steady, the limiting probability does not change even if the chain takes one more step:

$$(\pi_0, \pi_1, \pi_2) = (\pi_0, \pi_1, \pi_2) \begin{pmatrix} 0.5 & 0.4 & 0.1 \\ 0.3 & 0.4 & 0.3 \\ 0.2 & 0.3 & 0.5 \end{pmatrix}$$

Weather Forecasting
• So the limiting probability can be computed by solving $\pi = \pi P$ together with $\pi_0 + \pi_1 + \pi_2 = 1$.
• We get

$$(\pi_0, \pi_1, \pi_2) = \left( \tfrac{21}{62}, \tfrac{23}{62}, \tfrac{18}{62} \right)$$

so the probability that today is cloudy is $\tfrac{18}{62} \approx 0.29$. (A code sketch appears after Example 5.)

Example 2: Enrollment Assessment [1]

Undergraduate Enrollment Model
• States: Freshman, Sophomore, Junior, Senior, Stop Out, Graduate.

State Transition Probabilities

            Fr    So    Jr    Sr    S/O   Gr
      Fr    .2    .65   0     0     .14   .01
      So    0     .25   .6    0     .13   .02
TP =  Jr    0     0     .3    .55   .12   .03
      Sr    0     0     0     .4    .05   .55
      S/O   .1    .1    .4    .1    .3    0
      Gr    0     0     0     0     0     1

Enrollment Assessment
• Given the transition probability table and an initial enrollment estimate, we can estimate the number of students in each state at each time point. (A projection sketch appears after Example 5.)

Example 3: Sequence Generation [3]

Markov Chains as Models of Sequence Generation
• 0th-order: every symbol is drawn independently,

$$P^{0}(s_1 s_2 \cdots s_N) = \prod_{i=1}^{N} p(s_i)$$

e.g. for the DNA string $s = \texttt{ttacgg}$, $P^{0}(s) = p(\texttt{t})\,p(\texttt{t})\,p(\texttt{a})\,p(\texttt{c})\,p(\texttt{g})\,p(\texttt{g})$.
• 1st-order: each symbol depends on the previous one,

$$P^{1}(s_1 s_2 \cdots s_N) = p(s_1) \prod_{i=2}^{N} p(s_i \mid s_{i-1})$$

• 2nd-order: each symbol depends on the previous two,

$$P^{2}(s_1 s_2 \cdots s_N) = p(s_1)\, p(s_2 \mid s_1) \prod_{i=3}^{N} p(s_i \mid s_{i-2} s_{i-1})$$

A Fifth Order Markov Chain
[Figure: a fifth-order Markov chain, in which each symbol depends on the previous five.]

(A generation sketch appears after Example 5.)

Example 4: Rank the web page

PageRank
• How do we rank the importance of web pages?
[Figure: a small web graph with PageRank scores, http://en.wikipedia.org/wiki/Image:PageRanks-Example.svg]

PageRank: Markov Chain
• For N pages, say $p_1, \ldots, p_N$, write the equation to compute PageRank as

$$PR(p_i, n+1) = \sum_{j=1}^{N} l(i, j)\, PR(p_j, n)$$

where $l(i, j)$ is defined to be $1/L(p_j)$ if page $p_j$ links to page $p_i$ ($L(p_j)$ being the number of outbound links of $p_j$), and 0 otherwise.
• Written in matrix form:

$$\begin{pmatrix} PR(p_1, n+1) \\ PR(p_2, n+1) \\ \vdots \\ PR(p_N, n+1) \end{pmatrix} =
\begin{pmatrix} l(1,1) & l(1,2) & \cdots & l(1,N) \\ l(2,1) & l(2,2) & \cdots & l(2,N) \\ \vdots & & & \vdots \\ l(N,1) & \cdots & l(N,N-1) & l(N,N) \end{pmatrix}
\begin{pmatrix} PR(p_1, n) \\ PR(p_2, n) \\ \vdots \\ PR(p_N, n) \end{pmatrix}$$

(A power-iteration sketch appears after Example 5.)

Example 5: Life Cycle Analysis [4]

• How do we model the life cycles of whales?
[Image: http://www.specieslist.com/images/external/Humpback_Whale_underwater.jpg]

Life Cycle Analysis
• States: calf → immature → mature → mom → post-mom → dead.
• In real applications, we need to specify or learn the transition probability table.

Application: The North Atlantic right whale (Eubalaena glacialis)
(Slides by Hal Caswell, Markov Anniversary Meeting, June 2006.)
• [Figure: calving and feeding grounds along the North American Atlantic coast.]
• Endangered by any standard: N < 300 individuals, with minimal recovery since 1935.
• Main threats: ship strikes and entanglement with fishing gear.
• Mortality and serious injury due to entanglement and ship strikes: whale 2030 died October 1999 (entanglement); whale 1014, "Staccato", died April 1999 (ship strike).
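The examples above can all be checked with a few lines of code. First, the limiting probability from Example 1: a minimal Python sketch (NumPy-based; the function name is my own) that solves $\pi = \pi P$ together with $\sum_i \pi_i = 1$ as a linear system:

```python
import numpy as np

def limiting_probability(P):
    """Solve pi = pi @ P together with sum(pi) = 1 for the row vector pi."""
    n = P.shape[0]
    # pi (P - I) = 0 gives n balance equations (one is redundant);
    # append the all-ones row for the normalization constraint.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])
print(limiting_probability(P))  # ~ [0.3387 0.3710 0.2903] = (21/62, 23/62, 18/62)
```

This agrees with the hand computation: the chance that an arbitrary day is cloudy is about 0.29.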


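For Example 2, a sketch of projecting enrollment forward in time. The transition table is the one from the slides, but the initial head count of 1000 freshmen is invented for illustration:

```python
import numpy as np

# Transition table from Example 2; state order: Fr, So, Jr, Sr, StopOut, Grad.
TP = np.array([
    [0.2, 0.65, 0.0,  0.0,  0.14, 0.01],
    [0.0, 0.25, 0.6,  0.0,  0.13, 0.02],
    [0.0, 0.0,  0.3,  0.55, 0.12, 0.03],
    [0.0, 0.0,  0.0,  0.4,  0.05, 0.55],
    [0.1, 0.1,  0.4,  0.1,  0.3,  0.0],
    [0.0, 0.0,  0.0,  0.0,  0.0,  1.0],
])

# Hypothetical initial enrollment: 1000 freshmen, nothing else.
counts = np.array([1000.0, 0, 0, 0, 0, 0])

for year in range(1, 5):
    counts = counts @ TP  # expected head count in each state after one more year
    print(year, np.round(counts, 1))
```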

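For Example 3, a sketch of generating a DNA-like sequence from an order-1 chain. The slides give only the general formulas, so the transition probabilities below are invented for illustration:

```python
import random

# Hypothetical order-1 transition probabilities over the DNA alphabet.
trans = {
    'a': {'a': 0.1,  'c': 0.4,  'g': 0.4,  't': 0.1},
    'c': {'a': 0.3,  'c': 0.2,  'g': 0.2,  't': 0.3},
    'g': {'a': 0.25, 'c': 0.25, 'g': 0.25, 't': 0.25},
    't': {'a': 0.4,  'c': 0.1,  'g': 0.1,  't': 0.4},
}

def generate(start, length):
    """Sample a sequence of the given length from the order-1 chain."""
    seq = [start]
    for _ in range(length - 1):
        probs = trans[seq[-1]]  # distribution over the next symbol
        seq.append(random.choices(list(probs), weights=probs.values())[0])
    return ''.join(seq)

print(generate('t', 10))  # a random 10-letter sequence, e.g. 'ttacggtcat'
```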

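For Example 4, a power-iteration sketch of the PageRank equation. The 4-page link structure is invented for illustration, and damping is omitted to match the basic matrix form above:

```python
import numpy as np

# Hypothetical link matrix: L[i, j] = 1/outdegree(j) if page j links to page i.
# 4 pages with links: 0 -> {1, 2}, 1 -> {2}, 2 -> {0}, 3 -> {0, 2}.
L = np.array([
    [0.0, 0.0, 1.0, 0.5],
    [0.5, 0.0, 0.0, 0.0],
    [0.5, 1.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0],
])

pr = np.full(4, 0.25)  # start from the uniform distribution
for _ in range(100):
    pr = L @ pr        # PR(p_i, n+1) = sum_j l(i, j) PR(p_j, n)
print(pr)              # converges to ~ [0.4, 0.2, 0.4, 0.0]
```

Page 3 has no inbound links, so its rank decays to zero; the other ranks settle at the stationary distribution of the underlying chain.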
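For Example 5, once a transition table has been specified or learned, standard absorbing-chain machinery gives quantities such as expected remaining lifetime. A sketch with an invented six-state table (the numbers below are placeholders, not Caswell's estimates):

```python
import numpy as np

states = ['calf', 'immature', 'mature', 'mom', 'post-mom', 'dead']
# Hypothetical yearly transition table; 'dead' is absorbing.
P = np.array([
    [0.0, 0.85, 0.0,  0.0,  0.0,  0.15],  # calf
    [0.0, 0.70, 0.25, 0.0,  0.0,  0.05],  # immature
    [0.0, 0.0,  0.65, 0.30, 0.0,  0.05],  # mature
    [0.0, 0.0,  0.0,  0.0,  0.90, 0.10],  # mom
    [0.0, 0.0,  0.80, 0.0,  0.15, 0.05],  # post-mom
    [0.0, 0.0,  0.0,  0.0,  0.0,  1.0],   # dead
])

Q = P[:5, :5]                     # transitions among the live (transient) states
N = np.linalg.inv(np.eye(5) - Q)  # fundamental matrix: expected visits per state
life_expectancy = N.sum(axis=1)   # expected years remaining from each live state
for s, e in zip(states, life_expectancy):
    print(f'{s:9s} {e:5.1f} years')
```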