Markov Processes: Markov Chains

So what is a Markov process, or Markov chain? You might call it a shortcut. When we worked the previous problems where we had chains of probability, we had to build tree diagrams or extensive tables to calculate the probability after a number of steps. With the Markov process we can get it all done through matrix arithmetic.

Formally, we define a Markov chain to be a sequence of experiments, each of which results in one of a finite number of states $1, 2, 3, \ldots, m$. There is a similarity here to a Bernoulli trial, but these are not the same critters, as you will see from the examples.

The first step in working with these chains is to create a transition matrix $P$. This is a square matrix whose entries represent the probability of moving from one state to another: we let $p_{ij}$ represent the probability of moving from state $i$ to state $j$ in one observation (step). For example, look at the matrix below, with rows and columns ordered $s_1, s_2$:

$$P = \begin{pmatrix} 0.85 & 0.15 \\ 0.45 & 0.55 \end{pmatrix}$$

It tells us all of the following: if we start in state 1 ($s_1$), then after one step the probability of staying in $s_1$ is 0.85 and the probability of moving to $s_2$ is 0.15. If we start in state 2 ($s_2$), then after one step the probability of staying there is 0.55 and the probability of moving to $s_1$ is 0.45. Notice that each row in a transition matrix is a probability distribution, so the sum of the entries in each row is 1.

We can demonstrate the same information by a tree process, but once we get past about two steps, the tree becomes a forest. Here's how it would look in a video demonstration: Markov Maps.

In the video you saw the question marks; they signify the initial probability distribution $v^{(0)}$. This is a $1 \times n$ row vector (if there are $n$ states) whose entries represent the probability of starting in each respective state. For example, if there were two states and we had an equal chance of starting in either one, then the initial probability distribution would be $v^{(0)} = (0.5, 0.5)$. If we were guaranteed to start in state 1, it would be $v^{(0)} = (1, 0)$; if we were guaranteed to start in state 2, it would be $v^{(0)} = (0, 1)$.

The probability distribution after $k$ steps (observations, or stages) is denoted $v^{(k)}$ and can be calculated as

$$v^{(k)} = v^{(k-1)} P = v^{(0)} P^k,$$

where $P$ is the transition matrix.

Let's set up a couple of transition matrices, then manipulate them to see what the future will be. The Markov process is a great predictive tool for market analysis or psychological evaluation when we have reliable statistics about the probabilities involved.

Example 1: There are two insurance companies in town, Big Mega Inc. and Mom's Insurance, and everyone in town has one or the other. Every year, 5% of Big Mega's customers switch to Mom's, and 3% of Mom's customers move to Big Mega.

Create the transition matrix $P$. The matrix is below; I chose to label the states to remind me of where they came from (generically, we will use $s_1, s_2, \ldots, s_n$ in homework). Notice that we have complementary situations provided for the switchers: if 5% move, then 95% remain with Big Mega. With rows and columns ordered (BM, MI):

$$P = \begin{pmatrix} 0.95 & 0.05 \\ 0.03 & 0.97 \end{pmatrix}$$

So where does the initial probability distribution come into this? Suppose right now Big Mega has 94% of the customers in town and Mom's has the rest; find the initial probability distribution. The result is $v^{(0)} = (0.94, 0.06)$.

1. After one year, what percentage of the customers will each company have? The calculation is easy, especially with a calculator. Just find $v^{(1)} = v^{(0)} P = (0.8948, 0.1052)$.

2. After 5 years, what percentage will each have? Still easy. Just find $v^{(5)} = v^{(0)} P^5 \approx (0.747381, 0.252618)$.
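If you want to check these products with software rather than a calculator, here is a minimal sketch in Python with NumPy (the choice of Python/NumPy is my assumption; the handout itself uses a TI-83/84):

```python
import numpy as np

# Example 1 transition matrix; rows and columns ordered (Big Mega, Mom's).
P = np.array([[0.95, 0.05],
              [0.03, 0.97]])

# Initial distribution: Big Mega starts with 94% of the customers.
v0 = np.array([0.94, 0.06])

v1 = v0 @ P                             # distribution after one year
v5 = v0 @ np.linalg.matrix_power(P, 5)  # distribution after five years

print(v1)  # [0.8948 0.1052]
print(v5)  # approximately [0.747381 0.252618]
```

The same two lines answer "after $k$ years?" for any $k$ just by changing the exponent, which is exactly the point of $v^{(k)} = v^{(0)} P^k$.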
Notice that these are extended decimal forms. They are repeating, but good luck finding the denominator easily. We will let the calculator do the work and report at least 6 decimal places. Our answer is that about 74.7381% will be insured by Big Mega, while about 25.2618% will be insured by Mom's Insurance. Notice that I said "about": the two results do not add to 100% because we are not seeing all the significant digits of the calculation.

Example 2: A company sells three models of cars; call them A, B, and C. The company has found that the transition matrix below describes how customers change from one model to another annually. Notice that this is a valid transition matrix: every row sums to 1. We can interpret it to say that customer satisfaction with model A is low, since only 25% are willing to stay with it; of the remainder, 35% move up to B and 40% move to C. Customers are more satisfied with model B, since 65% stick with it; only 10% would fall back to A, while 25% move up to C. With rows and columns ordered (A, B, C):

$$P = \begin{pmatrix} 0.25 & 0.35 & 0.40 \\ 0.10 & 0.65 & 0.25 \\ 0 & 0 & 1 \end{pmatrix}$$

It does have the curious feature that once someone buys model C, they never leave it. We'll talk more about this in a later lesson.

1. What is the probability that someone will own model B after three years if the initial probability distribution is uniform? Recall that uniform means all states are equally likely; hence each has probability $\tfrac{1}{3}$, which gives us $v^{(0)} = \left(\tfrac{1}{3}, \tfrac{1}{3}, \tfrac{1}{3}\right)$. We need to calculate $v^{(3)}$:

$$v^{(3)} = v^{(0)} P^3 = \left(\tfrac{1}{3}, \tfrac{1}{3}, \tfrac{1}{3}\right) \begin{pmatrix} 0.25 & 0.35 & 0.40 \\ 0.10 & 0.65 & 0.25 \\ 0 & 0 & 1 \end{pmatrix}^{3} = \left(\tfrac{331}{8000}, \tfrac{757}{4000}, \tfrac{1231}{1600}\right)$$

So the probability of owning model B after three years is $\tfrac{757}{4000} = 0.18925$. Again, notice that this time I got the denominators: I used the TI-83/84 Math▸Frac operator, and it found them for me. Whenever possible, use this option; it will simplify your results and give an exact result when it works. (I did try it on the previous example. It failed me.)

Example 3: A company produces 4 products; call them A, B, C, and D. It discovers that customers prefer the products in the weightings 2 : 3 : 5 : 10 if they start with product A. However, starting with product B, the weightings change to 5 : 3 : 2 : 10. Starting with product C we see 4 : 4 : 10 : 2. And finally, when they start with product D, we find these weightings: 2 : 2 : 2 : 14.

1. Create the transition matrix. Since the weightings provided for the first row are 2, 3, 5, 10, and they total 20, this creates probabilities of $\tfrac{2}{20}, \tfrac{3}{20}, \tfrac{5}{20}, \tfrac{10}{20}$ for row A. These were reduced to the decimal values shown below, and the other rows were created similarly, since each row of weightings also totals 20 (a code sketch of this normalization appears at the end of the section). With rows and columns ordered (A, B, C, D):

$$P = \begin{pmatrix} 0.10 & 0.15 & 0.25 & 0.50 \\ 0.25 & 0.15 & 0.10 & 0.50 \\ 0.20 & 0.20 & 0.50 & 0.10 \\ 0.10 & 0.10 & 0.10 & 0.70 \end{pmatrix}$$

2. …
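The exact fractions that Math▸Frac produced are easy to reproduce in Python with the standard fractions module; here is a minimal sketch under the same Python assumption (the helper name `step` is mine, not from the text):

```python
from fractions import Fraction as F

# Example 2 transition matrix with exact rational entries (rows ordered A, B, C).
P = [[F(1, 4),  F(7, 20),  F(2, 5)],
     [F(1, 10), F(13, 20), F(1, 4)],
     [F(0),     F(0),      F(1)]]

def step(v, P):
    """One observation: multiply the row vector v by the matrix P."""
    n = len(P)
    return [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]

v = [F(1, 3)] * 3   # uniform initial distribution
for _ in range(3):  # three annual observations
    v = step(v, P)

print(v)  # [Fraction(331, 8000), Fraction(757, 4000), Fraction(1231, 1600)]
```

Because Fraction arithmetic is exact, the three entries sum to exactly 1, which is a handy sanity check on any transition-matrix calculation.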
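For part 1 of Example 3, the weighting-to-probability conversion is one row normalization in NumPy; here is a sketch, again assuming Python is available (the variable names are mine):

```python
import numpy as np

# Raw preference weightings; each row describes customers starting from one product.
weights = np.array([[ 2,  3,  5, 10],   # starting with A
                    [ 5,  3,  2, 10],   # starting with B
                    [ 4,  4, 10,  2],   # starting with C
                    [ 2,  2,  2, 14]])  # starting with D

# Divide each row by its sum (20 in every case) to turn weightings into probabilities.
P = weights / weights.sum(axis=1, keepdims=True)
print(P)
# [[0.1  0.15 0.25 0.5 ]
#  [0.25 0.15 0.1  0.5 ]
#  [0.2  0.2  0.5  0.1 ]
#  [0.1  0.1  0.1  0.7 ]]
```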