UI CS 448 - Usage Model & Markov Process

CS448/548 Survivable Systems and Networks, Sequence 14 (© 2008 A.W. Krings)

Slide 1: Usage Model & Markov Process
• Before dealing with the issue of usage models, we need to cover/review the basics of Markov chains.
  – some of you have seen this material in other classes
  – we will only discuss the very basics
  – for details about Markov models, please refer to the general literature

Slide 2: Usage Model & Markov Process
• Markov analysis of software specifications
  – this can be used as a tool in determining "what is normal behavior"
  – anything "not normal" must deviate from the normal behavior in some way
• A stochastic process is a function whose values are random variables.
• The classification of a random process depends on several quantities:
  – state space
  – index (time) parameter
  – statistical dependencies among the random variables X(t) for different values of the index parameter t

Slide 3: Markov Process
• State space
  – the set of possible values (states) that X(t) might take on
  – if the states form a finite or countable set => discrete-state process, or chain
  – if the states form a continuous interval => continuous-state process
• Index (time) parameter
  – if the times at which changes may take place are finite or countable => discrete-(time-)parameter process
  – if changes may occur anywhere within a finite or infinite interval on the time axis => continuous-parameter process

Slide 4: Markov Process
• In 1907 A.A. Markov published a paper in which he defined and investigated the properties of what are now known as Markov processes.
• A Markov process with a discrete state space is referred to as a Markov chain.
• A set of random variables forms a Markov chain if the probability that the next state is S(n+1) depends only on the current state S(n), and not on any previous states.

Slide 5: Markov Process
• States must be
  – mutually exclusive
  – collectively exhaustive
• Let Pi(t) = probability of being in state Si at time t.
• Markov property
  – the future state probability depends only on the current state
    » independent of the time already spent in the state
    » independent of the path taken to reach the state

Slide 6: Markov Process
• Let P(transition out of state i in Δt) = λi(t)·Δt, for small Δt.
• Mean time to transition (exponential holding times) = 1/λi.
• If the λ's are not functions of time, i.e. if λi(t) = λi for all t
  – homogeneous Markov chain

Slide 7: Markov Process
• Accessibility
  – state Si is accessible from state Sj if there is a sequence of transitions from Sj to Si
• Recurrent state
  – state Si is called recurrent if Si can be returned to from every state that is accessible from Si
• Non-recurrent (transient) state
  – Si is non-recurrent if there exists at least one state accessible from Si with no return path to Si

Slide 8: Markov Process
[Figure: sample chain] Which states are recurrent or non-recurrent?

Slide 9: Markov Process
• Classes of states
  – a set of (recurrent) states such that any state in the class is reachable from any other state in the class
  – note: two classes must be disjoint, since a common state would imply that states from both classes are accessible to each other
• Absorbing state
  – a state Si is absorbing iff pii = 1, i.e. once the state is entered it is never left
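The slides give the definitions but this preview carries no worked example, so here is a minimal Python sketch that is not part of the lecture material: the three-state transition matrix P, the state names, and the rate lam are invented illustration values. The step function draws the next state from the current state alone, which is the Markov property of slide 4, and the last lines sample exponential holding times to confirm the mean time to transition of 1/λ from slide 6.

    import random

    # Invented 3-state chain for illustration; row i of P holds the probabilities
    # of moving from state i to each state j in one step (each row sums to 1).
    STATES = ["S0", "S1", "S2"]
    P = [[0.5, 0.3, 0.2],
         [0.1, 0.6, 0.3],
         [0.0, 0.4, 0.6]]

    def step(i):
        # Markov property: the next state is drawn from row P[i] only, i.e. from
        # the current state; the path that led to state i plays no role.
        r, acc = random.random(), 0.0
        for j, p in enumerate(P[i]):
            acc += p
            if r < acc:
                return j
        return len(P[i]) - 1   # guard against floating-point round-off

    def simulate(start, n_steps):
        path = [start]
        for _ in range(n_steps):
            path.append(step(path[-1]))
        return path

    print([STATES[i] for i in simulate(0, 10)])

    # Continuous-time side note (slide 6): with an assumed constant rate lam out of
    # a state, holding times are exponential and their mean is 1/lam.
    lam = 2.0
    samples = [random.expovariate(lam) for _ in range(100_000)]
    print(sum(samples) / len(samples), "should be close to", 1 / lam)

Slide 8 asks which states of a sample chain are recurrent or non-recurrent, but its figure is not included in this preview. The sketch below classifies an invented chain instead, applying the definitions of slides 7 and 9 directly: a state is absorbing if its only transition is to itself, recurrent if every state accessible from it can reach it back, and non-recurrent (transient) otherwise.

    # Invented sample chain: succ[i] is the set of states reachable from i in one
    # step (i.e. transitions with nonzero probability).
    succ = {
        0: {0, 1},
        1: {2},
        2: {1, 3},
        3: {3},      # self-loop only -> absorbing
    }

    def reachable(start):
        # All states accessible from `start` via some sequence of transitions.
        seen, stack = set(), [start]
        while stack:
            i = stack.pop()
            for j in succ[i]:
                if j not in seen:
                    seen.add(j)
                    stack.append(j)
        return seen

    reach = {i: reachable(i) for i in succ}

    for i in succ:
        if succ[i] == {i}:
            kind = "absorbing"
        elif all(i in reach[j] for j in reach[i]):
            # every state accessible from i has a return path to i
            kind = "recurrent"
        else:
            kind = "non-recurrent (transient)"
        print(f"S{i}: {kind}")

Running this prints S0, S1, and S2 as transient and S3 as absorbing, since every path eventually leaks into the self-loop at S3 and cannot return.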
Slide 10: Markov Process
• Irreducible Markov chain
  – a Markov chain is called irreducible if the entire system is one class => there is no absorbing state => there is no absorbing subgraph, i.e. no absorbing subset of states
• What if the chain is not irreducible?
  – what are the implications?

Slide 11: Markov Process
• Steady-state solution
  – as time goes toward infinity
• Transient solution
  – when is this of ...
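Slide 11 names the steady-state solution without showing how it is computed. As one common approach (a sketch reusing the invented transition matrix from the earlier example, and assuming an irreducible, homogeneous discrete-time chain), the stationary vector pi satisfies pi = pi·P; the balance equations alone are linearly dependent, so one of them is replaced by the normalization sum(pi) = 1 and the resulting square system is solved:

    import numpy as np

    # Same invented transition matrix as in the earlier sketch.
    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.0, 0.4, 0.6]])
    n = P.shape[0]

    # Balance equations pi (P - I) = 0 are rank-deficient, so drop the last one
    # and append the normalization constraint sum(pi) = 1.
    A = np.vstack([(P.T - np.eye(n))[:-1], np.ones(n)])
    b = np.zeros(n)
    b[-1] = 1.0
    pi = np.linalg.solve(A, b)

    print("steady state:", pi)       # approx. [0.098, 0.488, 0.415] for this P
    print("check pi @ P:", pi @ P)   # should reproduce pi

The printed check that pi @ P reproduces pi is exactly the "as time goes toward infinity" behavior the slide refers to: once the chain's state distribution reaches pi, further steps leave it unchanged.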

