MIT MAS 160 - Balance

Balance

Suppose you have eight billiard balls. One of them is defective: it weighs more than the others. How do you tell, using a balance, which ball is defective in two weighings?

Information Theory

How do you define the information a message carries?
How much information does a message carry? How much of a message is redundant?
How do we measure information, and what are its units?
How do we model a transmitter of messages?
What is the average rate of information a transmitter generates?
How much capacity does a communication channel have (with a given data format and data frequency)?
Can we remove the redundancy from a message to fill the capacity of a channel? (lossless compression)
How much can we compress a message and still exactly recover the message?
How does noise affect the capacity of a channel?
Can we use redundancy to accurately recover a signal sent over a noisy line? (error correction)

Information Theory

[Diagram: information source -> transmitter (encode) -> signal over communication channel, corrupted by a noise source -> receiver (decode) -> destination.]

The information source selects a desired message from a set of possible messages, or selects a sequence of symbols from a set of symbols to represent a message, e.g.

  message1 = symbol1, symbol2
  message2 = symbol3, symbol5

The destination decides which message, among the set of (agreed) possible messages, the information source sent.

Why are we interested in Markov Models?

We can represent an information source as an engine creating symbols at some rate according to probabilistic rules. The Markov model represents those rules as transition probabilities between symbols. Based on these probabilities, we can define the amount of information, I, that a symbol carries, and the average rate of information, or entropy, H, that a system generates.
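Such an "engine creating symbols according to probabilistic rules" can be sketched directly. The following is a minimal Python sketch, not part of the original notes: the function name is my choice, and the transition matrix is the three-symbol Shannon & Weaver example used later in these notes.

```python
import random

# A toy Markov symbol source: the next symbol is drawn according to
# fixed transition probabilities conditioned on the current symbol.
# (Matrix from the Shannon & Weaver example, pg. 41, used below.)
TRANSITIONS = {
    'A': {'A': 0.0, 'B': 4/5, 'C': 1/5},
    'B': {'A': 1/2, 'B': 1/2, 'C': 0.0},
    'C': {'A': 1/2, 'B': 2/5, 'C': 1/10},
}

def emit(n, state='A', seed=0):
    """Generate n symbols from the source, starting in `state`."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        row = TRANSITIONS[state]
        symbols, probs = zip(*row.items())
        state = rng.choices(symbols, weights=probs)[0]
        out.append(state)
    return ''.join(out)

sample = emit(30)
print(sample)  # a stream of A/B/C; 'AA' and 'BC' never appear (prob. 0)
```

Over a long run the empirical symbol frequencies approach the steady-state probabilities discussed below (9/27, 16/27, 2/27).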
Discrete Markov Chain

In the long term, each symbol has a certain steady-state probability. [Diagram: a three-symbol chain over A, B, C with transition probabilities 4/5, 1/5, 1/2, 1/2, 1/2, 2/5, 1/10 (the Shannon & Weaver example below), emitting a sequence such as ACBBA...; its steady-state vector is vss = [9/27, 16/27, 2/27].]

(Definitions from http://people.hofstra.edu/faculty/Stefan_Waner/RealWorld/Summary8.html)

A Markov system (or Markov process or Markov chain) is a system that can be in one of several (numbered) states, and can pass from one state to another at each time step according to fixed probabilities. If a Markov system is in state i, there is a fixed probability, pij, of it going into state j at the next time step; pij is called a transition probability.

A Markov system can be illustrated by means of a state transition diagram, which is a diagram showing all the states and transition probabilities.

The matrix P whose ijth entry is pij is called the transition matrix associated with the system. The entries in each row add up to 1. For example, a three-state system (rows = "from" state, columns = "to" state):

  P = [ 0.2  0.8   0    ]   (arrows originating in State 1)
      [ 0.4  0     0.6  ]   (arrows originating in State 2)
      [ 0.5  0.35  0.15 ]   (arrows originating in State 3)

1-step Distribution

If v is an initial probability distribution vector and P is the transition matrix for a Markov system, then the probability vector after 1 step is the matrix product vP.

Shannon & Weaver's example (pg. 41) uses symbols A, B, C with initial probabilities

  p(A) = 9/27,  p(B) = 16/27,  p(C) = 2/27

and transition probabilities pi(j):

           j=A   j=B   j=C
  i=A       0    4/5   1/5
  i=B      1/2   1/2    0
  i=C      1/2   2/5   1/10

With v = [9/27, 16/27, 2/27],

  vP = [9/27, 16/27, 2/27] = v,

so this distribution is already the steady state.

n-steps Distribution

Distribution after 1 step: vP.
Distribution after 2 steps: the distribution one step later, obtained by again multiplying by P, is (vP)P = vP^2.
Distribution after n steps: similarly, the distribution after n steps can be obtained by multiplying v on the right by P n times: (vP)P...P = vP^n.

The ijth entry in P^n is the probability that the system will pass from state i to state j in n steps.

Stationary

What happens as the number of steps n goes to infinity? If the higher and higher powers of P approach a fixed matrix P^inf, we refer to P^inf as the steady-state or long-term transition matrix. A steady-state probability vector vss = [vx vy vz ...] is then given by

  vss P = vss,  with  vx + vy + vz + ... = 1

(n+1 equations in n unknowns), and every row of P^inf equals vss:

  P^inf = [ vx  vy  vz ]
          [ vx  vy  vz ]
          [ vx  vy  vz ]

Examples

Let v = [0.2, 0.4, 0.4] be an initial probability distribution and

  P = [ 0.2  0.8  0   ]
      [ 0.4  0    0.6 ]
      [ 0.5  0.5  0   ]

Then the distribution after one step is

  vP = [0.4, 0.36, 0.24]

For example, its first entry is (0.2)(0.2) + (0.4)(0.4) + (0.4)(0.5) = 0.04 + 0.16 + 0.20 = 0.4.

The two-step distribution, one step later, is

  vP^2 = (vP)P = [0.4, 0.36, 0.24] P = [0.344, 0.44, 0.216]

To obtain the two-step transition matrix, we calculate

  P^2 = [ 0.36  0.16  0.48 ]
        [ 0.38  0.62  0    ]
        [ 0.3   0.4   0.3  ]

Thus, for example, the probability of going from State 3 to State 1 in two steps is the (3,1) entry of P^2, namely 0.3.

The steady-state distribution is given by vss P = vss, i.e.

  0.2 vx + 0.4 vy + 0.5 vz = vx
  0.8 vx          + 0.5 vz = vy
            0.6 vy         = vz
       vx + vy + vz        = 1

which gives

  vss = [0.354, 0.404, 0.242]

  P^inf = [ 0.354  0.404  0.242 ]
          [ 0.354  0.404  0.242 ]
          [ 0.354  0.404  0.242 ]

Digram probabilities

What are the relative frequencies of combinations of symbols ij = AA, AB, AC, ... (digrams)? That is, what is the joint probability p(i,j)? For the Shannon & Weaver source above, p(i,j) = p(i) pi(j):

           j=A    j=B    j=C
  i=A       0     4/15   1/15
  i=B      8/27   8/27    0
  i=C      1/27   4/135  1/135

p(A,A) = p(B,C) = 0: the digrams AA and BC never occur. p(B,A) and p(B,B) occur most often, 8/27 of the time each.
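Both worked examples above are easy to verify numerically. Here is a minimal sketch in plain Python (the helper name `mat_vec` is mine, not from the notes): it iterates v -> vP to reach the steady state of the three-state example, then rebuilds the digram table from p(i,j) = p(i) pi(j).

```python
def mat_vec(v, P):
    # Row vector times matrix: (vP)_j = sum_i v_i * P[i][j]
    n = len(v)
    return [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.2, 0.8, 0.0],
     [0.4, 0.0, 0.6],
     [0.5, 0.5, 0.0]]

# Repeatedly multiply by P: vP, vP^2, ... converges to the steady state.
v = [0.2, 0.4, 0.4]
for _ in range(100):
    v = mat_vec(v, P)
print([round(x, 3) for x in v])   # -> [0.354, 0.404, 0.242]

# Digram probabilities for the Shannon & Weaver source: p(i,j) = p(i) pi(j)
p = {'A': 9/27, 'B': 16/27, 'C': 2/27}
T = {'A': {'A': 0,   'B': 4/5, 'C': 1/5},
     'B': {'A': 1/2, 'B': 1/2, 'C': 0},
     'C': {'A': 1/2, 'B': 2/5, 'C': 1/10}}
digram = {(i, j): p[i] * T[i][j] for i in p for j in p}
print(round(digram[('B', 'A')], 4))   # -> 0.2963, i.e. 8/27
```

Power iteration like this is the simplest way to find vss; solving the linear system vss P = vss with the normalization constraint, as done above, gives the same answer exactly.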