So today’s topic, as I mentioned earlier, is “Markov Chains”. I shall begin with a definition and then prove a theorem which can also be taken as a simplified definition.

Definition (Stochastic matrix) A stochastic matrix is a real-valued matrix $P = (p(x,y))_{x,y \in X}$ parametrized by a finite set $X$ such that

a) $p(x,y) \geq 0$ for all $x, y \in X$;

b) $\sum_{y \in X} p(x,y) = 1$ for all $x \in X$.
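Conditions (a) and (b) are easy to check numerically. Here is a minimal sketch in Python; the matrix `P` below is an illustrative two-state example, not one from the text:

```python
import numpy as np

def is_stochastic(P, tol=1e-12):
    """Check that P has non-negative entries and that every row sums to 1."""
    P = np.asarray(P, dtype=float)
    return bool((P >= 0).all() and np.allclose(P.sum(axis=1), 1.0, atol=tol))

# Each row is a probability distribution over the next state.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
print(is_stochastic(P))  # True
```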

Another definition which will be used soon is

Definition (Markov Chains) Let $X$ be a finite set with probability distribution $\nu$ and stochastic matrix $P = (p(x,y))$. A Markov chain with sample space $X$, probability distribution $\nu$ and transition matrix $P$ is a sequence of random variables $\xi_0, \xi_1, \xi_2, \ldots$ defined on $(\Omega, \mathbb{P})$, where $\Omega$ is a finite measure space, such that:

a) $\mathbb{P}(\xi_0 = x) = \nu(x)$ for all $x \in X$.

b) $\mathbb{P}(\xi_{n+1} = y \mid \xi_n = x, \xi_{n-1} = x_{n-1}, \ldots, \xi_0 = x_0) = p(x,y)$ whenever the conditioning event has positive probability.
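Conditions (a) and (b) also tell us how to simulate such a chain: draw $\xi_0$ from $\nu$, then repeatedly draw the next state from the row of $P$ indexed by the current state. A minimal sketch, assuming the same illustrative two-state example as above:

```python
import numpy as np

def sample_chain(nu, P, steps, rng):
    """Sample xi_0, ..., xi_steps: xi_0 ~ nu, then xi_{n+1} ~ P[xi_n, :]."""
    path = [rng.choice(len(nu), p=nu)]        # condition (a): xi_0 has law nu
    for _ in range(steps):
        path.append(rng.choice(len(nu), p=P[path[-1]]))  # condition (b)
    return path

rng = np.random.default_rng(0)
nu = np.array([0.5, 0.5])
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
path = sample_chain(nu, P, steps=5, rng=rng)  # a path xi_0, ..., xi_5
```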

Theorem (Equivalent property of being Markov) A finite sequence of random variables $\xi_0, \xi_1, \ldots, \xi_k$ taking values in $X$, with probability distribution $\nu$ on $X$ and stochastic matrix $P = (p(x,y))$, is a Markov chain if and only if

$\mathbb{P}(\xi_0 = x_0, \xi_1 = x_1, \ldots, \xi_k = x_k) = \nu(x_0)\, p(x_0, x_1) \cdots p(x_{k-1}, x_k)$

for all $x_0, x_1, \ldots, x_k \in X$.

Proof: If $\xi_0, \ldots, \xi_k$ is a Markov chain, then $\mathbb{P}(\xi_0 = x_0) = \nu(x_0)$ and $\mathbb{P}(\xi_{n+1} = x_{n+1} \mid \xi_0 = x_0, \ldots, \xi_n = x_n) = p(x_n, x_{n+1})$, and therefore $\mathbb{P}(\xi_0 = x_0, \ldots, \xi_{n+1} = x_{n+1})$ is equal to $\mathbb{P}(\xi_0 = x_0, \ldots, \xi_n = x_n)\, p(x_n, x_{n+1})$. Hence the given formula follows easily by induction. Conversely, suppose we are given $\mathbb{P}(\xi_0 = x_0, \ldots, \xi_k = x_k) = \nu(x_0) p(x_0, x_1) \cdots p(x_{k-1}, x_k)$. Then by summing over all $x_1, \ldots, x_k \in X$ and using that each row of $P$ sums to $1$, we get $\mathbb{P}(\xi_0 = x_0) = \nu(x_0)$ for all $x_0 \in X$. The rest of the theorem follows by similar easy calculations with conditional probabilities.

**An intuitive approach**: We consider time to be discrete. Then if at time $n$ our point is at $x$, the probability of it being at $y$ at time $n+1$ is equal to $p(x,y)$. Thus if the initial state of the particle is distributed according to $\nu$, then the probability that it follows the path $x_0, x_1, \ldots, x_k$ is $\nu(x_0) p(x_0, x_1) \cdots p(x_{k-1}, x_k)$. Hence

$latex \mathbb{P}(\xi_k = x) = \sum_{x_0 \in X} \nu(x_0) p^k(x_0, x)$

is the probability that at time $k$ the particle will be at $x$, where $p^k(x_0, x)$ denotes the $(x_0, x)$-entry of the matrix power $P^k$.
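This time-$k$ distribution is just a vector–matrix product: the row vector $\nu$ times the $k$-th power of the transition matrix. A numerical sketch, again using an illustrative two-state chain not taken from the text:

```python
import numpy as np

nu = np.array([0.5, 0.5])      # initial distribution nu
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])      # transition matrix P
k = 3

# P(xi_k = x) = sum_{x0} nu(x0) * p^k(x0, x), i.e. the row vector nu @ P^k.
dist_k = nu @ np.linalg.matrix_power(P, k)
print(dist_k, dist_k.sum())     # the entries form a probability distribution
```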

