Kristina King
Definition 1.2. A discrete random variable is one for which the set of possible values is
either finite or countably infinite.
Definition 1.3. The random variables X and Y are said to be independent if, for all a, b,
$$P\{X \le a,\, Y \le b\} = P\{X \le a\}\,P\{Y \le b\}.$$
Definition 1.7. When T is a countable set, the stochastic process is a discrete-time process.
Example 2.1. Assume, for example, that $X_n$ denotes the price of a stock at the end of $n$ trading days. It would be unreasonable to assume that $X_n$ is independent of $X_{n-1}, X_{n-2}, \ldots, X_0$. Some processes, however, depend only on the present state, not on all of the previous states. Such a process is known as a Markov chain. That is, a Markov chain is a process for which the distribution of a future state depends only on the present state and is independent of all previous states.
Definition 2.2. The one-step transition probabilities from state $i$ to state $j$ are given by
$$P_{ij} = P\{X_{n+1} = j \mid X_n = i\}.$$
We can draw a state diagram for this Markov chain in the following way.
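One way to make the one-step probabilities concrete: for a chain we can simulate, $P_{ij}$ is the long-run frequency of one-step moves from $i$ to $j$. Below is a minimal sketch using a hypothetical two-state matrix (not any chain defined in these notes):

```python
import random

# A hypothetical two-state chain with one-step transition matrix P,
# where P[i][j] = P{X_{n+1} = j | X_n = i}.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def step(i, rng):
    """Sample the next state given the current state i."""
    return 0 if rng.random() < P[i][0] else 1

def estimate_row(i, n_samples=100_000, seed=0):
    """Estimate the transition probabilities out of state i by simulation."""
    rng = random.Random(seed)
    counts = [0, 0]
    for _ in range(n_samples):
        counts[step(i, rng)] += 1
    return [c / n_samples for c in counts]

print(estimate_row(0))  # close to [0.7, 0.3]
```

With 100,000 samples, the empirical frequencies should match the true row of P to within about a hundredth.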
3 Chapman-Kolmogorov equations
Definition 3.1. The n-step transition probability $P_{ij}^n$ is the probability that a process in state $i$ will be in state $j$ after $n$ additional transitions. That is,
$$P_{ij}^n = P\{X_{n+m} = j \mid X_m = i\}, \qquad n \ge 0.$$
Theorem 3.1. The Chapman-Kolmogorov equations are given by, for all $i, j$ and all $n, m \ge 0$,
$$P_{ij}^{n+m} = \sum_{k=0}^{\infty} P_{ik}^n P_{kj}^m.$$
Proof. Conditioning on the state after $n$ transitions,
$$P_{ij}^{n+m} = P\{X_{n+m} = j \mid X_0 = i\} = \sum_{k=0}^{\infty} P\{X_{n+m} = j, X_n = k \mid X_0 = i\}$$
$$= \sum_{k=0}^{\infty} P\{X_{n+m} = j \mid X_n = k\}\,P\{X_n = k \mid X_0 = i\} = \sum_{k=0}^{\infty} P_{ik}^n P_{kj}^m,$$
where the second-to-last equality uses the Markov property.
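The identity can also be checked numerically. Here is a minimal sketch using a hypothetical three-state transition matrix (any stochastic matrix would do):

```python
# Numerical check of the Chapman-Kolmogorov equations on a
# hypothetical three-state chain.
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]

def matmul(A, B):
    """Multiply two square matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def power(A, n):
    """n-th matrix power by repeated multiplication (n >= 1)."""
    out = A
    for _ in range(n - 1):
        out = matmul(out, A)
    return out

n, m = 2, 3
Pn, Pm, Pnm = power(P, n), power(P, m), power(P, n + m)

# Entrywise: P_ij^{n+m} should equal sum_k P_ik^n * P_kj^m.
for i in range(3):
    for j in range(3):
        rhs = sum(Pn[i][k] * Pm[k][j] for k in range(3))
        assert abs(Pnm[i][j] - rhs) < 1e-12
```

The finite sum over k suffices here because the hypothetical chain has only three states.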
Theorem 3.2. Let $P^{(n)}$ be the matrix of n-step transition probabilities. Then $P^{(n)} = P^n$. That is, the n-step transition matrix can be obtained by multiplying the one-step transition matrix by itself $n$ times.
Proof. Let $P^{(n)}$ be the matrix of n-step transition probabilities. Then by the Chapman-Kolmogorov equations, $P^{(n+m)} = P^{(n)} P^{(m)}$, where standard matrix multiplication is used. This is always possible because $P$ will always be a square matrix. Note that $P^{(2)} = P^{(1+1)} = P^{(1)} P^{(1)} = PP = P^2$. Assume it is true that $P^{(n-1)} = P^{n-1}$. Then by the Chapman-Kolmogorov equations, $P^{(n)} = P^{(n-1)} P^{(1)}$. It follows, then, that $P^{(n)} = P^{n-1} P = P^n$. By induction, we have proven it to be true for all $n \ge 1$.
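Theorem 3.2 suggests computing n-step probabilities directly as a matrix power. As a sanity check, the sketch below (using a hypothetical two-state matrix, not one from these notes) compares the matrix power against a brute-force sum over all length-n paths, which is what $P_{ij}^n$ means by definition:

```python
from itertools import product

# A hypothetical two-state one-step transition matrix.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def matrix_power(A, n):
    """n-th matrix power by repeated multiplication (n >= 1)."""
    n_states = len(A)
    out = A
    for _ in range(n - 1):
        out = [[sum(out[i][k] * A[k][j] for k in range(n_states))
                for j in range(n_states)] for i in range(n_states)]
    return out

def n_step_by_paths(i, j, n):
    """P_ij^n computed directly by summing over all length-n paths."""
    total = 0.0
    for mid in product(range(len(P)), repeat=n - 1):
        path = (i, *mid, j)
        prob = 1.0
        for a, b in zip(path, path[1:]):
            prob *= P[a][b]
        total += prob
    return total

P4 = matrix_power(P, 4)
assert abs(P4[0][1] - n_step_by_paths(0, 1, 4)) < 1e-12
```

The path enumeration grows exponentially in n, which is exactly why the matrix-power form of the theorem is the practical way to compute $P^{(n)}$.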
Example 3.1. Reconsider Example 2.2. If the market was stagnant three weeks ago, what
is the probability it will not be stagnant this week?
The market being stagnant 3 weeks ago and not stagnant this week corresponds to the entries $P_{20}^3$ and $P_{21}^3$. That is, the probability of not being stagnant this week is given by $P_{20}^3 + P_{21}^3 = 0.4675 + 0.37125 = 0.83875$. There is an 83.875% chance that the market will not be stagnant this week given that it was stagnant 3 weeks ago.
4 Classification of states
Definition 4.1. State $j$ is accessible from state $i$ if $P_{ij}^n > 0$ for some $n \ge 0$.
Definition 4.2. Two states $i$ and $j$ that are accessible to each other are said to communicate.
Theorem 4.1. Communication satisfies the following 3 conditions:
1. State $i$ communicates with state $i$, for all $i \ge 0$.
2. If state $i$ communicates with state $j$, then state $j$ communicates with state $i$.
3. If state $i$ communicates with state $j$, and state $j$ communicates with state $k$, then state $i$ communicates with state $k$.
Proof. The first two properties follow immediately from the definition of communication. To
prove the third property, suppose i communicates with j, and j communicates with k. Thus,
there exist integers $m$ and $n$ such that $P_{ij}^n > 0$ and $P_{jk}^m > 0$. Now, by the Chapman-Kolmogorov equations,
$$P_{ik}^{n+m} = \sum_{r=0}^{\infty} P_{ir}^n P_{rk}^m \ge P_{ij}^n P_{jk}^m > 0.$$
This tells us that k is accessible from i, and similarly we can show that i is accessible from
k. So i and k communicate.
Definition 4.3. Two states that communicate are said to be in the same class.
Definition 4.4. A Markov chain is irreducible if there is only one class, that is, if all
states communicate with each other.
Example 4.1. Consider a Markov chain with four states and the transition probability matrix
$$P = \begin{pmatrix} 1/2 & 1/2 & 0 & 0 \\ 1/2 & 1/2 & 0 & 0 \\ 1/4 & 1/4 & 1/4 & 1/4 \\ 0 & 0 & 0 & 1 \end{pmatrix}$$
Even though we can access 0 or 1 from 2, we cannot access 2 from either of these states.
No state is accessible from 3 except itself. Thus, the classes for this Markov chain are {0, 1},
{2}, and {3}.
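This class decomposition can be computed mechanically: state $j$ is accessible from $i$ exactly when $j$ is reachable from $i$ in the directed graph with an edge $i \to j$ wherever $P_{ij} > 0$. A sketch for the Example 4.1 matrix:

```python
# Communicating classes for the Example 4.1 chain, found via
# reachability in the graph with an edge i -> j whenever P[i][j] > 0.
P = [[1/2, 1/2, 0,   0],
     [1/2, 1/2, 0,   0],
     [1/4, 1/4, 1/4, 1/4],
     [0,   0,   0,   1]]

def reachable(start):
    """All states j accessible from start (P_ij^n > 0 for some n >= 0)."""
    seen = {start}
    frontier = [start]
    while frontier:
        i = frontier.pop()
        for j, p in enumerate(P[i]):
            if p > 0 and j not in seen:
                seen.add(j)
                frontier.append(j)
    return seen

acc = [reachable(i) for i in range(4)]
# States i and j communicate when each is accessible from the other.
classes = {frozenset(j for j in range(4) if i in acc[j] and j in acc[i])
           for i in range(4)}
print(sorted(sorted(c) for c in classes))  # [[0, 1], [2], [3]]
```

The output matches the classes found above: states 0 and 1 communicate, while 2 and 3 each sit in their own class (2 can reach 0 and 1 but cannot be reached from them, and 3 is absorbing).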
References
Ross, S. M., 2010: Introduction to Probability Models. Elsevier, 784 pp.