Markov Chains
Topics
• State-transition matrix
• Network diagrams
• Examples: gambler’s ruin, brand switching, IRS, craps
• Transient probabilities
• Steady-state probabilities
Discrete-Time Markov Chains
Many real-world systems contain uncertainty and evolve over time.

Time: n = 0, 1, 2, . . .
State: v-dimensional vector, s = (s1, s2, . . . , sv)

In general, there are m states, s1, s2, . . . , sm or s0, s1, . . . , sm−1.
The state occupied at time n is the random variable Xn, so Xn ∈ S.
Gambler’s Ruin
At time 0 I have X0 = $2, and each day I make a $1 bet.
I win with probability p and lose with probability 1– p.
I’ll quit if I ever obtain $4 or if I lose all my money.
State space is S = { 0, 1, 2, 3, 4 }
So, X1 = 3 with probability p,
    X1 = 1 with probability 1 − p.
If Xn = 4, then Xn+1 = Xn+2 = ••• = 4.
State-transition matrix:

         0     1     2     3     4
  0  [   1     0     0     0     0  ]
  1  [  1−p    0     p     0     0  ]
  2  [   0    1−p    0     p     0  ]
  3  [   0     0    1−p    0     p  ]
  4  [   0     0     0     0     1  ]
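As an illustration, the transient behavior of this chain can be computed by propagating the distribution of Xn one bet at a time. The sketch below assumes p = 0.5 purely for concreteness (the slide leaves p general):

```python
# A sketch of transient probabilities for the gambler's ruin chain,
# assuming p = 0.5 for illustration. The distribution of X_n is
# propagated by dist_{n+1}[j] = sum_i dist_n[i] * P[i][j].

p = 0.5
P = [
    [1,     0,     0,     0, 0],   # state 0: absorbing (ruin)
    [1 - p, 0,     p,     0, 0],
    [0,     1 - p, 0,     p, 0],
    [0,     0,     1 - p, 0, p],
    [0,     0,     0,     0, 1],   # state 4: absorbing (quit a winner)
]

def step(dist):
    """One transition of the chain."""
    return [sum(dist[i] * P[i][j] for i in range(5)) for j in range(5)]

dist = [0, 0, 1, 0, 0]   # X_0 = $2 with certainty
for _ in range(30):      # 30 bets
    dist = step(dist)

print([round(q, 4) for q in dist])
```

With p = 0.5 nearly all mass has been absorbed after 30 bets, split evenly between ruin and the $4 target by symmetry.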
Computer Repair Example
• Two aging computers are used for word processing.
• When both are working in the morning, there is a 30% chance that one
will fail by the evening and a 10% chance that both will fail.
• Computers that fail during the day are picked up the following
morning, repaired, and then returned the next morning.
[Figure: state-transition network for the computer repair example, with
transition probabilities 1, 0.8, 0.2, 0.3, and 0.1 on the arcs between states.]
Procedure for Setting Up a DTMC
          0     1     2     3     4
    0  [ 0.6   0.3    0    0.1    0  ]
    1  [  0     0    0.8    0    0.2 ]
P = 2  [ 0.8   0.2    0     0     0  ]
    3  [  0     0     0     0     1  ]
    4  [  0     1     0     0     0  ]
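As an illustration (not part of the original slides), a quick simulation of this chain estimates the long-run fraction of time spent in each state; the state indices 0-4 follow the matrix above:

```python
# A sketch of simulating the 5-state repair chain: sample transitions
# with the matrix above and tally state visits. State indices 0-4 are
# assumptions matching the rows of the matrix.
import random

P = [[0.6, 0.3, 0.0, 0.1, 0.0],
     [0.0, 0.0, 0.8, 0.0, 0.2],
     [0.8, 0.2, 0.0, 0.0, 0.0],
     [0.0, 0.0, 0.0, 0.0, 1.0],
     [0.0, 1.0, 0.0, 0.0, 0.0]]

random.seed(1)
counts = [0] * 5
state = 0
N = 100_000
for _ in range(N):
    state = random.choices(range(5), weights=P[state])[0]
    counts[state] += 1

freq = [c / N for c in counts]
print([round(f, 3) for f in freq])
```

With enough steps, the estimated frequencies settle near the chain's stationary distribution (roughly 0.41, 0.26, 0.20, 0.04, 0.09 by solving π = πP by hand).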
Transition probabilities estimated from purchase data (number of observed
switches divided by the row total):

  i         1                2                3
  1   90/100 = 0.90     7/100 = 0.07     3/100 = 0.03
  2    5/250 = 0.02   205/250 = 0.82    40/250 = 0.16
  3   30/150 = 0.20    18/150 = 0.12   102/150 = 0.68
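The estimates in this table are simply row-normalized counts, as a short sketch makes explicit:

```python
# A sketch of forming the brand-switching transition matrix: each row
# of observed switch counts is divided by its row total. Counts are
# the ones shown in the table above.

counts = [
    [90, 7, 3],      # from brand 1 (100 purchases)
    [5, 205, 40],    # from brand 2 (250 purchases)
    [30, 18, 102],   # from brand 3 (150 purchases)
]

P = [[c / sum(row) for c in row] for row in counts]
print(P)
```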
Markov Analysis
• State variable, Xn = brand purchased in week n
• {Xn} represents a discrete state and discrete time stochastic
process, where S = {1, 2, 3} and N = {0, 1, 2, . . .}.
• If {Xn} has the Markovian property and P is stationary, then a
Markov chain should be a reasonable representation of aggregate
consumer brand-switching behavior.
Potential Studies
- Predict market shares at specific future points in time.
- Assess rates of change in market shares over time.
- Predict market share equilibria (if they exist).
- Evaluate the process for introducing new products.
Transform a Process to a Markov Chain
Sometimes a non-Markovian stochastic process can
be transformed into a Markov chain by expanding
the state space.
Example: Suppose that the chance of rain tomorrow
depends on the weather conditions for the previous two
days (yesterday and today).
Specifically,
Pr{ rain tomorrow | rain last 2 days (RR) } = 0.7
Pr{ rain tomorrow | rain today but not yesterday (NR) } = 0.5
Pr{ rain tomorrow | rain yesterday but not today (RN) } = 0.4
Pr{ rain tomorrow | no rain in last 2 days (NN) } = 0.2
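Expanding the state space turns these conditional probabilities into an ordinary 4-state transition matrix. A sketch of the construction, assuming each state is written (yesterday, today) with the first letter for yesterday:

```python
# Building the expanded 4-state weather chain. A state records
# (yesterday, today); moving one day ahead, today's weather becomes
# the new "yesterday". Rain probabilities are taken from the text.

rain_prob = {"RR": 0.7, "NR": 0.5, "RN": 0.4, "NN": 0.2}
states = ["RR", "NR", "RN", "NN"]

P = {}
for s in states:
    today = s[1]                 # today's weather carries over
    p_rain = rain_prob[s]
    P[s] = {
        today + "R": p_rain,     # tomorrow it rains
        today + "N": 1 - p_rain, # tomorrow it is dry
    }

print(P)
```

For example, from RR the chain moves to RR with probability 0.7 and to RN with probability 0.3.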
Transition matrix:

  P = [ 0.6  0.4 ]
      [ 0.5  0.5 ]

n-step transition matrices P(n) for n = 1, . . . , 5:

  n = 1:  [ 0.6      0.4     ]
          [ 0.5      0.5     ]
  n = 2:  [ 0.56     0.44    ]
          [ 0.55     0.45    ]
  n = 3:  [ 0.556    0.444   ]
          [ 0.555    0.445   ]
  n = 4:  [ 0.5556   0.4444  ]
          [ 0.5555   0.4445  ]
  n = 5:  [ 0.55556  0.44444 ]
          [ 0.55555  0.44445 ]

Both rows of P(n) approach the same limit, (5/9, 4/9) ≈ (0.5556, 0.4444).
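These powers are easy to reproduce; the sketch below multiplies P out repeatedly in pure Python:

```python
# Reproducing the table of matrix powers: successive powers of
# P = [[0.6, 0.4], [0.5, 0.5]] converge to a matrix whose identical
# rows give the steady-state distribution.

def matmul(A, B):
    """Plain matrix product of two small nested-list matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[0.6, 0.4], [0.5, 0.5]]
Pn = P
for n in range(1, 6):
    print(n, [[round(x, 5) for x in row] for row in Pn])
    Pn = matmul(Pn, P)   # after the loop, Pn holds P^6
```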
Gambler’s Ruin Revisited for p = 0.75
State-transition network
[Figure: states 0, 1, 2, 3, 4 in a row; each interior state i has an arc to
i+1 labeled p and an arc to i−1 labeled 1−p; states 0 and 4 are absorbing.]
State-transition matrix
0 1 2 3 4
0 1 0 0 0 0
1 0.25 0 0.75 0 0
2 0 0.25 0 0.75 0
3 0 0 0.25 0 0.75
4 0 0 0 0 1
Gambler’s Ruin with p = 0.75, n = 30
              0      1   2   3   4
        0  [  1      0   0   0   0     ]
        1  [ 0.325   ε   0   ε   0.675 ]
P(30) = 2  [ 0.1     0   ε   0   0.9   ]
        3  [ 0.025   ε   0   ε   0.975 ]
        4  [  0      0   0   0   1     ]

(ε denotes a positive but negligibly small probability: after 30 plays the
game is almost certainly over.)
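The non-negligible entries agree with the classical gambler's ruin formula: with r = (1 − p)/p, the probability of ruin starting from fortune i with target N is (r^i − r^N)/(1 − r^N) for r ≠ 1. A short check in exact rational arithmetic:

```python
# Checking the absorption probabilities against the classical
# gambler's ruin formula, using exact fractions so no rounding
# can hide a mistake.
from fractions import Fraction

p = Fraction(3, 4)
r = (1 - p) / p          # r = 1/3
N = 4

def ruin(i):
    """Probability of absorption at 0 starting from fortune i."""
    return (r**i - r**N) / (1 - r**N)

for i in (1, 2, 3):
    print(i, float(ruin(i)), float(1 - ruin(i)))
```

The output matches the 0.325/0.675, 0.1/0.9, and 0.025/0.975 entries of P(30).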
Limiting probabilities
Absorbing-state analysis (DTMC: Gambler_Ruin): 2 absorbing state classes,
3 transient states. The matrix shows the long-term transition probabilities
from transient to absorbing states.

                       Class 1 (state 0)   Class 2 (state 4)
  Transient state 1         0.325               0.675
  Transient state 2         0.1                 0.9
  Transient state 3         0.025               0.975
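The same long-run probabilities can be recomputed from the standard first-step equations: u_i = Pr{end at state 4 | start at i} satisfies u_0 = 0, u_4 = 1, and u_i = p·u_{i+1} + (1 − p)·u_{i−1} for i = 1, 2, 3. The sketch below solves this 3-unknown system exactly with fractions and a small hand-written Gaussian elimination:

```python
# Solving the first-step equations for absorption at state 4
# (p = 3/4) exactly, with fractions and Gaussian elimination.
from fractions import Fraction

p = Fraction(3, 4)
q = 1 - p

# Unknowns u_1, u_2, u_3 after moving the boundary values
# u_0 = 0 and u_4 = 1 to the right-hand side.
A = [[Fraction(1), -p,          Fraction(0)],
     [-q,          Fraction(1), -p],
     [Fraction(0), -q,          Fraction(1)]]
b = [Fraction(0), Fraction(0), p]

n = len(A)
for col in range(n):                  # forward elimination
    for row in range(col + 1, n):
        f = A[row][col] / A[col][col]
        b[row] -= f * b[col]
        for k in range(col, n):
            A[row][k] -= f * A[col][k]

u = [Fraction(0)] * n                 # back substitution
for row in range(n - 1, -1, -1):
    s = b[row] - sum(A[row][k] * u[k] for k in range(row + 1, n))
    u[row] = s / A[row][row]

print([float(x) for x in u])          # win probabilities from states 1-3
```

This reproduces the 0.675, 0.9, 0.975 column of the table above.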
Conditional vs. Unconditional Probabilities
Pr{Xn = j | X0 = i} = pij(n)
π1 + π2 + π3 = 1,  π1 ≥ 0, π2 ≥ 0, π3 ≥ 0
Steady-State Equations for Brand
Switching Example
π1 = 0.90π1 + 0.02π2 + 0.20π3
π2 = 0.07π1 + 0.82π2 + 0.12π3
π3 = 0.03π1 + 0.16π2 + 0.68π3
π1 + π2 + π3 = 1
π1 ≥ 0, π2 ≥ 0, π3 ≥ 0

A total of four equations in three unknowns: one of the balance equations
is redundant, so drop any one of the first three and solve.
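One way to solve these equations numerically, rather than by hand, is power iteration: repeatedly multiplying a distribution by P drives it to the stationary vector. A sketch:

```python
# Power iteration for the brand-switching steady state:
# pi_{n+1} = pi_n P converges to the stationary distribution
# for this irreducible, aperiodic chain.

P = [[0.90, 0.07, 0.03],
     [0.02, 0.82, 0.16],
     [0.20, 0.12, 0.68]]

pi = [1 / 3] * 3                     # any starting distribution works
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print([round(x, 4) for x in pi])     # approx (0.474, 0.321, 0.205)
```

Solving the balance equations exactly gives π = (192/405, 130/405, 83/405), which the iteration matches.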
     [ 0.8   0    0.2 ]
P =  [ 0.4  0.3   0.3 ]
     [  0   0.9   0.1 ]

[Figure: state-transition network with states 1, 2, 3.]
Game of Craps Network

[Figure: state-transition network for craps. From Start, a come-out roll of
7 or 11 leads to Win and a roll of 2, 3, or 12 leads to Lose; any other sum
k establishes the point state Pk (P4, P5, P6, P8, P9, P10). From Pk,
rolling the point k again leads to Win, rolling 7 leads to Lose, and any
other sum ("not (k, 7)") continues in Pk.]
Game of Craps
Sum    2      3      4      5      6      7      8      9      10     11     12
Prob.  0.028  0.056  0.083  0.111  0.139  0.167  0.139  0.111  0.083  0.056  0.028
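Combining these dice probabilities with the network above gives the shooter's overall winning probability. Once a point k is established, the chance of rolling k before 7 is ways(k)/(ways(k) + 6), where ways(k) counts the dice combinations summing to k. A sketch in exact arithmetic:

```python
# Overall probability of winning at craps: win outright on 7 or 11,
# else win the point phase with probability ways(k) / (ways(k) + 6)
# (point before seven). Exact fractions avoid rounding.
from fractions import Fraction

ways = {2: 1, 3: 2, 4: 3, 5: 4, 6: 5, 7: 6, 8: 5, 9: 4, 10: 3, 11: 2, 12: 1}
pr = {s: Fraction(w, 36) for s, w in ways.items()}

win = pr[7] + pr[11]                       # natural on the come-out roll
for point in (4, 5, 6, 8, 9, 10):
    win += pr[point] * Fraction(ways[point], ways[point] + 6)

print(win, float(win))
```

The result, 244/495 ≈ 0.4929, is the well-known (slightly unfavorable) craps winning probability.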