
ESI 4313: Operations Research 2

Ruiwei Jiang
Fall 2012
Homework 3 Solution

Problem #1 (15pt)
A shop has two identical machines that are operated continuously except when they break down.
Suppose that if the machine is up on day n, it is up on the (n + 1)st day with probability 0.9,
independent of the past. On the other hand, if it is down on the nth day, it stays down on the
(n + 1)st day with probability 0.2, also independent of the past. Define Xn as the number of
machines that are up on day n.
1. (2pt) What is the state space S of the stochastic process {Xn, n ≥ 0}?
2. (4pt) Explain why {Xn, n ≥ 0} is a Markov chain.
3. (3pt+3pt) Write down the transition matrix of {Xn, n ≥ 0}, and draw the transition diagram of {Xn, n ≥ 0}.
4. (3pt) Is {Xn, n ≥ 0} ergodic and why?
Solution:
1. The state space S = {0, 1, 2}, since there can only be 0, 1, or 2 operating machines.
2. According to the definition, {Xn, n ≥ 0} is a Markov chain because (i) it is a discrete-time stochastic process, and (ii) it satisfies the Markovian property, i.e., the distribution of Xn+1 only depends on Xn.
3. To obtain the transition matrix, we discuss the following cases. (Note: the detailed discussions below are for reference only; you can skip them if you can obtain the transition matrix directly.)
(a) P(Xn+1 = 0|Xn = 0) = P(both machines stay down) = (0.2)(0.2) = 0.04.
(b) P(Xn+1 = 1|Xn = 0) = P(one machine starts up, one machine stays down) = (2)(0.2)(1 - 0.2) = 0.32.
(c) P(Xn+1 = 2|Xn = 0) = P(both machines start up) = (1 - 0.2)(1 - 0.2) = 0.64.
(d) P(Xn+1 = 0|Xn = 1) = P(the up machine shuts down, the down machine stays down) = (1 - 0.9)(0.2) = 0.02.
(e) P(Xn+1 = 1|Xn = 1) = P(both machines keep their status) + P(the up machine shuts down and the down machine starts up) = (0.2)(0.9) + (1 - 0.2)(1 - 0.9) = 0.26.
(f) P(Xn+1 = 2|Xn = 1) = P(the up machine stays up, the down machine starts up) = (0.9)(1 - 0.2) = 0.72.
(g) P(Xn+1 = 0|Xn = 2) = P(both machines shut down) = (1 - 0.9)(1 - 0.9) = 0.01.
(h) P(Xn+1 = 1|Xn = 2) = P(one machine stays up, one machine shuts down) = (2)(0.9)(1 - 0.9) = 0.18.

(i) P(Xn+1 = 2|Xn = 2) = P(both machines stay up) = (0.9)(0.9) = 0.81.


To sum up, the transition matrix is as follows:

        0.04 0.32 0.64
    P = 0.02 0.26 0.72
        0.01 0.18 0.81
Accordingly, the transition diagram is shown in Figure 1.
[Figure 1: Transition diagram for problem #1 (states 0, 1, 2; arcs labeled by the transition probabilities in P)]


4. {Xn, n ≥ 0} is ergodic because (i) it is irreducible, since every pair of states communicates (all entries of P are positive), and (ii) all the states of {Xn, n ≥ 0} are recurrent and aperiodic, and hence ergodic.
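As an optional numerical check (not part of the graded solution), the matrix can be rebuilt from the single-machine probabilities, and ergodicity shows up numerically as every row of a large power of P converging to the same stationary vector. A minimal Python sketch, assuming numpy is available:

    import numpy as np
    from math import comb

    p_up = 0.9   # P(machine up tomorrow | up today)
    p_dn = 0.2   # P(machine down tomorrow | down today)

    def row(u):
        # Transition probabilities out of the state with u machines up:
        # a of the u up machines stay up, b of the 2 - u down machines come up.
        probs = np.zeros(3)
        for a in range(u + 1):
            for b in range(2 - u + 1):
                pa = comb(u, a) * p_up**a * (1 - p_up)**(u - a)
                pb = comb(2 - u, b) * (1 - p_dn)**b * p_dn**(2 - u - b)
                probs[a + b] += pa * pb
        return probs

    P = np.array([row(u) for u in range(3)])
    print(P)                              # matches the matrix above
    print(np.linalg.matrix_power(P, 50))  # every row converges to the same vector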

Problem #2 (15pt)
Dave's Photography Store stocks a particular model camera that can be ordered weekly. The
demand for this camera is random. It is 0 with probability 0.35, 1 with probability 0.40, 2 with
probability 0.20, and 3 with probability 0.05. At the end of the week, Dave places an order that is
delivered in time for the next opening of the store on Monday. Dave's current policy is to order 2
cameras if inventory is empty and order nothing otherwise. Let Xn be the number of cameras in
inventory at the end of week n.
1. (2pt) What is the state space S of the stochastic process {Xn, n ≥ 0}?
2. (2pt) Explain why {Xn, n ≥ 0} is a Markov chain.
3. (4pt+4pt) Write down the transition matrix of {Xn, n ≥ 0}, and draw the transition diagram of {Xn, n ≥ 0}.
4. (3pt) At the beginning of week 1, Dave estimates that there is a 60% chance that his inventory will be empty at the end of this week and a 40% chance that his inventory will have only 1 camera. What is the PMF of X3?

Solution:
1. The state space S = {0, 1, 2} because Xn can only be 0, 1, or 2 due to the ordering policy.
2. {Xn, n ≥ 0} is a Markov chain because (i) it is a discrete-time stochastic process, and (ii) it has the Markovian property, i.e., the distribution of Xn+1 only depends on Xn.
3. Let Dn+1 represent the demand in week n + 1. To obtain the transition matrix, we discuss the following cases. (Note: the detailed discussions below are for reference only; you can skip them if you can obtain the transition matrix directly.)
(a) P(Xn+1 = 0|Xn = 0) = P(Dn+1 ≥ 2) = 0.20 + 0.05 = 0.25.
(b) P(Xn+1 = 1|Xn = 0) = P(Dn+1 = 1) = 0.40.
(c) P(Xn+1 = 2|Xn = 0) = P(Dn+1 = 0) = 0.35.
(d) P(Xn+1 = 0|Xn = 1) = P(Dn+1 ≥ 1) = 0.65.
(e) P(Xn+1 = 1|Xn = 1) = P(Dn+1 = 0) = 0.35.
(f) P(Xn+1 = 2|Xn = 1) = 0, since no order is placed and the inventory cannot increase.
(g) P(Xn+1 = 0|Xn = 2) = P(Dn+1 ≥ 2) = 0.20 + 0.05 = 0.25.
(h) P(Xn+1 = 1|Xn = 2) = P(Dn+1 = 1) = 0.40.
(i) P(Xn+1 = 2|Xn = 2) = P(Dn+1 = 0) = 0.35.
To sum up, the transition matrix is as follows:

        0.25 0.40 0.35
    P = 0.65 0.35 0.00
        0.25 0.40 0.35
Accordingly, the transition diagram is shown in Figure 2.
[Figure 2: Transition diagram for problem #2 (states 0, 1, 2; arcs labeled by the transition probabilities in P)]


4. The PMF of X1 is a(1) = [0.60, 0.40, 0], so the PMF of X3 is

    a(3) = a(1) P^2 = [0.60, 0.40, 0] P^2 = [0.402, 0.381, 0.217].

That is, P(X3 = 0) = 0.402, P(X3 = 1) = 0.381, and P(X3 = 2) = 0.217.
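The two-step update a(3) = a(1) P^2 is easy to verify numerically; a short sketch, again assuming numpy:

    import numpy as np

    P = np.array([[0.25, 0.40, 0.35],
                  [0.65, 0.35, 0.00],
                  [0.25, 0.40, 0.35]])
    a1 = np.array([0.60, 0.40, 0.0])          # PMF of X1 from Dave's estimate
    print(a1 @ np.linalg.matrix_power(P, 2))  # [0.402 0.381 0.217]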



Problem #3 (12pt)
An organization has N employees where N is a large number. Each employee has one of three
possible job classifications and changes classifications (independently) every week according to a
Markov chain. Specifically,
If one is in class 1, then she stays in class 1 with prob. 0.7, moves up to class 2 with prob.
0.2, and moves up to class 3 with prob. 0.1.
If one is in class 2, then she moves down to class 1 with prob. 0.2, stays in class 2 with prob.
0.6, and moves up to class 3 with prob. 0.2.
If one is in class 3, then she moves down to class 1 with prob. 0.1, moves down to class 2
with prob. 0.4, stays in class 3 with prob. 0.5.
Answer the following questions:
1. (3pt) Suppose one is in class 1 in week 1. What is the probability that she is in class 3 in week 3?
2. (5pt) What percentage of employees are in each classification?
3. (4pt) Suppose that the weekly salary is $1000 for class 1 employees, $1500 for class 2 employees, and $2000 for class 3 employees. What is the average weekly salary over all employees?
Solution:
1. Let Xn represent one's class in week n. Then

    P(X3 = 3|X1 = 1) = Σ_{i=1..3} P(X3 = 3, X2 = i|X1 = 1)
                     = Σ_{i=1..3} P(X3 = 3|X2 = i) P(X2 = i|X1 = 1)
                     = (0.7)(0.1) + (0.2)(0.2) + (0.1)(0.5)
                     = 0.16.
2. {Xn, n ≥ 0} is a Markov chain with transition matrix

        0.7 0.2 0.1
    P = 0.2 0.6 0.2
        0.1 0.4 0.5

By the steady-state equations π = πP and π e^T = 1, we have

    π1 = 0.7π1 + 0.2π2 + 0.1π3
    π2 = 0.2π1 + 0.6π2 + 0.4π3
    π3 = 0.1π1 + 0.2π2 + 0.5π3
    π1 + π2 + π3 = 1

Solving this system gives π1 = 6/17, π2 = 7/17, π3 = 4/17.

3. The average weekly salary is 1000π1 + 1500π2 + 2000π3 = $1441.2.
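All three parts can be cross-checked numerically; a minimal sketch, assuming numpy (the steady state is obtained by solving π = πP together with the normalization constraint):

    import numpy as np

    P = np.array([[0.7, 0.2, 0.1],
                  [0.2, 0.6, 0.2],
                  [0.1, 0.4, 0.5]])

    # Part 1: P(X3 = 3 | X1 = 1) is the (1, 3) entry of the two-step matrix P^2.
    print((P @ P)[0, 2])                       # 0.16

    # Parts 2-3: stack (P^T - I) with the normalization row and solve.
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    pi, *_ = np.linalg.lstsq(A, np.array([0, 0, 0, 1.0]), rcond=None)
    print(pi)                                  # [6/17, 7/17, 4/17]
    print(pi @ np.array([1000, 1500, 2000]))   # average weekly salary, about 1441.18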

Problem #4 (15pt)
A machine operates continuously except when it breaks down. Suppose that the state of the machine on day n depends only on its states on days n - 1 and n - 2. In particular, the probability of being up on day n is
0.4 if it was up on days n - 1 and n - 2;
0.7 if it was up on day n - 1 and down on day n - 2;
0.2 if it was down on day n - 1 and up on day n - 2;
0.8 if it was down on days n - 1 and n - 2.
Let Xn be the state of the machine on day n, i.e., Xn = 1 if the machine is up and Xn = 0 if the
machine is down.
1. (2pt) Is {Xn, n ≥ 0} a Markov chain and why? If yes, write down its transition matrix and transition diagram.
2. (2pt+3pt+3pt) Is {(Xn, Xn+1), n ≥ 0} a Markov chain and why? If yes, write down its transition matrix and transition diagram.
3. (5pt) Suppose that X0 = X1 = 1. Compute the probability P(X1 = 1, X2 = 1, X4 = 0).
Solution:
1. {Xn, n ≥ 0} is not a Markov chain because it does not satisfy the Markovian property: the distribution of Xn+1 depends on both Xn and Xn-1.
2. {(Xn, Xn+1), n ≥ 0} is a Markov chain because it is a discrete-time stochastic process satisfying the Markovian property, i.e., the distribution of (Xn, Xn+1) only depends on (Xn-1, Xn). To obtain the transition matrix, we discuss the following cases. (Note: the detailed discussions below are for reference only; you can skip them if you can obtain the transition matrix directly.)
(a) P(Xn+1 = 1, Xn = 1|Xn = 1, Xn-1 = 1) = P(Xn+1 = 1|Xn = 1, Xn-1 = 1) = 0.4.
(b) P(Xn+1 = 0, Xn = 1|Xn = 1, Xn-1 = 1) = 1 - P(Xn+1 = 1|Xn = 1, Xn-1 = 1) = 0.6.
(c) P(Xn+1 = 1, Xn = 1|Xn = 1, Xn-1 = 0) = P(Xn+1 = 1|Xn = 1, Xn-1 = 0) = 0.7.
(d) P(Xn+1 = 0, Xn = 1|Xn = 1, Xn-1 = 0) = 1 - P(Xn+1 = 1|Xn = 1, Xn-1 = 0) = 0.3.
(e) P(Xn+1 = 1, Xn = 0|Xn = 0, Xn-1 = 1) = P(Xn+1 = 1|Xn = 0, Xn-1 = 1) = 0.2.
(f) P(Xn+1 = 0, Xn = 0|Xn = 0, Xn-1 = 1) = 1 - P(Xn+1 = 1|Xn = 0, Xn-1 = 1) = 0.8.
(g) P(Xn+1 = 1, Xn = 0|Xn = 0, Xn-1 = 0) = P(Xn+1 = 1|Xn = 0, Xn-1 = 0) = 0.8.
(h) P(Xn+1 = 0, Xn = 0|Xn = 0, Xn-1 = 0) = P(Xn+1 = 0|Xn = 0, Xn-1 = 0) = 0.2.

All the other conditional probabilities are zeros. To sum up, ordering the states as (1, 1), (0, 1), (1, 0), (0, 0), the transition matrix is as follows:

        0.4 0   0.6 0
    P = 0.7 0   0.3 0
        0   0.2 0   0.8
        0   0.8 0   0.2
Accordingly, the transition diagram is shown in Figure 3.
[Figure 3: Transition diagram for problem #4 (states (1,1), (0,1), (1,0), (0,0); arcs labeled by the transition probabilities in P)]


3. We have

    P(X1 = 1, X2 = 1, X4 = 0)
      = Σ_{i=0..1} P(X1 = 1, X2 = 1, X3 = i, X4 = 0)
      = Σ_{i=0..1} P(X0 = 1, X1 = 1) P(X1 = 1, X2 = 1|X0 = 1, X1 = 1)
                   P(X2 = 1, X3 = i|X1 = 1, X2 = 1) P(X3 = i, X4 = 0|X2 = 1, X3 = i)
      = Σ_{i=0..1} P(X1 = 1, X2 = 1|X0 = 1, X1 = 1)
                   P(X2 = 1, X3 = i|X1 = 1, X2 = 1) P(X3 = i, X4 = 0|X2 = 1, X3 = i)
      = (0.4)(0.6)(0.8) + (0.4)(0.4)(0.6)
      = 0.288,

where the third equality uses P(X0 = 1, X1 = 1) = 1.
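An optional brute-force check of this computation, written directly from the conditional probabilities in the problem statement (plain Python, a sketch):

    # p_up[(x_two_days_ago, x_yesterday)] = P(machine up today | that history)
    p_up = {(1, 1): 0.4, (0, 1): 0.7, (1, 0): 0.2, (0, 0): 0.8}

    def step(x_prev2, x_prev1, x_next):
        # P(X_{n+1} = x_next | X_{n-1} = x_prev2, X_n = x_prev1)
        p = p_up[(x_prev2, x_prev1)]
        return p if x_next == 1 else 1 - p

    # Given X0 = X1 = 1, sum over X3 = i of P(X2 = 1, X3 = i, X4 = 0):
    total = sum(step(1, 1, 1) * step(1, 1, i) * step(1, i, 0) for i in (0, 1))
    print(total)   # 0.288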

Problem #5 (12pt)
A Markov chain {Xn, n ≥ 0} with states 0, 1, 2, has the transition probability matrix

        1/2 a   1/6
    P = 0   1/3 b
        1/2 0   c

1. (3pt) Determine the values of a, b, and c.


2. (4pt) If P(X0 = 0) = P(X0 = 1) = 1/4, find E[X3].
3. (5pt) Compute the steady-state probabilities of this Markov chain.
Solution:
1. Due to the law of total probability, we have 1/2 + a + 1/6 = 1, and hence a = 1/3. Similarly, we have b = 2/3 and c = 1/2.

2. Since P(X0 = 0) = P(X0 = 1) = 1/4, we have P(X0 = 2) = 1 - P(X0 = 0) - P(X0 = 1) = 1/2. Hence, with a = 1/3, b = 2/3, and c = 1/2 plugged into P, the PMF of X3 is

    a(3) = a(0) P^3 = [1/4, 1/4, 1/2] P^3 = [0.41, 0.20, 0.39].

That is, P(X3 = 0) = 0.41, P(X3 = 1) = 0.20, and P(X3 = 2) = 0.39. Hence,

    E[X3] = (0)P(X3 = 0) + (1)P(X3 = 1) + (2)P(X3 = 2) = 0.98.
3. By the steady-state equations π = πP and π e^T = 1, we have

    π1 = (1/2)π1 + (1/2)π3
    π2 = (1/3)π1 + (1/3)π2
    π3 = (1/6)π1 + (2/3)π2 + (1/2)π3
    π1 + π2 + π3 = 1

Solving this system gives π1 = 2/5, π2 = 1/5, π3 = 2/5.
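A numerical cross-check of parts 2 and 3 (a sketch, assuming numpy; a, b, c are filled in from part 1):

    import numpy as np

    P = np.array([[1/2, 1/3, 1/6],
                  [0,   1/3, 2/3],
                  [1/2, 0,   1/2]])

    a0 = np.array([1/4, 1/4, 1/2])
    a3 = a0 @ np.linalg.matrix_power(P, 3)
    print(a3)                        # approximately [0.41, 0.20, 0.39]
    print(a3 @ np.array([0, 1, 2]))  # E[X3], approximately 0.98

    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    pi, *_ = np.linalg.lstsq(A, np.array([0, 0, 0, 1.0]), rcond=None)
    print(pi)                        # [0.4, 0.2, 0.4] = [2/5, 1/5, 2/5]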

Problem #6 (16pt)
Coin 1 has probability 0.6 of coming up heads, and coin 2 has probability 0.5 of coming up heads.
Suppose that we flip a coin every minute. If the coin comes up heads, we will flip this coin again
the next minute. If the coin comes up tails, we will put this coin aside and flip the other coin the
next minute.
1. (5pt) Define a Markov chain with 2 states which will help us to determine the proportion of
time that we flip coin 1.
2. (3pt+2pt) Write down the transition matrix and draw the transition diagram of this Markov
chain.
3. (4pt) What proportion of flips use coin 1?

4. (2pt) If we start the process with coin 1, what is the probability that coin 2 is used on the
fifth flip?
Solution:
1. Let Xn represent which coin we use in minute n, i.e., Xn = i means that we flip coin i, for i = 1, 2. {Xn, n ≥ 0} is a discrete-time stochastic process, and it has the Markovian property because the distribution of Xn+1 only depends on Xn. Hence, {Xn, n ≥ 0} is a Markov chain.
2. Let Yi represent the result of flipping coin i for i = 1, 2, i.e., Yi = H or Yi = T. To obtain the transition matrix, we discuss the following cases. (Note: the detailed discussions below are for reference only; you can skip them if you can obtain the transition matrix directly.)
(a) P(Xn+1 = 1|Xn = 1) = P(Y1 = H) = 0.6.
(b) P(Xn+1 = 2|Xn = 1) = P(Y1 = T ) = 0.4.
(c) P(Xn+1 = 1|Xn = 2) = P(Y2 = T ) = 0.5.
(d) P(Xn+1 = 2|Xn = 2) = P(Y2 = H) = 0.5.
To sum up, the transition matrix is as follows:

    P = 0.6 0.4
        0.5 0.5
Accordingly, the transition diagram is shown in Figure 4.
[Figure 4: Transition diagram for problem #6 (states 1 and 2; arcs labeled by the transition probabilities in P)]


3. By the steady-state equations π = πP and π e^T = 1, we have

    π1 = 0.6π1 + 0.5π2
    π2 = 0.4π1 + 0.5π2
    π1 + π2 = 1

Solving this system gives π1 = 5/9, π2 = 4/9. Hence, coin 1 is used in 5/9 of the flips.


4. The 4-step transition matrix is

    P^4 = 0.5556 0.4444
          0.5555 0.4445

Hence, we have

    P(X5 = 2|X1 = 1) = (P^4)_{12} = 0.4444.

(Note that P^4 has essentially converged after 4 transitions, so another way of obtaining P(X5 = 2|X1 = 1) is to approximate it by π2 = 4/9 ≈ 0.4444.)
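The 4-step matrix and its closeness to the stationary distribution can be confirmed with a few lines (a sketch, assuming numpy):

    import numpy as np

    P = np.array([[0.6, 0.4],
                  [0.5, 0.5]])
    P4 = np.linalg.matrix_power(P, 4)
    print(P4)        # [[0.5556, 0.4444], [0.5555, 0.4445]]
    print(P4[0, 1])  # P(X5 = 2 | X1 = 1) = 0.4444
    print(4 / 9)     # stationary probability of coin 2 = 0.4444...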

Problem #7 (15pt)
An individual possesses 2 umbrellas which he employs in going from his home to office, and vice
versa. If he is at home (the office) at the beginning (end) of a day and it is raining, then he will
take an umbrella with him to the office (home), provided there is one to be taken. If it is raining
and he has no umbrellas to be taken, then he gets wet. If it is not raining, then he never takes
an umbrella. Assume that, independent of the past, it rains at the beginning (end) of a day with
probability 0.5.
1. (6pt) Define a Markov chain with 3 states which will help us to determine the proportion of
time that our man gets wet.
2. (5pt) Show that the steady-state probabilities are given by

    πi = 1/5 if i = 0, and πi = 2/5 if i = 1, 2.

3. (4pt) What fraction of time does our man get wet?


Solution:
1. Let Xn represent the number of umbrellas the man has at home at the beginning of day n; then Xn can only be 0, 1, or 2. {Xn, n ≥ 0} is a Markov chain because (i) it is a discrete-time stochastic process, and (ii) it has the Markovian property, i.e., the distribution of Xn+1 depends only on Xn.
2. We first write down the transition matrix for {Xn, n ≥ 0}. To this end, we let RB and RE indicate whether it rains at the beginning and at the end of a day, respectively. That is,

    RB = 1 if it rains at the beginning of a day, and RB = 0 otherwise;
    RE = 1 if it rains at the end of a day, and RE = 0 otherwise.
We discuss the following cases. (Note: the detailed discussions below are for reference only; you can skip them if you can obtain the transition matrix directly.)
(a) P(Xn+1 = 0|Xn = 0) = P(RE = 0) = 0.5.
(b) P(Xn+1 = 1|Xn = 0) = P(RE = 1) = 0.5.
(c) P(Xn+1 = 2|Xn = 0) = 0.
(d) P(Xn+1 = 0|Xn = 1) = P(RB = 1, RE = 0) = 0.25.
(e) P(Xn+1 = 1|Xn = 1) = P(RB = 1, RE = 1) + P(RB = 0, RE = 0) = 0.5.
(f) P(Xn+1 = 2|Xn = 1) = P(RB = 0, RE = 1) = 0.25.
(g) P(Xn+1 = 0|Xn = 2) = 0.
(h) P(Xn+1 = 1|Xn = 2) = P(RB = 1, RE = 0) = 0.25.
(i) P(Xn+1 = 2|Xn = 2) = P(RB = 0) + P(RB = 1, RE = 1) = 0.75.

To sum up, the transition matrix is as follows:

        0.5  0.5  0
    P = 0.25 0.5  0.25
        0    0.25 0.75
By the steady-state equations π = πP and π e^T = 1, we have

    π0 = 0.5π0 + 0.25π1
    π1 = 0.5π0 + 0.5π1 + 0.25π2
    π2 = 0.25π1 + 0.75π2
    π0 + π1 + π2 = 1

Solving this system gives π0 = 1/5, π1 = 2/5, π2 = 2/5.


3. There are two cases in which the man gets wet: (i) Xn = 0 and it rains at the beginning of the day, and (ii) Xn = 2, it does not rain at the beginning of the day, and it rains at the end of the day. Hence, the fraction of time the man gets wet is π0 P(RB = 1) + π2 P(RB = 0, RE = 1) = 0.5π0 + 0.25π2 = 0.2.
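The stationary distribution and the wet fraction can be checked numerically (a sketch, assuming numpy):

    import numpy as np

    P = np.array([[0.50, 0.50, 0.00],
                  [0.25, 0.50, 0.25],
                  [0.00, 0.25, 0.75]])
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    pi, *_ = np.linalg.lstsq(A, np.array([0, 0, 0, 1.0]), rcond=None)
    print(pi)                          # [0.2, 0.4, 0.4] = [1/5, 2/5, 2/5]
    print(0.5 * pi[0] + 0.25 * pi[2])  # fraction of time wet = 0.2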

Problem #8
Let the transition probability matrix of a two-state Markov chain be given as

    P = p     1 - p
        1 - p p

Show by mathematical induction that

    P^(n) = 1/2 + (1/2)(2p - 1)^n   1/2 - (1/2)(2p - 1)^n
            1/2 - (1/2)(2p - 1)^n   1/2 + (1/2)(2p - 1)^n

Solution:
We prove by mathematical induction as follows.
Base case: when n = 1, 1/2 + (1/2)(2p - 1) = p and 1/2 - (1/2)(2p - 1) = 1 - p, so the claim holds.
Induction step: Assume that the claim holds for P^(n); we prove that it still holds for P^(n+1). To that end, we have

    P^(n+1) = P P^(n)
            = [ p      1 - p ] [ 1/2 + (1/2)(2p - 1)^n   1/2 - (1/2)(2p - 1)^n ]
              [ 1 - p  p     ] [ 1/2 - (1/2)(2p - 1)^n   1/2 + (1/2)(2p - 1)^n ]
            = [ 1/2 + (1/2)(2p - 1)^(n+1)   1/2 - (1/2)(2p - 1)^(n+1) ]
              [ 1/2 - (1/2)(2p - 1)^(n+1)   1/2 + (1/2)(2p - 1)^(n+1) ],

where the last equality uses p - (1 - p) = 2p - 1. Hence, the claim holds for P^(n+1) as well.
By both cases, we have proved the claim by mathematical induction.
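The closed form can be spot-checked against a direct matrix power for arbitrary p and n (a sketch, assuming numpy; the test values below are arbitrary):

    import numpy as np

    p, n = 0.7, 5   # arbitrary test values
    P = np.array([[p, 1 - p],
                  [1 - p, p]])
    d = 0.5 * (2 * p - 1) ** n
    closed = np.array([[0.5 + d, 0.5 - d],
                       [0.5 - d, 0.5 + d]])
    print(np.allclose(np.linalg.matrix_power(P, n), closed))   # True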
