Markov chain: chance one state is reached before another

In summary, the problem asks for the probability that state 1 is reached before state 4 in a given four-state Markov chain. The solution modifies the transition matrix so that states 1 and 4 become absorbing, then solves a linear system for the probability of reaching state 1 before state 4 from each starting state.
  • #1
kasperrepsak
Hey could someone explain why this is true? I am trying to understand how to solve such a problem but I don't understand the solution.

Problem:
Given a Markov chain [itex]\left\{X_{n}: n\in\ \mathbb{N}\right\}[/itex] with four states 1,2,3,4 and transition matrix
[itex]P =
\begin{pmatrix}
0 & \frac{1}{2}& \frac{1}{2} & 0 \\
0 & 0 & \frac{1}{2} & \frac{1}{2}\\
\frac{1}{2} & 0 & 0 & \frac{1}{2} \\
\frac{1}{2} & \frac{1}{2} & 0 & 0
\end{pmatrix}[/itex]
We leave from state 3. What is the chance that state 1 will be reached before state 4?

Solution:
Let's call this chance x, and let y be the same chance but leaving from state 2 instead. Define:
[itex]f =
\begin{pmatrix}
1 \\
x \\
y \\
0
\end{pmatrix}[/itex]
Then f satisfies P'f = f, with P' the transition matrix derived from P by making states 1 and 4 absorbing, i.e.,

[itex]
\begin{pmatrix}
1 & 0& 0 & 0 \\
0 & 0 & \frac{1}{2} & \frac{1}{2}\\
\frac{1}{2} & 0 & 0 & \frac{1}{2} \\
0 & 0 & 0 & 1\end{pmatrix}*\begin{pmatrix}
1 \\
x \\
y \\
0
\end{pmatrix}= \begin{pmatrix}
1 \\
x \\
y \\
0
\end{pmatrix}[/itex]
Solving gives y = 1/2, x = 1/4.
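The fixed-point claim is easy to check numerically. The following sketch (using NumPy; not part of the original post) verifies that with the values x = 1/4 and y = 1/2 the vector f is indeed a fixed point of P':

```python
import numpy as np

# Transition matrix with states 1 and 4 made absorbing
P_abs = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.5, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])

# Candidate fixed point (1, x, y, 0) with x = 1/4, y = 1/2
f = np.array([1.0, 0.25, 0.5, 0.0])

# P'f should reproduce f exactly
assert np.allclose(P_abs @ f, f)
print(P_abs @ f)
```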
 
  • #2
kasperrepsak said:
Hey could someone explain why this is true? I am trying to understand how to solve such a problem but I don't understand the solution.

You need to be more specific: what part do you not understand?

What you presented above is pretty standard, and is almost always part of the course material regarding Markov chains.

Also, for context: are you using a textbook and/or course notes? Do those sources not cover this material?
 
  • #3
If you could direct me to a place where this is explained I would be grateful. The material my teacher provided doesn't address this problem, or at least not directly, so I find it hard to understand this solution from the theory provided. I don't understand why P'f = f must hold, or why those states are made absorbing.

Edit: I don't understand why it's necessary to multiply by such a vector either.
 
  • #4
kasperrepsak said:
If you could direct me to a place where this is explained I would be grateful. The material my teacher provided doesn't address this problem, or at least not directly, so I find it hard to understand this solution from the theory provided. I don't understand why P'f = f must hold, or why those states are made absorbing.

Edit: I don't understand why it's necessary to multiply by such a vector either.

When you don't understand something, it is a good idea to work it out from first principles. Let's do that.

Define ##f_i(n)## to be the probability that, starting from state i, we reach state 1 for the first time at time n without having reached state 4 before time n; let ##f_i = \sum_{n=1}^{\infty} f_i(n)##. This is the probability that we reach state 1 before reaching state 4, starting from state i.

Note that
[tex] f_i(n) = P\{X_1 \neq 1,4, \: X_2 \neq 1,4,\: \ldots, X_{n-1} \neq 1,4, \: X_n = 1 |X_0=i\} [/tex]
Conditioning on ##X_1## we have
[tex] f_i(n) = \sum_{j \neq 1,4} p_{ij} P\{X_2 \neq 1,4, \: \ldots, X_{n-1} \neq 1,4,\: X_n = 1 |X_1 = j, X_0 = i \} \\
= \sum_{j \neq 1,4} p_{ij} f_j(n-1).[/tex]
Also: ##f_i(1) = p_{i1}##. Summing over n we get
[tex] f_i = f_i(1) + \sum_{n=2}^{\infty} f_i(n)\\
= p_{i1} + \sum_{j \neq 1,4} p_{ij} \sum_{n=2}^{\infty} f_j(n-1)
= p_{i1} + \sum_{j \neq 1,4} p_{ij} f_j .[/tex]

In other words:
[tex] f_2 = p_{21} + p_{22} f_2 + p_{23} f_3 \\
f_3 = p_{31} + p_{32} f_2 + p_{33} f_3.[/tex]
Note that this is a simple 2x2 linear system that we can solve easily.
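For this chain, ##p_{21} = 0##, ##p_{22} = 0##, ##p_{23} = \frac{1}{2}## and ##p_{31} = \frac{1}{2}##, ##p_{32} = 0##, ##p_{33} = 0##, so the 2x2 system can be solved in one line of NumPy (a sketch, not part of the original reply):

```python
import numpy as np

# Interior block of P restricted to the non-absorbing states 2 and 3,
# and the one-step probabilities of jumping directly to target state 1
Q = np.array([[0.0, 0.5],    # p22, p23
              [0.0, 0.0]])   # p32, p33
b = np.array([0.0, 0.5])     # p21, p31

# The equations f = b + Q f rearrange to (I - Q) f = b
f2, f3 = np.linalg.solve(np.eye(2) - Q, b)
print(f2, f3)  # 0.25 0.5
```

So starting from state 3 the probability of reaching state 1 first is 1/2, and starting from state 2 it is 1/4.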

Of course, since (in this example, but not all examples) we either reach state 1 before state 4 or reach state 4 before state 1, the probability that, starting from state i we reach 4 before 1 is just ## 1 - f_i##.

Note that if we set ##f_1 = 1## and ##f_4 = 0## we can write the equations above as
[tex] f_1 = 1 f_1 + 0 f_2 + 0 f_3 + 0 f_4 \\
f_2 = p_{21} f_1 + p_{22}f_2 + p_{23} f_3 + p_{24}f_4\\
f_3 = p_{31} f_1 + p_{32}f_2 + p_{33}f_3 + p_{34} f_4 \\
f_4 = 0 f_1 + 0 f_2 + 0 f_3 + 1 f_4[/tex]
This is just the set of equations ##f = \tilde{P} f## for a modified Markov chain with transition matrix
[tex]\tilde{P} = \pmatrix{1 &0&0&0\\p_{21}&p_{22} & p_{23} & p_{24}\\
p_{31} & p_{32} & p_{33} & p_{34} \\
0 & 0 & 0 & 1}[/tex]
and with
[tex] f = \pmatrix{1 \\f_2 \\f_3 \\0}[/tex]
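One way to see how the boundary values pin down the solution (a NumPy sketch, not part of the original reply) is to iterate ##f \leftarrow \tilde{P} f## starting from the boundary data alone: the absorbing rows keep ##f_1 = 1## and ##f_4 = 0## fixed, and the interior entries converge to the hitting probabilities.

```python
import numpy as np

# Modified chain with states 1 and 4 absorbing
P_tilde = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.5, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])

# Start from the boundary values f_1 = 1, f_4 = 0 and iterate
f = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(100):
    f = P_tilde @ f
print(f)  # [1.   0.25 0.5  0.  ]
```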

As I said: work it out in detail from first principles.
 
  • #5
Thank you very much for taking the time to explain this neatly and thoroughly. I'm sure this will be helpful to many others too, as I haven't found an explanation of this particular problem through Google.
 
  • #6
kasperrepsak said:
Thank you very much for taking the time to explain this neatly and thoroughly. I'm sure this will be helpful to many others too, as I haven't found an explanation of this particular problem through Google.

There was a typo in my post, which I corrected in an "edit" and which was seemingly accepted, but was later not implemented. I should have written ##X_n = 1## instead of ##X_n = 4## in the definition of ##f_i(n)##.
 
  • #7
Ray Vickson said:
There was a typo in my post, which I corrected in an "edit" and which was seemingly accepted, but was later not implemented. I should have written ##X_n = 1## instead of ##X_n = 4## in the definition of ##f_i(n)##.
Yes, I realized that. What about the solution? Doesn't it also contain a mistake? If x is the chance leaving from state 3, shouldn't it come in the third place of the vector instead of the second?
 
  • #8
I'm sorry but as I know sometimes replies are overlooked I'm sending another reply with the hope you will respond.
 
  • #9
kasperrepsak said:
I'm sorry but as I know sometimes replies are overlooked I'm sending another reply with the hope you will respond.

If you are asking me to check whether a given pair of numbers is the solution of two simple linear equations in two unknowns, then I refuse to answer. You can check it yourself---and you always should, as a matter of habit, in such cases. Just plug in the numbers to see if they work. If they don't work, somebody has made an error---typographical or otherwise---and you need to re-solve those equations.
 
  • #10
Ray Vickson said:
If you are asking me to check whether a given pair of numbers is the solution of a given pair of two simple linear equations in two unknowns, then I refuse to answer. You can check it yourself---and you always should, as a matter of habit, in such cases. Just plug in the numbers to see if they work. If they don't work, somebody has made an error---typographical or otherwise---and you need to re-solve those equations.

Well, the pair is a solution to that matrix-vector equation. But I'm pretty sure the teacher made a mistake in putting the chance x in the second place of the vector, since the way you explained it, it should be in the third place, because x is the chance leaving from state 3. I just wanted to make sure I understood that correctly.
 

