How to solve the Markov transition matrix

In summary, the homework problem asks for the availability of a node that can only work while the network is working. The balance equations used to find that availability form a linear system, and the poster's symbolic substitution attempt became long and recursive.
  • #1
allamar2012

Homework Statement


I designed a Markov model to study the availability of a node that will not work unless the network is working. I built the following transition-rate matrix, in which a state (i, j) means i is the node status and j is the network status (1 = up, 0 = down). R1 and R2 are the repair rates of the node and the network, and F1 and F2 are their failure rates.

        (0,0)  (1,0)  (0,1)  (1,1)
(0,0)    0      R1     R2     0
(1,0)    F1     0      0      R2
(0,1)    F2     0      0      R1
(1,1)    0      F2     F1     0
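
For reference, this is how the rate table can be typed into MATLAB; the numeric values for R1, R2, F1 and F2 below are placeholders picked for illustration, not part of the problem:

% Placeholder rates (assumed values, not from the problem statement)
R1 = 0.5;  R2 = 0.2;    % repair rates: node, network
F1 = 0.01; F2 = 0.05;   % failure rates: node, network

% Off-diagonal transition rates, states ordered (0,0), (1,0), (0,1), (1,1)
Q = [ 0   R1  R2  0 ;
      F1  0   0   R2;
      F2  0   0   R1;
      0   F2  F1  0 ];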

Homework Equations



I wrote the following balance equations to determine P(1,1):

(R1+R2) P(0,0) = F1 P(1,0) + F2 P(0,1)
(F1+R2) P(1,0) = R1 P(0,0) + F2 P(1,1)
(F2+R1) P(0,1) = R2 P(0,0) + F1 P(1,1)
(F2+F1) P(1,1) = R2 P(1,0) + R1 P(0,1)

P(0,0) + P(1,0) + P(0,1) + P(1,1) = 1

I tried to solve these symbolically, but the substitutions became long and recursive and I could not complete them.

The Attempt at a Solution


Are my equations right or wrong?
I need advice on how to solve these equations to get P(1,1).
I tried to solve them in MATLAB as a linear system, but it gives me
[ 0
0
0
0 ]

Thanks
 
  • #2
The first 4 equations have a redundancy: given that any three of them hold, the 4th one also holds. So, keep any three of the first four equations, and append the normalization equation P(0,0) + ... + P(1,1) = 1. That 4x4 linear system does have a unique solution.
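
For example, here is a minimal numeric sketch of that 4x4 system in MATLAB, keeping the balance equations at (0,0), (1,0) and (0,1) and appending the normalization; the rate values are placeholders, not from the problem:

R1 = 0.5;  R2 = 0.2;    % placeholder repair rates: node, network
F1 = 0.01; F2 = 0.05;   % placeholder failure rates: node, network

% Unknowns ordered p = [P(0,0); P(1,0); P(0,1); P(1,1)]
A = [ R1+R2,   -F1,    -F2,     0;  ...  % balance at (0,0)
       -R1,  F1+R2,      0,   -F2;  ...  % balance at (1,0)
       -R2,      0,  F2+R1,   -F1;  ...  % balance at (0,1)
         1,      1,      1,     1];      % normalization replaces the last balance equation
b = [0; 0; 0; 1];
p = A \ b;
P11 = p(4)    % availability of the node, P(1,1)

With these placeholder numbers P11 comes out to about 0.78; a quick sanity check is that it should equal R1/(R1+F1) times R2/(R2+F2), since in this particular model the node and the network fail and get repaired independently of each other.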

It is not too bad, just lengthy. You can get P(0,0) = [F1 P(1,0) + F2 P(0,1)]/(R1+R2) from the first equation. You can get P(1,1) = [R2 P(1,0) + R1 P(0,1)]/(F2+F1) from the fourth equation. Now you can substitute these expressions for P(0,0) and P(1,1) into the second (or third) equation and into the sum(P) = 1 equation to get two equations in the two unknowns P(0,1) and P(1,0). It is messy and time-consuming, but is nevertheless straightforward.
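
If the hand algebra gets out of control, one way to cross-check it (assuming you have MATLAB's Symbolic Math Toolbox) is to let the computer do the elimination; this is only a check of the substitution described above, not a replacement for it:

syms R1 R2 F1 F2 p00 p10 p01 p11 positive
eqs = [ (R1+R2)*p00 == F1*p10 + F2*p01, ...
        (F1+R2)*p10 == R1*p00 + F2*p11, ...
        (F2+R1)*p01 == R2*p00 + F1*p11, ...
        p00 + p10 + p01 + p11 == 1 ];
S = solve(eqs, [p00, p10, p01, p11]);
simplify(S.p11)    % closed-form expression for P(1,1) in terms of the rates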

RGV
 
  • #3
Hi Ray :)
Thank you very much, it's helpful.
 

Related to How to solve the Markov transition matrix

1. What is a Markov transition matrix?

A Markov transition matrix is a mathematical tool used to describe the dynamics of a system where the future state depends only on the current state and not on the previous states. It is often used in the field of probability and statistics to model various processes such as weather patterns, stock market trends, and population growth.

2. How do you construct a Markov transition matrix?

To construct a Markov transition matrix, you first define the states of the system and the probability of transitioning from each state to every other state (including staying in the same state). These probabilities are arranged in a square matrix in which row i holds the probabilities of moving from state i to each state, so every row must sum to 1.
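
As a toy illustration in MATLAB (a made-up three-state weather example, not related to the thread above):

% States: 1 = sunny, 2 = cloudy, 3 = rainy (made-up numbers)
% Row i holds the probabilities of moving from state i to each state
P = [0.7 0.2 0.1;
     0.3 0.4 0.3;
     0.2 0.5 0.3];
sum(P, 2)    % every row should sum to 1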

3. How is a Markov transition matrix used to make predictions?

A Markov transition matrix can be used to make predictions about the future state of a system by multiplying the current state vector with the transition matrix. This will give the probabilities of the system being in each state in the next time step. By repeating this process, we can make predictions for multiple time steps into the future.
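
For instance, with the same kind of made-up three-state matrix:

P = [0.7 0.2 0.1;
     0.3 0.4 0.3;
     0.2 0.5 0.3];
p0 = [1 0 0];      % current state: state 1 with certainty (row vector)
p1 = p0 * P        % distribution after one step
p5 = p0 * P^5      % distribution after five steps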

4. Can a Markov transition matrix have negative or decimal values?

A Markov transition matrix cannot contain negative values, since every entry is a probability. Decimal values between 0 and 1, however, are perfectly valid and in fact typical; what matters is that every entry is non-negative and that each row sums to 1.

5. How can you check if a Markov transition matrix is valid?

To check whether a Markov transition matrix is valid, you can perform a few tests. First, the entries in each row should sum to 1. Second, all values in the matrix should be non-negative. Third, the matrix should be square, i.e. have the same number of rows as columns. If all these conditions are met, the matrix is valid.
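
A short sketch of those three checks in MATLAB, using a made-up candidate matrix:

P = [0.7 0.2 0.1;
     0.3 0.4 0.3;
     0.2 0.5 0.3];                            % candidate matrix (made-up example)
isSquare   = size(P,1) == size(P,2);
isNonneg   = all(P(:) >= 0);
rowsSumTo1 = all(abs(sum(P,2) - 1) < 1e-12);  % tolerance for floating-point round-off
isValid    = isSquare && isNonneg && rowsSumTo1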
