Markov processes (Weight Suspended equally by n Cables)

  • Thread starter CTK
  • Tags: Cables
  • #1
CTK
Summary: Markov processes (A weight of L tons is suspended by n cables which share the load equally)

A weight of L tons is suspended by n cables which share the load equally. If k, for 1 ≤ k ≤ n − 1, of the cables have broken, then the remaining (n − k) cables share the load equally. As soon as (n − 1) cables have failed, new cables will be installed instantly to restore the number of cables to n. Let X(t) be the number of unbroken cables at time t. The instantaneous failure rate of a single cable which carries M tons is αM, where α > 0 is a constant. The remaining time to failure of any cable operating at time t is independent of the past history of the process, conditional on X(t).

(a) Specify an appropriate state space for (X(t) : t ≥ 0), and explain why this stochastic process satisfies the Markov property.

(b) Give the generator matrix for (X(t)).

(c) Suppose that individual cables have a failure rate of 0.2 per year per ton of load.

(i) If 4 such cables are used to support 20 tons, find the probability that the system lasts for at least 2 years before reinstalling the cables. Hint: determine the number of states that the process must go through to get to the first reinstallation, and then express the total duration as the sum of the hold times in each of the states to be visited.

(ii) How many cables should be used to ensure with probability 0.999 that the system lasts for at least 2 years before reinstalling the cables?

Note: The sum of n independent Exp(µ) random variables has a Gamma distribution with shape parameter n and rate parameter µ.
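A quick numerical illustration of this Note (with arbitrary illustrative values, not tied to any particular part of the problem): for integer shape n, the Gamma(n, µ) tail, i.e. the Erlang survival function, has the closed form P(T ≥ t) = e^(−µt) · Σ_{k=0}^{n−1} (µt)^k / k!, which can be evaluated directly:

```python
import math

def erlang_sf(n, mu, t):
    """Survival function P(T >= t) for the sum of n i.i.d. Exp(mu)
    random variables, i.e. the tail of a Gamma(shape=n, rate=mu)."""
    return math.exp(-mu * t) * sum((mu * t) ** k / math.factorial(k)
                                   for k in range(n))

# Illustrative values only: 3 exponential stages, rate 4 per year, t = 2 years
print(erlang_sf(3, 4.0, 2.0))
```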

My effort so far:

a-) An appropriate state space is: S = {0,1} where 0 means the cable is broken, and 1 means the cable is unbroken

Now, this stochastic process satisfies the Markov property because it is independent of the previous history of states, depending only on the latest state. That is:

P{X(t+u) = j | H(u), X(u) = i} = P{X(t+u) = j | X(u) = i}

It was given that the remaining time to failure of any cable operating at time t is independent of the history of the process, conditional on X(t). Thus, our process satisfies the Markov property.

b-) The generator matrix (Q) has these elements:

q_ij = r_i * p_ij if i is not equal to j, or q_ij = -r_i if i = j

and since it is given that the instantaneous failure rate of a single cable which carries M tons is αM, where α > 0 is a constant, then we have r0 = αM
Now, could you please correct any of my mistakes?

Also, I don't know how to continue from here. How do I derive the values of p_ij and r_1, etc., for the Q matrix?

Thank you very much in advance for your help.
 
  • #2
Your answer to (a) is incorrect. The state space is the set of all possible values the Markov process can take. Since the process is X(t), the number of unbroken cables, what is the set of all possible values that X(t) can take, given it starts with n unbroken cables?
In answering this, consider whether the system can ever have 0 or 1 unbroken cables, given the statement in the second sentence of your OP. Note the word instantly, meaning the system spends no time having n-1 failed cables, meaning it jumps directly from the state of having n-2 failed cables to one of having no failed cables.
For the next bit, you use terms ri and pij (presumably ##r_i## and ##p_{ij}##) but do not say what they represent. Those are not any standard notation I recognise so I assume they are just used in your course text or notes. You'll need to explain what they represent...
OR
... just write the matrix elements directly. The element ##q_{ij}## is the instantaneous rate of migration from state ##i## (##i## unbroken cables) to state ##j## (##j## unbroken cables). Since the probability of more than one cable breaking at once is zero we know that ##q_{ij}## must be zero if ##j<i-1##. What about for ##j>i##? Hint: there is only one ##ij## combination with ##j>i## for which ##q_{ij}>0##. What is it?
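For readers following along, here is a sketch of how a generator of this shape could be assembled numerically, under the assumptions the hints above point toward (states 2, ..., n; only single-cable failures; instant reinstallation once the chain reaches state 2 and a further cable fails; each of the i cables in state i carries L/i tons). The values n = 4, L = 20, α = 0.2 are purely illustrative:

```python
import numpy as np

def generator(n, L, alpha):
    """Generator matrix Q over states {2, 3, ..., n} (unbroken cables).
    In state i each of the i cables carries L/i tons, so the total
    failure rate out of state i is i * alpha * (L/i) = alpha * L."""
    states = list(range(2, n + 1))          # states 2, 3, ..., n
    idx = {s: k for k, s in enumerate(states)}
    Q = np.zeros((len(states), len(states)))
    for i in states:
        rate = i * alpha * (L / i)          # = alpha * L in every state
        target = i - 1 if i > 2 else n      # from state 2, jump straight back to n
        Q[idx[i], idx[target]] += rate
        Q[idx[i], idx[i]] -= rate
    return Q

# Illustrative values only: n = 4 cables, L = 20 tons, alpha = 0.2
print(generator(4, 20, 0.2))
```

Note that each row sums to zero, as required of a generator matrix.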
 
  • #3
andrewkirk said:
Your answer to (a) is incorrect. The state space is the set of all possible values the Markov process can take...
Hello Andrewkirk, thanks for your reply.

- So after thinking about what you wrote, would the state space be: S = {2, 3, ..., n-1, n}? The reason there is no 0 or 1 is that, as you said, the system can never have 0 or 1 unbroken cables: as soon as (n-1) cables have failed, the system instantly gets restored to n unbroken cables.

- r is the transition rate out of a state, and p[/j] is the transition probability from state i to j, so we have:
qij =ri * pij ; for i not equal to j (i.e., j>i) (sorry, I am no sure how to write ri and p(ij) as you did)
or qij = -r ; for i = j

That's just straight from our lecture notes.

Now my question is, what is the value of r? How to calculate it? Because I need that value before being able to work out the matrix elements.

Thanks for your help, it is much appreciated.
 
  • #4
From the previous post:
CTK said:
- r[ i] is the transition rate out of a state, and p[ i][/j] is the transition probability from state i to j, so we have:
qij[ i] =ri * pij[ i][ i] ; for i not equal to j (i.e., j>i) (sorry, I am no sure how to write ri and p(ij) as you did)
or qij[ i] = -r[ i] ; for i = j

That's just straight from our lecture notes.

Now my question is, what is the value of r[ i]?
Your previous post was mangled due to the inclusion of bracketed expressions with i. Such expressions get rendered by browsers as the start of italicized expressions.

Here is what I think you intended:
##r_i## is the transition rate out of a state, and ##p_{i, j}##
is the transition probability from state i to j, so we have:
##q_{i, j}[ i] =r_i * p_{i, j}[ i][ i]## ; for i not equal to j (i.e., j>i)
or ##q_{i, j}[ i] = -r[ i]## ; for i = j

I'm not sure what you intended with what I wrote as ## p_{i, j}[ i][ i]##
CTK said:
(sorry, I am no sure how to write ri and p(ij) as you did)
Click on any of my TeX expressions to see what I wrote. Also note that there is a link in the lower left corner to our tutorial on LaTeX.
 
  • #5
Mark44 said:
Here is what I think you intended: ...
I'm not sure what you intended with what I wrote as ## p_{i, j}[ i][ i]##
Oops, sorry, I meant to say the following:

##r_i## is the transition rate out of a state, and ##p_{i, j}##
is the transition probability from state i to j, so we have:
##q_{i, j} = r_i * p_{i, j}##
; for i not equal to j (i.e., j>i)
or ##q_{i, j} = -r_i## ; for i = j

So just ignore the ## p_{i, j}[ i][ i]##

So having said that, can anyone please help me with correcting my state space if it is wrong, and especially with figuring out the elements of the matrix ##q_{i, j}##?
Thanks for your help.
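As a hedged illustration of the r_i / p_ij decomposition from the lecture notes (a toy 3-state example with made-up rates, not the cable problem's matrix): given any generator matrix Q, the exit rates and jump-chain probabilities can be recovered as r_i = -q_ii and p_ij = q_ij / r_i for i ≠ j:

```python
def decompose_generator(Q):
    """Split a generator matrix (list of rows) into exit rates r_i = -q_ii
    and jump-chain probabilities p_ij = q_ij / r_i (for i != j)."""
    n = len(Q)
    r = [-Q[i][i] for i in range(n)]
    p = [[Q[i][j] / r[i] if i != j and r[i] > 0 else 0.0
          for j in range(n)] for i in range(n)]
    return r, p

# Toy generator with hypothetical rates (rows sum to zero):
Q = [[-3.0, 2.0, 1.0],
     [0.0, -4.0, 4.0],
     [5.0, 0.0, -5.0]]
r, p = decompose_generator(Q)
print(r)  # exit rates r_i
print(p)  # jump-chain transition probabilities p_ij
```

Going the other way, q_ij = r_i * p_ij reassembles the off-diagonal entries, which is exactly the relation quoted from the notes.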
 

FAQ: Markov processes (Weight Suspended equally by n Cables)

What is a Markov process?

A Markov process is a stochastic process that models the probability of a system transitioning from one state to another over time. It is based on the principle that the future state of the system is only dependent on the current state, and not on any previous states.

How does a Markov process relate to weight suspended equally by n cables?

The weight-and-cables setup is a concrete example of a continuous-time Markov process. The state of the system is the number of unbroken cables, and each cable failure moves the process to a new state. Because the remaining time to failure of each cable depends only on the current state, and not on the past history, the process satisfies the Markov property.

What are the applications of Markov processes?

Markov processes have a wide range of applications, including finance, biology, physics, and computer science. They are commonly used in modeling systems with random and unpredictable behavior, such as stock prices, population growth, and particle movements.

How do you calculate transition probabilities in a Markov process?

The transition probabilities in a Markov process are collected in a transition matrix, whose entry (i, j) gives the probability of moving from state i to state j. These values can be derived from a model of the system or estimated from data by counting how often each transition out of a state is observed.
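As a small illustration (hypothetical observed data, unrelated to the cable problem), transition probabilities of a discrete-time chain can be estimated by counting observed transitions out of each state:

```python
from collections import Counter

def estimate_transition_matrix(sequence, states):
    """Estimate discrete-time transition probabilities by counting
    observed transitions in a state sequence (hypothetical data)."""
    counts = Counter(zip(sequence, sequence[1:]))   # (from, to) pair counts
    totals = Counter(sequence[:-1])                 # visits with a successor
    return {(i, j): counts[(i, j)] / totals[i] if totals[i] else 0.0
            for i in states for j in states}

# Hypothetical observed sequence of states
seq = ['A', 'A', 'B', 'A', 'B', 'B', 'A']
P = estimate_transition_matrix(seq, ['A', 'B'])
print(P)
```

Each row of the estimated matrix sums to 1, since every observed departure from a state goes somewhere.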

What are the limitations of Markov processes?

One of the main limitations of Markov processes is the memoryless assumption: the future depends only on the current state, and in time-homogeneous models the transition rates are assumed constant over time. Real-world systems often exhibit memory or time-varying behavior, which makes them difficult to model accurately with a simple Markov process.
