[markov chain] prove that probability equals 6/pi²

In summary, a Markov chain is a mathematical model for a sequence of events in which the probability of each event depends only on the current state, with the transition probabilities collected in a transition matrix whose rows each sum to 1. This thread asks how to prove that, for the specific chain given in the attached problem, the probability of never returning to state 0 equals 6/π². The replies suggest first establishing that the chain is transient and then computing the return probability using known results on discrete-time birth-death processes; the value 6/π² arises because the relevant series sums to ζ(2) = π²/6.
  • #1
nonequilibrium

Homework Statement


[Problem statement given in the attached image markov.gif; not reproduced here.]


Homework Equations


N/A

The Attempt at a Solution


I'll briefly explain my reasoning so far, but please ignore it if it comes across as too jumbled:
---
Let P denote the Markov matrix associated with this problem. I think I was able to argue that the probability being asked for equals [itex]1- \sum_{n=1}^{+\infty} P^n(0,0)[/itex], where [itex]P^n(0,0)[/itex] denotes the entry in the first row and first column of the n-th power of the Markov matrix.

I then wanted to calculate [itex]P^n(0,0)[/itex] for every n by looking for a pattern in [itex]P^1(0,0)[/itex], [itex]P^2(0,0)[/itex], [itex]P^3(0,0)[/itex], etc. I think I found one: define [itex]t_n = \left(t_{n-1} + \left( \frac{n+1}{n} \right)^2 \prod_{k=1}^{n+1} x_k \right) x_n[/itex] (with [itex]t_0 = x_1[/itex]), where [itex]x_n = p_{n,n-1}[/itex]; then I believe [itex]\sum_{n=1}^{+\infty} P^n(0,0) = \sum_{n=0}^{+\infty} t_n[/itex]. But it seems nearly impossible to prove that 1 minus this expression equals [itex]\frac{6}{\pi^2}[/itex], so I'm probably way off track...
---

Any better suggestions? How would you approach this problem instead?
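As a rough sanity check, the no-return probability can also be estimated by simulation. The sketch below is hypothetical: since the attached image is not reproduced here, it assumes a birth-death chain that always steps from 0 to 1 and has [itex]p_{n,n+1} = (n+1)^2/(n^2+(n+1)^2)[/itex], a guessed form chosen so that the answer comes out to [itex]6/\pi^2[/itex]; the actual transition probabilities from the attachment should be substituted.

[code]
import math
import random

# Hypothetical up-probability p(n -> n+1); replace with the problem's values.
def p_up(n):
    return (n + 1) ** 2 / (n ** 2 + (n + 1) ** 2)

def escapes(max_steps=2_000):
    """Run one excursion from 0; return True if it fails to return to 0."""
    state = 1  # assume the chain always steps 0 -> 1 first
    for _ in range(max_steps):
        if state == 0:
            return False
        state = state + 1 if random.random() < p_up(state) else state - 1
    return True  # count a very long excursion as an escape (slight upward bias)

trials = 20_000
estimate = sum(escapes() for _ in range(trials)) / trials
print(f"simulated no-return probability: {estimate:.3f}")
print(f"6/pi^2 = {6 / math.pi ** 2:.3f}")
[/code]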
 

Attachments

  • markov.gif (6.6 KB)
  • #2
Just an idea. In some circumstances (if [itex]\|P\| < 1[/itex], so that the Neumann series converges) we have

[tex]\sum_{n=0}^{+\infty}{P^n}=(I-P)^{-1}[/tex]

So all you need to do now is to invert I - P...
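For instance, this can be checked numerically on a finite truncation of the chain in which the transitions into state 0 (and past the truncation point) are deleted; the remaining matrix is then strictly substochastic, which is what makes the series converge (the full stochastic matrix has norm exactly 1). A sketch, again with hypothetical transition probabilities since the attachment is not reproduced here:

[code]
import numpy as np

N = 60          # truncation: keep states 1..N (arbitrary cutoff)
TERMS = 50_000  # number of Neumann-series terms to accumulate

# Substochastic matrix Q: the chain restricted to states 1..N, with the
# 1 -> 0 step and all steps past N deleted.
Q = np.zeros((N, N))
for i in range(1, N + 1):
    up = (i + 1) ** 2 / (i ** 2 + (i + 1) ** 2)  # hypothetical p_{i,i+1}
    if i < N:
        Q[i - 1, i] = up
    if i > 1:
        Q[i - 1, i - 2] = 1.0 - up

# Accumulate the (1,1) entry of sum_{n>=0} Q^n via repeated vector products.
v = np.zeros(N)
v[0] = 1.0
series = 0.0
for _ in range(TERMS):
    series += v[0]
    v = v @ Q

direct = np.linalg.inv(np.eye(N) - Q)[0, 0]
print(series, direct)  # the two values should agree
[/code]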
 
  • #3
*tries to invert infinite-dimensional matrix*
 
  • #4
mr. vodka said:

[post #1 quoted in full]

I found the notation confusing at first, but now I see that what you want to prove is that the chain is transient, with Pr{return to 0} = 1-6/pi^2.

The first order of business is to establish that the chain is transient, with P{return to zero} < 1. Since you have a discrete-time birth-death process, quite a lot is known or knowable about your system. For example, look at the 1995 paper by Van Doorn, freely downloadable from
http://journals.cambridge.org/downl...21a.pdf&code=f73435367827c0b479829c68c457666d
(Google search on "birth-death process+discrete time", and look at the 4th entry "Geometric Ergodicity ... "). In particular, Theorem 2.1 may be a starting point. As for the problem of actually computing the return probability, all I can suggest is that you look at the first-passage-probability equations (for end-state 0) and try to solve them, perhaps using z-transform techniques, or something similar.
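For reference, the standard criterion for a discrete-time birth-death chain with up-probabilities [itex]p_k[/itex] and down-probabilities [itex]q_k[/itex] (assuming your attached chain has this form) is that the chain is transient if and only if

[tex]S=\sum_{n=0}^{+\infty}\prod_{k=1}^{n}\frac{q_k}{p_k}<+\infty,[/tex]

in which case the probability of never returning to 0, starting from 0 with the first step forced to state 1, equals [itex]1/S[/itex]. If the ratios in your problem satisfy [itex]q_k/p_k=k^2/(k+1)^2[/itex] — a guess, but it is the form the target answer suggests — the product telescopes to [itex]1/(n+1)^2[/itex], so [itex]S=\pi^2/6[/itex] and the no-return probability is [itex]6/\pi^2[/itex].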

Good luck.

RGV
 

Related to [markov chain] prove that probability equals 6/pi²

1. What is a Markov chain?

A Markov chain is a mathematical model used to describe a sequence of events where the probability of each event depends only on the state of the previous event. It is often used to model random processes such as stock prices, weather patterns, and game outcomes.

2. How is probability related to Markov chains?

In a Markov chain, the probability of transitioning from one state to another is given by a transition matrix. Each row of this matrix lists the probabilities of moving from one particular state to every possible next state, so each row must sum to 1.
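For example, a two-state chain might have the transition matrix

[tex]P=\begin{pmatrix}0.9 & 0.1\\ 0.4 & 0.6\end{pmatrix},[/tex]

where the first row says the chain stays in state 1 with probability 0.9 and moves to state 2 with probability 0.1; both rows sum to 1.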

3. What is the significance of proving that probability equals 6/pi² in a Markov chain?

The value 6/π² is the probability that the specific chain in this problem never returns to state 0. Proving it shows that the chain is transient (unlike a recurrent chain, it escapes to infinity with positive probability) and connects the return probability to the classical series ζ(2) = 1 + 1/4 + 1/9 + ... = π²/6.

4. How is the proof of probability equaling 6/pi² in a Markov chain derived?

For a birth-death chain on the nonnegative integers with up-probabilities [itex]p_k[/itex] and down-probabilities [itex]q_k[/itex], a standard result states that the chain is transient if and only if the series [itex]S=\sum_{n=0}^{+\infty}\prod_{k=1}^{n} q_k/p_k[/itex] converges, in which case the probability of never returning to state 0 equals [itex]1/S[/itex]. For the chain in this problem the series sums to π²/6, giving a no-return probability of 6/π².
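Concretely, if the ratios satisfy [itex]q_k/p_k=k^2/(k+1)^2[/itex] (the form the target answer suggests), the product telescopes:

[tex]S=\sum_{n=0}^{+\infty}\prod_{k=1}^{n}\frac{k^2}{(k+1)^2}=\sum_{n=0}^{+\infty}\frac{1}{(n+1)^2}=\frac{\pi^2}{6},\qquad\text{so}\qquad 1/S=\frac{6}{\pi^2}.[/tex]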

5. Can this proof be applied to all Markov chains?

No. The specific value 6/π² belongs to this problem's transition probabilities; a different transient birth-death chain yields a different series S and hence a different no-return probability 1/S. The method also requires an infinite state space: a finite irreducible chain is recurrent, so its return probability is always 1.
