Markov chains: period of a state

In summary: if state i can be revisited in either 4 or 18 steps, its period is gcd(4, 18) = 2, even though not every multiple of 2 is an achievable return time; only all sufficiently large multiples of 2 are. Many intuitive explanations of the definition make it sound as if every multiple works, but that is not really the case.
  • #1
Hello,

I am trying to understand the intuition of the definition of the period of a state in a Markov chain.

Say, for example, we can go from state i back to state i in either 4 steps or 18 steps.
gcd(4,18) = 2, with gcd = greatest common divisor. So the period of state i is equal to 2.
I find this strange, because in this case we cannot revisit state i at every multiple of 2. What's the intuition behind this?

Thanks
 
  • #2
azay said:
Hello,

I am trying to understand the intuition of the definition of the period of a state in a Markov chain.

Say, for example, we can go from state i back to state i in either 4 steps or 18 steps.
gcd(4,18) = 2, with gcd = greatest common divisor. So the period of state i is equal to 2.
I find this strange, because in this case we cannot revisit state i at every multiple of 2. What's the intuition behind this?

Thanks

By definition, a Markov process is one in which the next state depends only on the current state. Given available states 1, 2, ..., i, ..., n-1, n, the probability of state [tex]S_i[/tex] depends only on state [tex]S_{i-1}[/tex]. If reflexivity is allowed, then [tex]S_i[/tex] can equal [tex]S_{i-1}[/tex]. The duration of a state is usually irrelevant in the mathematical context. The path taken depends on the probabilities of each transition.

If the chain is structured in such a way that [tex]k\geq2[/tex], then periods may be defined for some gcd k. Here, k is the minimum interval of states before the process can return to a previous state. However, intervals of multiples of k are allowed.
 
  • #3
SW VandeCarr said:
If the chain is structured in such a way that [tex]k\geq2[/tex], then periods may be defined for some gcd k. Here, k is the minimum interval of states before the process can return to a previous state. However, intervals of multiples of k are allowed.

But in my example it is not possible to get back to state i in 2 steps, while 2 is the gcd. How does this make '2' the minimum? The minimum is '4'.
 
  • #4
azay said:
But in my example it is not possible to get back to state i in 2 steps, while 2 is the gcd. How does this make '2' the minimum? The minimum is '4'.

A-> B-> A. Two steps (arrows).
 
  • #5
SW VandeCarr said:
A-> B-> A. Two steps (arrows).

It's not possible to go from A to B to A in my example.

Say: A->B->C->D->A (4 steps) is possible and A->E->F->...->A (18 steps) is possible. I don't see how it is possible to get back in 2 steps...

What am I missing?
 
  • #6
azay said:
It's not possible to go from A to B to A in my example.

Say: A->B->C->D->A (4 steps) is possible and A->E->F->...->A (18 steps) is possible. I don't see how it is possible to get back in 2 steps...

What am I missing?

What probabilities are shown in your transition matrix for A->B and A->E? Have you accounted for all other possible transitions in the above scheme? What's the gcd for periods of 4 and 18?
 
  • #7
azay said:
Hello,

I am trying to understand the intuition of the definition of the period of a state in a Markov chain.

Say, for example, we can go from state i back to state i in either 4 steps or 18 steps.
gcd(4,18) = 2, with gcd = greatest common divisor. So the period of state i is equal to 2.
I find this strange, because in this case we cannot revisit state i at every multiple of 2. What's the intuition behind this?

Thanks

You can't visit it for every multiple of 2, but it's pretty close. For example:

You can visit it in 36 steps by doing the 18-step loop twice.
You can also visit it in 38 steps: do an 18-step loop, and then a 4-step loop 5 times.

Once you can visit it in 36 steps, you can visit it in 40 steps by doing the 36-step loop and then a 4-step loop, and in 42 steps by doing a 38-step loop and then a 4-step loop. Similarly, you can visit it in 44 steps, 46 steps, 48 steps, 50 steps, etc.
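Office_Shredder's loop-combining argument can also be checked mechanically. Below is a small sketch (mine, not from the thread) that lists which step counts can be built from the 4-step and 18-step loops: every reachable count is a multiple of gcd(4, 18) = 2, and from 16 onward every even count shows up.

```python
from math import gcd

# Return times to state i are sums of loop lengths: 4a + 18b with a + b >= 1.
loops = [4, 18]
limit = 60

# Dynamic programming over step counts: n is reachable if n minus some
# loop length is reachable (0 counts as reachable, as the empty start).
reachable = {0}
for n in range(1, limit + 1):
    if any(n - L in reachable for L in loops):
        reachable.add(n)
reachable.discard(0)  # an actual return needs at least one loop

period = gcd(*loops)
print("period =", period)
print(sorted(reachable))
# Every reachable count is a multiple of 2, but the small multiples
# 2, 6, 10, and 14 are missing; from 16 on, every even count is reachable.
```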
 
  • #8
Office_Shredder said:
You can't visit it for every multiple of 2, but it's pretty close. For example:

You can visit it in 36 steps by doing the 18-step loop twice.
You can also visit it in 38 steps: do an 18-step loop, and then a 4-step loop 5 times.

Once you can visit it in 36 steps, you can visit it in 40 steps by doing the 36-step loop and then a 4-step loop, and in 42 steps by doing a 38-step loop and then a 4-step loop. Similarly, you can visit it in 44 steps, 46 steps, 48 steps, 50 steps, etc.

Yes, I had seen this pattern. I just found it kind of quirky, because many places that give intuition about this definition make it sound as if you can visit at every multiple of 2, but that's not really the case.

But thanks for confirming :).
 
  • #9
SW VandeCarr said:
What probabilities are shown in your transition matrix for A->B and A->E? Have you accounted for all other possible transitions in the above scheme? What's the gcd for periods of 4 and 18?

There are no other possible transitions. Only A->B and A->E are possible. The gcd is 2...
 
  • #10
azay said:
There are no other possible transitions. Only A->B and A->E are possible. The gcd is 2...

Yes, but this is not the typical kind of process that is usually modeled by a Markov chain. Once the system goes from state A to B, it goes back to A with probability one. Likewise, once it goes to E, it goes back to A with probability one. The only divergence (outbranching) occurs between A and B or E. Here you imply probabilities that are neither 0 nor 1, but don't state them. They only matter in terms of which loop the system will inhabit most often. So effectively, if not technically, you have transitions that are reducible to A->B->A and A->E->A, unless you introduce step-dependent time factors. Even then, you could load these factors on B and E. In this example, the factors would be 2 and 9, assuming each step gets the same value.
 
  • #11
SW VandeCarr said:
Yes, but this is not the typical kind of process that is usually modeled by a Markov chain. Once the system goes from state A to B, it goes back to A with probability one. Likewise, once it goes to E, it goes back to A with probability one. The only divergence (outbranching) occurs between A and B or E. Here you imply probabilities that are neither 0 nor 1, but don't state them. They only matter in terms of which loop the system will inhabit most often. So effectively, if not technically, you have transitions that are reducible to A->B->A and A->E->A, unless you introduce step-dependent time factors. Even then, you could load these factors on B and E. In this example, the factors would be 2 and 9, assuming each step gets the same value.

It was just a theoretical example. Why would it not be possible to have probability 1 for all transitions except A->B and A->E? (The sum of the probabilities of A->B and A->E is obviously 1, with neither equal to 0.) There are no other transitions except the ones I listed. It is a valid Markov chain, right?
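The chain in question can be written out as an explicit transition matrix to confirm this. The sketch below is mine, not from the thread, and assumes a split of p = 0.5 between the two loops (the thread never fixes p); it checks that every row sums to 1 (a valid Markov chain) and reads off the step counts at which a return to A has positive probability.

```python
p = 0.5  # assumed probability of A -> B; A -> E gets 1 - p

# State 0 is A; states 1..3 form the 4-step loop (B, C, D);
# states 4..20 are the 17 intermediate states of the 18-step loop.
n = 21
P = [[0.0] * n for _ in range(n)]
P[0][1] = p        # A -> B
P[0][4] = 1 - p    # A -> E
for s in range(1, 3):
    P[s][s + 1] = 1.0   # B -> C -> D
P[3][0] = 1.0           # D -> A closes the 4-step loop
for s in range(4, 20):
    P[s][s + 1] = 1.0   # E -> F -> ... along the long loop
P[20][0] = 1.0          # last state -> A closes the 18-step loop

# Valid Markov chain: each row is a probability distribution.
print(all(abs(sum(row) - 1.0) < 1e-12 for row in P))  # True

# Return times to A: raise P to successive powers and watch entry (A, A).
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

Pk = P
returns = []
for k in range(1, 23):
    if Pk[0][0] > 0:
        returns.append(k)
    Pk = matmul(Pk, P)
print(returns)  # step counts with a positive chance of being back at A
```

All the printed return times are even, matching the period gcd(4, 18) = 2, even though 2 itself is not among them.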
 
  • #12
azay said:
It was just a theoretical example. Why would it not be possible to have probability 1 for all transitions except A->B and A->E? (The sum of the probabilities of A->B and A->E is obviously 1, with neither equal to 0.) There are no other transitions except the ones I listed. It is a valid Markov chain, right?

Yes.
 
  • #13
SW VandeCarr said:
Yes.

Okay. Thanks for helping me out!
 

Related to Markov chains: period of a state

What is a Markov chain?

A Markov chain is a mathematical model used to describe a system that moves from one state to another over time, where the probability of each move depends only on the current state. It is a type of stochastic process, meaning that it involves random variables and probabilities.

What is the period of a state in a Markov chain?

The period of a state in a Markov chain is the greatest common divisor (GCD) of the set of all possible numbers of steps it takes to return to that state. It is a measure of the regularity or periodicity of the state's behavior.

How is the period of a state calculated?

The period of a state is calculated by finding the GCD of all the possible numbers of steps in which the chain can return to that state. In practice, it is enough to collect the return lengths with positive probability and take their GCD; since adding more lengths can only keep or lower the GCD, it stabilizes after a few terms.
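As a sketch of this recipe (the chains here are made-up toys, not from the thread): track which states are reachable in exactly k steps, record the values of k at which the start state reappears, and take the gcd of those return lengths.

```python
from functools import reduce
from math import gcd

def period(state, adj, max_len=50):
    """gcd of all return lengths through `state`, scanned up to max_len."""
    frontier = {state}          # states reachable in exactly k steps
    lengths = []
    for k in range(1, max_len + 1):
        frontier = {t for s in frontier for t in adj[s]}
        if state in frontier:
            lengths.append(k)
    return reduce(gcd, lengths) if lengths else 0

# Toy chain 0 -> 1 -> 2 -> 0 with an extra arc 2 -> 1: cycles of length
# 3 and 5 pass through state 0, so its period is gcd(3, 5) = 1 (aperiodic).
print(period(0, {0: [1], 1: [2], 2: [0, 1]}))       # 1
# A pure 4-cycle: every return to state 0 takes a multiple of 4 steps.
print(period(0, {0: [1], 1: [2], 2: [3], 3: [0]}))  # 4
```

Scanning up to a fixed `max_len` is enough here because the gcd stops changing once each distinct cycle through the state has been seen.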

What is the significance of the period of a state in a Markov chain?

The period of a state can provide information about the long-term behavior of a system described by a Markov chain. A state with period 1 is called aperiodic: returns to it are possible at every sufficiently large number of steps. A state with period d > 1 is called periodic: it can only be revisited at step counts that are multiples of d.

Can the period of a state be greater than the number of states in a Markov chain?

No. The period divides the length of every return path, including the shortest one, and a shortest return path visits each state at most once, so its length is at most the number of states. Hence the period cannot exceed the number of states. What can happen, as in the 4-step and 18-step example above, is that the period is smaller than every individual return time.
