Statistics - Discrete Markov Chains

In summary, the student is unsure how to find P(X_{n+2}=0|X_{n}=2) because the state at time n+1 is unknown. Responders point out that the trick is to condition on the intermediate state X_{n+1} and sum over its possible values; the student believes they then solved part (b) but still has questions about part (c).
  • #1
GreenPrint

Homework Statement

Homework Equations

The Attempt at a Solution
I'm not really sure if this belongs here or in the precalculus mathematics section. I had to take calculus before taking this class so I'm putting it here.

I'm confused about part (b). I don't really understand how I'm supposed to find [itex]P(X_{n+2}=0|X_{n}=2)[/itex] because I don't know what the state at time n+1 is. Thanks for any help.
 
  • #3
[itex]P(X_{n+2}=0) = P(X_{n+2}=0|X_{n}=2)P(X_{n}=2) + P(X_{n+2}=0|X_{n}=1)P(X_{n}=1) + P(X_{n+2}=0|X_{n}=0)P(X_{n}=0)[/itex]
[itex]P(X_{n+2}=0|X_{n}=2)P(X_{n}=2) = P(X_{n+2}=0) - P(X_{n+2}=0|X_{n}=1)P(X_{n}=1) - P(X_{n+2}=0|X_{n}=0)P(X_{n}=0)[/itex]

I'm not sure how this helps.
 
  • #4
You're using the wrong events.

Do this

[tex]\begin{eqnarray*}
P(X_{n+2} = 0~\vert~X_n=2)
& = & P(X_{n+2} = 0~\vert~X_n=2,~X_{n+1}=0)P(X_{n+1}=0~\vert~X_n=2)\\
& & + P(X_{n+2} = 0~\vert~X_n=2,~X_{n+1}=1)P(X_{n+1}=1~\vert~X_n=2)\\
& & + P(X_{n+2} = 0~\vert~X_n=2,~X_{n+1}=2)P(X_{n+1}=2~\vert~X_n=2)
\end{eqnarray*}[/tex]
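For concreteness, here is a minimal numerical sketch of that conditioning step. The matrix P below is a made-up stand-in for a 3-state chain on {0, 1, 2}, since the actual transition matrix is in the problem statement, which isn't reproduced here:

[code]
import numpy as np

# Hypothetical one-step transition matrix for states {0, 1, 2}.
# Row i, column j holds P(X_{n+1} = j | X_n = i); each row sums to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

# Condition on the intermediate state X_{n+1}, exactly as in the sum above:
# P(X_{n+2}=0 | X_n=2) = sum_k P(X_{n+2}=0 | X_{n+1}=k) * P(X_{n+1}=k | X_n=2)
two_step_manual = sum(P[k, 0] * P[2, k] for k in range(3))

# Equivalently, the two-step transition matrix is the matrix square P @ P
# (Chapman-Kolmogorov), and we read off the (row 2, column 0) entry.
two_step_matrix = (P @ P)[2, 0]

print(two_step_manual, two_step_matrix)  # identical values
[/code]

The second form is why "multi-step transition probabilities" reduce to matrix powers of the one-step matrix.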
 
  • #5
GreenPrint said:
I'm confused about part (b). I don't really understand how I'm supposed to find [itex]P(X_{n+2}=0|X_{n}=2)[/itex] because I don't know what the state at time n+1 is. Thanks for any help.

If you had been told that ##X_0 = 2##, would you have been able to work out the probability that ##X_2 = 0##? Have you really never seen how to get multi-step transition probabilities?

Note: I am waiting for answers to these questions before offering more help.
 
  • #6
Well, for (b) I got .21 and believe I solved the problem correctly. I don't know exactly what (c) is even asking me: find P(X_1 = 0). What exactly are the alphas, and what do they represent? For example, does alpha_1 = .25 mean the probability that X equals zero is .25?
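(For reference, in the usual textbook convention, which I'm assuming applies here since the problem statement isn't shown, the alphas are the initial distribution: [itex]\alpha_j = P(X_0 = j)[/itex]. Conditioning on the starting state then gives

[tex]P(X_1 = 0) = \sum_j P(X_1 = 0~\vert~X_0 = j)\,P(X_0 = j) = \sum_j \alpha_j \, p_{j0},[/tex]

where [itex]p_{j0}[/itex] is the one-step transition probability from state j to state 0.)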

I have indeed never seen how to get multi-step transition probabilities =( but I believe I figured it out correctly and got the answer.

Thanks for your help guys.
 

FAQ: Statistics - Discrete Markov Chains

1. What is a discrete Markov chain?

A discrete Markov chain is a mathematical model that describes the probability of transitioning between different states over time. It is a stochastic process with the Markov property: the probability of the next state depends only on the current state, not on the sequence of states that preceded it.

2. What are the key components of a discrete Markov chain?

The key components of a discrete Markov chain are the states, the transition probabilities, and the initial state distribution. States represent the different possible outcomes or conditions, transition probabilities describe the likelihood of moving from one state to another, and the initial state distribution gives the probabilities of the possible starting states of the chain.
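As a minimal sketch with made-up numbers, these three components map directly onto a couple of arrays:

[code]
import numpy as np

states = [0, 1, 2]                   # the state space

# Transition probabilities: row i, column j is P(next = j | current = i)
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

alpha = np.array([0.25, 0.25, 0.5])  # initial distribution, alpha_j = P(X_0 = j)

# Distribution after one step: P(X_1 = j) = sum_i alpha_i * P[i, j]
dist_1 = alpha @ P
[/code]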

3. How is a discrete Markov chain different from a continuous Markov chain?

The main difference between a discrete Markov chain and a continuous Markov chain is how time is treated. In a discrete-time Markov chain, transitions happen at fixed time steps, so the state is updated at specific points in time. In a continuous-time Markov chain, transitions can occur at any instant, with random holding times between them.

4. What are some common applications of discrete Markov chains?

Discrete Markov chains have a wide range of applications in various fields, including finance, biology, and engineering. Some common applications include modeling stock prices, predicting weather patterns, and analyzing genetic sequences.

5. How can you use a discrete Markov chain to make predictions?

Once a discrete Markov chain is constructed, it can be used to make predictions about future states from the current state and the transition probabilities. The exact distribution of the state after n steps follows from the n-th power of the transition matrix; alternatively, the chain can be simulated many times to estimate the probabilities of future outcomes, as in the sketch below.
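As a sketch of both routes, again using a made-up 3-state matrix and initial distribution (not from the original problem):

[code]
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])
alpha = np.array([0.25, 0.25, 0.5])   # initial distribution

# Exact: the distribution of X_n is alpha times the n-th matrix power of P
n = 10
dist_n = alpha @ np.linalg.matrix_power(P, n)

# Monte Carlo: simulate many sample paths and tally where each one ends up
trials, counts = 100_000, np.zeros(3)
for _ in range(trials):
    state = rng.choice(3, p=alpha)    # draw the starting state
    for _ in range(n):
        state = rng.choice(3, p=P[state])  # one random transition
    counts[state] += 1

print(dist_n, counts / trials)  # the two estimates should be close
[/code]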
