steven187
Hi all,
I'm currently researching stochastic processes. The Gaussian process wasn't hard to tackle; however, I don't understand the Markov process. I understand that a stochastic process is a family of random variables indexed by another variable (usually time). But with a Markov process I can't see this family of random variables the way I can with a Gaussian process. How could I understand this graphically?
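To make this concrete for myself I tried writing down the simplest Markov process I could think of, a random walk, in a few lines of Python (everything here is just my own made-up sketch, so please correct me if I'm picturing it wrongly):

```python
import numpy as np

# A simple random walk: X_0 = 0 and X_{n+1} = X_n + (a step of +1 or -1).
# It is Markov because the next value depends only on the current one.
rng = np.random.default_rng(0)

def random_walk(n_steps):
    steps = rng.choice([-1, 1], size=n_steps)
    return np.concatenate([[0], np.cumsum(steps)])

# Several realisations of the *whole* process; each row is one sample path.
paths = np.array([random_walk(20) for _ in range(5)])
print(paths)

# Reading the array column by column: column n collects samples of the single
# random variable X_n, so the family {X_0, X_1, X_2, ...} sits in the columns.
print("samples of X_10 across the five paths:", paths[:, 10])
```

Is that the right way to see the family of random variables, i.e. each path is one realisation of the whole family, and fixing a time and looking across paths gives one member of the family?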
I also realise that we need two pieces of information to specify it: is it the initial distribution (or an initial point) together with the transition probabilities?
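For that second question, this is roughly what I think those two ingredients look like for a toy two-state chain (the states and numbers are made up purely for illustration):

```python
import numpy as np

# A made-up two-state chain (states 0 and 1), specified by two ingredients:
initial_dist = np.array([0.5, 0.5])        # P(X_0 = 0), P(X_0 = 1)
P = np.array([[0.9, 0.1],                  # row i gives P(X_{n+1} = j | X_n = i)
              [0.2, 0.8]])

rng = np.random.default_rng(0)

def sample_path(n_steps):
    """Draw X_0 from the initial distribution, then step using the transition matrix."""
    x = rng.choice(2, p=initial_dist)
    path = [int(x)]
    for _ in range(n_steps):
        x = rng.choice(2, p=P[x])
        path.append(int(x))
    return path

print(sample_path(10))   # one realisation, e.g. [0, 0, 1, 1, ...]
```

Is that really all the information a Markov process needs, or am I missing something?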
Another thing I don't understand: if these stochastic processes are indexed by time, how are we supposed to know the distribution at a particular point in time when only one thing can actually occur at that point in time?
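My current guess for this last question is that the distribution at a fixed time refers to what you would see across many independent runs of the process, not within a single run; something like the following (again with the made-up two-state chain, repeated here so it runs on its own):

```python
import numpy as np

# Same made-up chain as above.
initial_dist = np.array([0.5, 0.5])
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# At a fixed time n, "the distribution of X_n" would mean: over many independent
# runs of the chain, the fraction of runs sitting in each state at step n.
# One run only shows one value; the randomness is across runs.
n = 5
rng = np.random.default_rng(1)

def state_at_time_n():
    x = rng.choice(2, p=initial_dist)
    for _ in range(n):
        x = rng.choice(2, p=P[x])
    return int(x)

samples = np.array([state_at_time_n() for _ in range(10_000)])
print("empirical P(X_5 = 1):", samples.mean())

# The same distribution from linear algebra: the row vector mu_n = mu_0 P^n.
mu_n = initial_dist @ np.linalg.matrix_power(P, n)
print("distribution of X_5 from mu_0 P^n:", mu_n)
```

Is that the right way to think about it, or does the distribution mean something else?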
Please help,
Regards
Steven