Siron
Hi,
I have some trouble understanding the definition of the Markov property in the general case, since I'm struggling with conditional expectations.
Let $(X_t, t \in T)$ be a stochastic process on a filtered probability space $(\Omega, \mathcal{F}, \mathbb{P})$, adapted to the filtration $(\mathcal{F}_t, t \in T)$, with measurable state space $(S, \mathcal{S})$. The process is called a Markov process if for all $s, t \in T$ with $s \le t$ and for every measurable set $A \in \mathcal{S}$:
$$\mathbb{P}(X_t \in A \mid \mathcal{F}_s) = \mathbb{P}(X_t \in A \mid X_s)$$
This definition is not clear to me. Can someone explain this?
When the state space $S$ is discrete and $T = \mathbb{N}$, the Markov property can be formulated as
$$\mathbb{P}(X_n=x_n|X_{n-1}=x_{n-1},\ldots,X_0=x_0) = \mathbb{P}(X_n=x_n|X_{n-1}=x_{n-1})$$
This definition is clear. Is there a link with the continuous (general) definition?
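To convince myself of the discrete version, I wrote a small simulation sketch (the two-state transition matrix below is just a toy example I made up, not from any particular source): conditioning on the whole past and conditioning on the present state alone should give the same empirical frequencies.

```python
import numpy as np

# Toy two-state chain (states 0 and 1) with a made-up transition matrix;
# row i is the distribution of the next state given the current state i.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

rng = np.random.default_rng(0)

def simulate(n_steps, n_paths):
    """Simulate n_paths independent copies of the chain with uniform random start."""
    X = np.zeros((n_paths, n_steps + 1), dtype=int)
    X[:, 0] = rng.integers(0, 2, n_paths)
    for t in range(n_steps):
        # Move to state 1 with probability P[current state, 1], else to state 0.
        X[:, t + 1] = (rng.random(n_paths) < P[X[:, t], 1]).astype(int)
    return X

X = simulate(2, 200_000)

# Empirical P(X_2 = 1 | X_1 = 1, X_0 = 0): conditioning on the full past.
full_past = X[(X[:, 0] == 0) & (X[:, 1] == 1)]
cond_full = (full_past[:, 2] == 1).mean()

# Empirical P(X_2 = 1 | X_1 = 1): conditioning on the present state only.
present = X[X[:, 1] == 1]
cond_present = (present[:, 2] == 1).mean()

print(cond_full, cond_present)  # both estimates hover around P[1, 1] = 0.6
```

Both conditional frequencies come out close to the one-step transition probability $0.6$, which is exactly what the discrete Markov property predicts: knowing $X_0$ in addition to $X_1$ changes nothing.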
Furthermore, there is also a definition involving conditional expectations, where the Markov property is stated as follows: for every bounded measurable function $f$ and all $s \le t$,
$$\mathbb{E}[f(X_t)|\mathcal{F}_s] = \mathbb{E}[f(X_t)|\sigma(X_s)]$$
This definition is also not clear to me. What's the link with the other definitions? Does someone have a proof of this version?
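One thing I noticed while writing this (please correct me if I'm wrong): taking $f = \mathbf{1}_A$, the indicator of a measurable set $A$, in the conditional-expectation version gives
$$\mathbb{E}[\mathbf{1}_A(X_t) \mid \mathcal{F}_s] = \mathbb{P}(X_t \in A \mid \mathcal{F}_s) \quad \text{and} \quad \mathbb{E}[\mathbf{1}_A(X_t) \mid \sigma(X_s)] = \mathbb{P}(X_t \in A \mid X_s),$$
so with indicator functions it reduces exactly to the probability version. Is the converse direction (from indicators to general bounded measurable $f$) done by a standard approximation argument with simple functions?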
Thanks in advance!
Cheers,
Siron