Question about sample spaces (in probability)

  • Thread starter: AxiomOfChoice
  • Tags: Probability
In summary, a stochastic process is a sequence (or family) of random variables, which are real-valued measurable functions on the sample space \Omega of a probability triple (\Omega, \mathcal M, \mathbb P). For the simple random walk such a sample space can be constructed explicitly, and for Brownian motion its existence follows from Kolmogorov's Existence Theorem; the probability measures on these spaces are more complicated. For any fixed \omega \in \Omega, Brownian motion can also be viewed as a deterministic function of t.
  • #1 AxiomOfChoice
So a stochastic process (e.g., the simple random walk) is defined as a sequence of random variables. And random variables are defined as real-valued (measurable) functions on the sample space [itex]\Omega[/itex] of some probability triple [itex](\Omega, \mathcal M, \mathbb P)[/itex].

Question: In the case of the simple random walk, which is a sequence [itex]X_1, X_2, \ldots[/itex] of i.i.d. random variables with [itex]\mathbb P (X_i = 1) = \mathbb P (X_i = -1) = 1/2[/itex], what on Earth is the sample space on which these random variables are defined? My best guess is that it's the set of all sequences of the form [itex](\pm 1, \pm 1, \pm 1, \ldots)[/itex], with [itex]X_i[/itex] just spitting out the [itex]i[/itex]th entry.
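In code terms, I'm picturing something like this toy Python sketch (purely hypothetical, with the infinite sequence truncated to finitely many coordinates):

[code]
# Hypothetical sketch of the guess: a sample point omega is an infinite
# sequence of +/-1, truncated here for illustration; X_i just reads off
# the i-th coordinate.
omega = (1, -1, -1, 1, 1, -1)   # one sample point (first six coordinates)

def X(i, omega):
    """The i-th random variable is the i-th coordinate projection."""
    return omega[i - 1]

print(X(3, omega))   # prints -1
[/code]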
 
  • #2
Hmmm...now that I think about it, I guess I have the same basic question about simple Brownian motion [itex]B(t)[/itex] in one dimension, which has a continuous index and continuous state space. What's its sample space?
 
  • #3
You are asking an extremely good question. You'll need to find an [tex]\Omega[/tex] on which there exist functions [tex]X_i:\Omega\rightarrow \mathbb{R}[/tex] with the right joint distribution. This is not trivial at all!

In fact, constructing suitable sample spaces can be quite a difficult task. Let me construct a sample space for the random variables in your OP; you will see right away that this is not trivial.

For [tex]n\geq 1[/tex] and [tex](\epsilon_1,...,\epsilon_n)\in \{0,1\}^n[/tex], we put

[tex]I^{(n)}_{\epsilon_1,...,\epsilon_n}=]\sum_{i=1}^n{\epsilon_i2^{-i}},2^{-n}+\sum_{i=1}^n{\epsilon_i2^{-i}}][/tex]

Put [tex]\Omega=[0,1][/tex] with the Borel sigma-algebra and Lebesgue measure. Then we set [tex]B_1=I_1^{(1)}[/tex] and [tex]B_n=\bigcup_{(\epsilon_1,...,\epsilon_{n-1})\in \{0,1\}^{n-1}}{I_{\epsilon_1,...,\epsilon_{n-1},1}^{(n)}}[/tex] for n > 1.

Then [tex]P(B_n)=1/2[/tex]. We put

[tex]X_n(x)=\begin{cases}1 & \text{if}~x\in B_n\\ -1 & \text{if}~x\notin B_n\end{cases}[/tex]

These are the random variables we're looking for. They are identically distributed, and also independent, because

[tex]P(\{X_1=\epsilon_1,...,X_n=\epsilon_n\})=2^{-n}[/tex] for [tex](\epsilon_1,...,\epsilon_n)\in \{-1,1\}^n[/tex].

Finding a sample space for the Brownian motion is also possible, but even more complicated!
This is why probability theorists usually don't care about the particular sample space. They show that a suitable sample space exists, and that's all they need. The only thing important for probability is the functions [tex]X_n[/tex] (really, their joint distribution)...
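As a quick numerical sanity check of the construction above (a Python sketch, assuming numpy is available): drawing x uniformly from [0,1] and reading off its binary digits reproduces the fair-coin probabilities.

[code]
import numpy as np

# Empirical check: Omega = [0,1] with Lebesgue measure, and X_n(x) = +1
# exactly when x lies in B_n, i.e. when the n-th binary digit of x is 1.
# (Dyadic endpoints have measure zero, so the endpoint convention of the
# intervals I^(n) does not matter for this check.)
def X(n, x):
    digit = int(x * 2**n) % 2   # the n-th binary digit of x
    return 1 if digit == 1 else -1

rng = np.random.default_rng(seed=0)
xs = rng.uniform(0.0, 1.0, size=100_000)   # "drawing omega" uniformly

# Each X_n takes the value +1 with probability about 1/2 ...
for n in (1, 2, 3):
    print(n, np.mean([X(n, x) == 1 for x in xs]))   # ~ 0.5

# ... and the four sign patterns of (X_1, X_2) each occur with
# probability about 1/4, as independence requires.
pairs = [(X(1, x), X(2, x)) for x in xs]
for pattern in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
    print(pattern, pairs.count(pattern) / len(pairs))   # ~ 0.25
[/code]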
 
  • #4
micromass said:
You are asking an extremely good question. You'll need to find an [tex]\Omega[/tex] on which there exist functions [tex]X_i:\Omega\rightarrow \mathbb{R}[/tex] with the right joint distribution. This is not trivial at all!

In fact, constructing suitable sample spaces can be quite a difficult task. Let me construct a sample space for the random variables in your OP; you will see right away that this is not trivial.



Finding a sample space for the Brownian motion is also possible, but even more complicated!
This is why probability theorists usually don't care about the particular sample space. They show that a suitable sample space exists, and that's all they need. The only thing important for probability is the functions [tex]X_n[/tex] (really, their joint distribution)...
Wow! Interesting. Thanks. I've got a few more questions, though:

(1) What is wrong with my suggestion about the sample space for the simple random walk?

(2) Is Kolmogorov's Existence Theorem concerned with proving the sample space exists (for, say, Brownian motion)?
 
  • #5
AxiomOfChoice said:
(1) What is wrong with my suggestion about the sample space for the simple random walk?

Ah, I'm sorry, I forgot that you had your own suggestion. Well, your suggestion is a very good [tex]\Omega[/tex], but an [tex]\Omega[/tex] alone is not enough for a probability space. You'll also need a sigma-algebra on it, but I guess you'd take [tex]\mathcal{M}=\mathcal{P}(\Omega)[/tex]. But you will also need a probability measure on [tex]\Omega[/tex], and I don't really see how to define a good probability measure on that space directly...

(2) Is Kolmogorov's Existence Theorem concerned with proving the sample space exists (for, say, Brownian motion)?

Indeed, Kolmogorov's Existence Theorem states exactly that. It provides a suitable probability measure on [tex](\mathbb{R}^T,\mathcal{R}^T)[/tex] such that prescribed finite-dimensional distributions come from this sample space. So yes, it provides the existence of a suitable sample space... And Kolmogorov's Existence Theorem is indeed used in the proof that the Brownian motion exists. A quick look at the proof reveals that the sample space of the Brownian motion can again be taken to be the unit interval with the Borel sigma-algebra. But I guess that the probability measure on that interval will be more complicated...
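To be a bit more precise about what the theorem gives (my rough paraphrase, in the spirit of Billingsley's version): given, for all finite collections [tex]t_1<\cdots<t_n[/tex] in [tex]T[/tex], distributions [tex]\mu_{t_1,\ldots,t_n}[/tex] on [tex]\mathbb{R}^n[/tex] that are consistent under marginalization, there exists a probability measure [tex]P[/tex] on [tex](\mathbb{R}^T,\mathcal{R}^T)[/tex] with

[tex]P(\{\omega\in\mathbb{R}^T : (\omega(t_1),\ldots,\omega(t_n))\in A\})=\mu_{t_1,\ldots,t_n}(A)[/tex]

for every Borel set [tex]A\subseteq\mathbb{R}^n[/tex]. The coordinate maps [tex]X_t(\omega)=\omega(t)[/tex] then form a process with exactly those finite-dimensional distributions.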
 
  • #6
micromass said:
A quick look at the proof reveals that the sample space of the Brownian motion is again the unit interval with the Borel sigma algebra. But I guess that the probability measures on that interval will be more complicated...

I'd love to see the proof. Where are you looking? Is it in a book? I might be able to get it at my school's library...of course, if it's online, that's even better, since you could just link me to it :smile: Thanks, again, for your help!
 
  • #7
AxiomOfChoice said:
I'd love to see the proof. Where are you looking? Is it in a book? I might be able to get it at my school's library...of course, if it's online, that's even better, since you could just link me to it :smile: Thanks, again, for your help!

Check out "Probability and Measure" by Billingsley. Your school's library should have it, since it is considered by many to be the best book on probability there is (although it's probably not suitable for very beginners).

In Theorem 5.2, Billingsley constructs sample spaces for several random variables.
In Theorem 8.1, Billingsley constructs sample spaces for Markov chains.
In Theorem 37.1, Billingsley constructs the Brownian motion; the proof is quite long, however.

You can find Billingsley's book on the internet. But I guess that the mentors won't like me posting the link since it's an illegal download...
 
  • #8
They actually *do* have Billingsley, so I'll head over and check it out. Thanks!

Here's another quick question, just to clarify: If we consider Brownian motion [itex]B(t)[/itex] as a continuously indexed stochastic process on [itex](\Omega,\mathcal M, \mathbb P)[/itex] (where, as you said, [itex]\Omega = [0,1][/itex] and [itex]\mathcal M[/itex] is the Borel sets), and we look at it from the vantage point of [itex]t \mapsto B(t,\omega)[/itex] for fixed [itex]\omega \in \Omega[/itex], we then have something deterministic, right (at least in principle)? I mean, for any fixed [itex]\omega[/itex], the entire path [itex]t \mapsto B(t,\omega)[/itex] is determined, right?
 
  • #9
Yes, if we know [tex]\omega[/tex], then we know [tex]B(t,\omega)[/tex] for every t. But in Brownian motion research, we don't know [tex]\omega[/tex]. It's the same with any random variable X, really: if we know [tex]\omega[/tex], then we of course know [tex]X(\omega)[/tex]; the question is what information we can infer about X without knowing [tex]\omega[/tex] explicitly.

I hope I didn't misunderstand your question...
 
  • #10
micromass said:
Yes, if we know [tex]\omega[/tex], then we know [tex]B(t,\omega)[/tex] for every t. But in Brownian motion research, we don't know [tex]\omega[/tex]. It's the same with any random variable X, really: if we know [tex]\omega[/tex], then we of course know [tex]X(\omega)[/tex]; the question is what information we can infer about X without knowing [tex]\omega[/tex] explicitly.

I hope I didn't misunderstand your question...

No, I don't think you did. It's just that when one sees "plots" of Brownian motion...one is immediately led to ask, "How did they get the plot?" There has to be something deterministic there. And from what I can tell, this is done by fixing an [itex]\omega[/itex] and then plotting [itex]t[/itex] on the horizontal axis and [itex]B(t,\omega)[/itex] on the vertical axis. (I understand that it's not, strictly speaking, possible to plot a true Brownian motion; what I'm assuming is done is that one of the discrete approximations to it is plotted.) I think this is what is meant by associating with Brownian motion the notion of a "random function".
 
  • #11
I think that the easiest way to get plots of Brownian motion is by some kind of Monte Carlo method. That is, instead of calculating the Brownian motion exactly, we simulate its defining rules and plot a function that looks like it.

Of course, the Brownian motion is deterministic in the sense that if we know [tex]\omega[/tex], then we can graph [tex]B(\cdot,\omega)[/tex]. But I can see a number of possible objections:
1) You don't know [tex]\omega[/tex] in the usual applications; you can only approximate it.
2) An explicit formula for the Brownian motion is probably very difficult or impossible to write down (especially if the existence proof is nonconstructive). You should check the existence proof, but I've never seen an easy formula for the Brownian motion...

I'm also by far no expert on Brownian motions, so don't simply believe everything I say...
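For what it's worth, here is a minimal sketch of such a simulation in Python (assuming numpy and matplotlib are available): since the increments [itex]B(t+\Delta t)-B(t)[/itex] are independent normals with variance [itex]\Delta t[/itex], one can just sum Gaussians.

[code]
import numpy as np
import matplotlib.pyplot as plt

# Monte Carlo sketch of a Brownian path on [0, 1]: sum independent
# Gaussian increments with mean 0 and variance dt.
n_steps = 1000
dt = 1.0 / n_steps
t = np.linspace(0.0, 1.0, n_steps + 1)

rng = np.random.default_rng(seed=0)
increments = rng.normal(0.0, np.sqrt(dt), size=n_steps)
path = np.concatenate(([0.0], np.cumsum(increments)))   # B(0) = 0

plt.plot(t, path)
plt.xlabel("t")
plt.ylabel("B(t)")
plt.show()
[/code]

Each run of the program corresponds, informally, to picking a new [itex]\omega[/itex] and drawing the path [itex]t \mapsto B(t,\omega)[/itex].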
 
  • #12
micromass said:
I think that the easiest way to get plots of Brownian motion is by some kind of Monte Carlo method. That is, instead of calculating the Brownian motion exactly, we simulate its defining rules and plot a function that looks like it.

Of course, the Brownian motion is deterministic in the sense that if we know [tex]\omega[/tex], then we can graph [tex]B(\cdot,\omega)[/tex]. But I can see a number of possible objections:
1) You don't know [tex]\omega[/tex] in the usual applications; you can only approximate it.
2) An explicit formula for the Brownian motion is probably very difficult or impossible to write down (especially if the existence proof is nonconstructive). You should check the existence proof, but I've never seen an easy formula for the Brownian motion...

I'm also by far no expert on Brownian motions, so don't simply believe everything I say...
Okay, fair enough. But when you hear someone talking about the interesting properties of Brownian motion, i.e., that the paths are (almost surely) nowhere differentiable, or that they are locally Hölder continuous for any Hölder exponent less than 1/2, they're implicitly fixing an [itex]\omega \in \Omega[/itex] and talking about the behavior of [itex]B(t,\omega)[/itex] as a function on [itex][0,\infty)[/itex], right? I mean...when you say "almost surely," aren't you talking about what happens for all [itex]\omega\in \Omega[/itex] except possibly [itex]\omega \in A[/itex], where [itex]\mathbb P(A) = 0[/itex]?
 
  • #13
Yes, indeed, in those cases they fix an omega and regard [tex]B(t,\omega)[/tex]. For almost all [tex]\omega[/tex], this function [tex]B(\cdot,\omega)[/tex] has weird properties, like being nowhere differentiable and so on.
 
  • #14
AxiomOfChoice said:
one is immediately led to ask, "How did they get the plot?" There has to be something deterministic there. And from what I can tell, this is done by fixing an [itex]\omega[/itex] and then plotting [itex]t[/itex] on the horizontal axis and [itex]B(t,\omega)[/itex] on the vertical axis. (I understand that it's not, strictly speaking, possible to plot a true Brownian motion; what I'm assuming is done is that one of the discrete approximations to it is plotted.) I think this is what is meant by associating with Brownian motion the notion of a "random function".

Writing a computer program to plot Brownian motion at a time resolution input by the user is an excellent way to get an intuitive understanding of it. The first idea that would pop into the average computer programmer's head is to use a step size (say deltaY = 1) for the time interval deltaT = 1: pick a random jump of plus or minus deltaY at intervals of deltaT, plot the jumps, and plot another graph of the sum of all the current jumps at time T. (I think this running sum would be "The Wiener Process" associated with the Brownian motion.)

If the user wants to see Brownian motion at increments of deltaT = 1/2, the natural mistake would be to make deltaY = 1/2 and draw the same type of graph. However, once the user inputs a very small deltaT, say 1/1000, the error in using the proportional deltaY = 1/1000 will be obvious, since the graphs of the sums of the deltaY's at time T won't look right. (I.e., they won't look as variable as they did when deltaY = 1/2 and deltaT = 1/2, so the new process won't be a plausible "interpolation" of the coarser process.)

The computer programmer who has made this mistake will start down the path toward discovering a formula for how deltaY ought to be reduced when deltaT is reduced (see the sketch below). Once that formula is found, Brownian motion can be thought of as a limit of a sequence of stochastic processes, each using a smaller and smaller deltaT and each being a plausible interpolation of its predecessor.
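To make the formula explicit: the variance of the sum at time T is (T/deltaT) · deltaY², which stays fixed at T exactly when deltaY = sqrt(deltaT). A minimal sketch of this scaling (Python, with numpy and matplotlib assumed):

[code]
import numpy as np
import matplotlib.pyplot as plt

# Random-walk approximations to Brownian motion: steps of +/- dy every
# dt units of time.  Taking dy = sqrt(dt) keeps Var(walk at time T)
# equal to T as dt shrinks; dy proportional to dt would flatten the path.
rng = np.random.default_rng(seed=1)
T = 1.0

for dt in (1.0 / 4, 1.0 / 1000):
    n = int(T / dt)
    dy = np.sqrt(dt)                        # the key scaling
    steps = rng.choice([-dy, dy], size=n)   # fair +/- dy jumps
    walk = np.concatenate(([0.0], np.cumsum(steps)))
    plt.step(np.linspace(0.0, T, n + 1), walk, where="post",
             label=f"dt = {dt}")

plt.legend()
plt.xlabel("t")
plt.ylabel("approximate B(t)")
plt.show()
[/code]

With this scaling, the finer walk looks like a plausible refinement of the coarser one rather than a flattened-out version of it.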
 

FAQ: Question about sample spaces (in probability)

1. What is a sample space in probability?

A sample space in probability is the set of all possible outcomes of a random experiment. It is commonly denoted by "S" (or [itex]\Omega[/itex], as in this thread) and is an essential concept in probability theory.

2. How is a sample space determined?

A sample space is determined by considering all the possible outcomes of a random experiment. These outcomes can be listed, organized into a table, or represented using a tree diagram. For example, for two tosses of a coin the sample space is {HH, HT, TH, TT}.

3. Can a sample space change for the same experiment?

No, once the experiment is fully specified, its sample space does not change. However, if the experiment itself is modified, say by changing the conditions or the number of trials, the sample space must be redefined accordingly.

4. What is the relationship between sample space and events in probability?

Events are subsets of the sample space, meaning they are a collection of outcomes from the sample space. The sample space contains all possible outcomes, while events represent specific outcomes or combinations of outcomes within the sample space.

5. Why is understanding sample space important in probability?

Understanding sample space is crucial in probability because it helps predict the likelihood of different outcomes and determine the probability of events occurring. It also provides a foundation for more complex concepts in probability, such as conditional probability and independence.
