Sample space, outcome, event, random variable, probability...

  • #1
fog37
TL;DR Summary
Understand the correct relation between random variable and events...
Hello,
I am solid on the following concepts but less certain on the correct understanding of what a random variable is...
  • Random Experiment: an experiment that has an uncertain outcome.
  • Trials: how many times we sequentially repeat a random experiment.
  • Sample space ##S##: the set of ALL possible individual outcomes of a random experiment. When we perform the experiment, we may know which outcomes may happen but don't know with certainty which one of those outcomes will happen.
  • Event ##E##: an event ##E## is either a single outcome (called elementary event) or a collection of outcomes from the sample space ( called composite event, i.e. a subset of sample space ##S##).
  • When an event happens/occurs, it means that just one of the outcomes in the set ##E## has occurred/materialized.
  • Probability: a real-valued function that associates a real number between 0 and 1 to an event ##E##. So different events may have different probabilities. The higher the number the more likely it is for that event ##E## to occur. So probability is a function that associates real numbers to different events.
  • Random variable (r.v.): also a real-valued function. I would say that it too associates a real number to different events... Is that correct?
Example:
  • Random experiment: throwing two fair dice.
  • Number of Trials: 1
  • Sample space ##S={(1;1) , (2;1) , (3;1) , ..., (6;6)} ##.
  • Number of outcomes: 36
  • Random variable ##X##: "the sum of the faces, face i and face j, of the two dice". Is this statement really representing the rule ##X: i+j##? Or is a random variable ##X## a real-valued function whose input is a particular event ##E##? A random variable clearly associates a number to different events. But the probability function also associates a number, the probability value, to different events. Is a random variable simply a function that "relabels" particular events?

##X## labels the elementary event ##{(1;1)}## as ##2##
##X## labels the composite event ##{(2;1), (1;2)}## as ##3##
##X## labels the composite event ##{(3;1), (1;3), (2;2)}## as ##4##

The rule behind the random variable ##X## is really to assign a number to events whose outcomes have something in common, in this case their sum being the same number...That sounds convoluted...
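This grouping idea can be sketched directly: a minimal Python illustration (my own example, not from the thread) that treats ##X## as a function on individual outcomes and then collects the outcomes sharing the same value of ##X## into events.

```python
from itertools import product
from collections import defaultdict

# Sample space: all 36 ordered pairs (i, j) from two dice.
sample_space = list(product(range(1, 7), repeat=2))

# Random variable X: a function on outcomes; here, the sum of the faces.
def X(outcome):
    i, j = outcome
    return i + j

# Group outcomes by their X-value: each group is the event {X = k}.
events = defaultdict(list)
for outcome in sample_space:
    events[X(outcome)].append(outcome)

print(events[2])  # [(1, 1)]          -- the elementary event labeled 2
print(events[3])  # [(1, 2), (2, 1)]  -- the composite event labeled 3
```

Note that ##X## is defined on outcomes, and the events ##\{X = k\}## arise as the sets of outcomes mapped to the same number, rather than ##X## taking events as inputs.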

Thank you for any clarification.
 
  • #2
The meaning of a "random variable" can range from simple to abstract. (An abstract definition in terms of measurable function from a sample space to a measure space can be found here.) At this point, I don't think you will gain anything from worrying about the abstract definition. The immediate goal is to learn about the mean, variance, and expected values of real-valued random variables. In that context, you are almost correct.
To be more precise, it is assigning real numbers to events in the sample space.
EDIT: With more thought, it is not clear to me whether the requirement that a random variable be a measurable function imposes further significant restrictions.
 
  • #3
An alternate approach which may help is looking at probability as functions on a measure space where the total measure is one. A random variable is a function on that space. Probability is the measure (in the simplest case, the length) of the region of interest.
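A quick numerical sketch of this "probability as length" picture (my own illustration, assuming the uniform distribution on [0, 1]): the probability of landing in a sub-interval is just its length, which a Monte Carlo sample approximates.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Under the uniform distribution on [0, 1], the probability of
# landing in a sub-interval [a, b] is simply its length, b - a.
def interval_probability(a, b):
    return b - a

# Monte Carlo check (illustrative, not exact):
n = 100_000
hits = sum(1 for _ in range(n) if 0.2 <= random.random() <= 0.5)
freq = hits / n

print(interval_probability(0.2, 0.5))  # 0.3 exactly
print(freq)                            # close to 0.3
```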
 
  • #4
The term "random variable" confused me a lot at first. Anything which is random and returns a real number (or a vector in space, or whatever) is a random variable. Every probability distribution is naturally coupled with a random variable: the value that the distribution takes. In some sense this is the only random variable. If ##X## is the uniform random variable on [0,1] (notice it's a random variable), then ##\sin(X)## is also a random variable, which you can likewise think of as assigning a probability to each outcome; it differs from ##X## only in its probability density function.
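The point that ##\sin(X)## is itself a random variable, with its own distribution, can be checked numerically. A small sketch (my own, with the sample size chosen arbitrarily): sample ##X## uniformly on [0, 1] and compare the empirical means, which differ because the densities differ.

```python
import math
import random

random.seed(1)  # fixed seed for reproducibility

# X: uniform random variable on [0, 1], represented by samples.
samples_X = [random.random() for _ in range(100_000)]

# sin(X) is itself a random variable: same underlying randomness,
# but a different distribution.
samples_sinX = [math.sin(x) for x in samples_X]

mean_X = sum(samples_X) / len(samples_X)          # ≈ 1/2
mean_sinX = sum(samples_sinX) / len(samples_sinX)  # ≈ 1 - cos(1) ≈ 0.46

print(mean_X, mean_sinX)
```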
 
  • #5
I would say that sometimes we are interested in a certain outcome, but sometimes we are interested in some function of the outcome, for example the sum of the faces in a throw of two fair dice.

For example, "##X## = the sum equals 7" corresponds to the composite event ##E_7 = {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)}##, to which the real-valued function ##X## assigns the number 7. The occurrence of any one of the elementary events in ##E_7## will trigger the random variable to take the value 7: ##X = 7##.
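The event ##E_7## and its probability can be enumerated directly; a short sketch (my own illustration) that builds ##E_7## from the rule ##i + j = 7## and counts favorable outcomes:

```python
from itertools import product

# All outcomes of two fair dice whose faces sum to 7: the event E_7.
E7 = [(i, j) for i, j in product(range(1, 7), repeat=2) if i + j == 7]

print(E7)             # [(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)]
print(len(E7) / 36)   # P(X = 7) = 6/36, about 0.1667
```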

Another way to think about a random variable, in my book, is as the "numerical" outcome of a random experiment (I usually think of an event, subset of sample space, as being the outcome of a random experiment).

Question: does throwing a coin multiple times represent multiple trials of the same random experiment? Does each throw represent the realization of a different random variable, or of the same random variable?
 
  • #6
We have to distinguish between the mathematical theory of probability versus its common interpretations when applied to practical situations.

fog37 said:
  • Event ##E##: an event ##E## is either a single outcome (called elementary event) or a collection of outcomes from the sample space ( called composite event, i.e. a subset of sample space ##S##).

fog37 said:
  • When an event happens/occurs, it means that just one of the outcomes in the set ##E## has occurred/materialized.

In the mathematical theory, there is indeed a set of points, which may be called "outcomes", but there is no definition for an outcome "occurring". It is possible that not all subsets of the set of outcomes can be assigned a probability. For example, if the set of outcomes is the set of real numbers, there is the phenomenon of "non-measurable" subsets. This is a technical topic - about which I have no intuitive appreciation! In practical applications, the subsets we care about can be assigned probabilities. However, in practical situations we sometimes cannot determine which outcome occurs. For example, if a voltage is represented by a real number, we often measure it with only finite precision.
fog37 said:
  • Probability: a real-valued function that associates a real number between 0 and 1 to an event ##E##. So different events may have different probabilities. The higher the number the more likely it is for that event ##E## to occur. So probability is a function that associates real numbers to different events.

In the mathematical theory, a probability function need only assign a real number between 0 and 1 to each set in some collection of subsets of outcomes. This collection is called a "sigma algebra of subsets". In cases where there are only a finite number of outcomes, the sigma algebra we use in practical applications does consist of all possible subsets of outcomes.

A probability function (called a "probability measure" in the mathematical theory) has the additional requirements that it assigns 0 to the empty set of outcomes and 1 to the complete set of outcomes.
fog37 said:
  • Random variable r.v. : also a real-valued function. I would say that it also associates a real number to different events....Is that correct?

In the mathematical theory, a random variable is a "measurable function". That definition involves technicalities that I myself would have to look up. For practical applications, your definition is correct.

fog37 said:
The higher the number the more likely it is for that event to occur.

Thinking about the meaning of that statement reveals the distinction between probability theory versus applications of it. The phrase "more likely it is for that event to occur" seems to suggest that a high probability of an event is some sort of guarantee that the event definitely will occur many times (in repeated independent experiments). However, such a firm guarantee would contradict the concept of probability!

In practice, we develop probability models where the predicted probabilities for events are approximately the same as the observed frequencies of those events. Essentially, we assume that we will have at least "average luck" with our observed frequencies. However, mathematical probability theory itself makes no firm guarantees about observed frequencies.
 
  • #7
Thank you! Fascinating topic!
I am getting more and more clarity on these matters.
 

FAQ: Sample space, outcome, event, random variable, probability...

What is a sample space in probability theory?

A sample space is the set of all possible outcomes of a random experiment. It encompasses every potential result that could occur, providing a comprehensive list of what might happen. For example, for a coin toss, the sample space is {Heads, Tails}.

What is an outcome in the context of probability?

An outcome is a single possible result of a random experiment. It is an element within the sample space. For instance, when rolling a six-sided die, each face (1, 2, 3, 4, 5, 6) represents a different outcome.

What is an event in probability theory?

An event is a subset of the sample space that includes one or more outcomes. Events can be simple (consisting of a single outcome) or compound (consisting of multiple outcomes). For example, in rolling a die, the event of rolling an even number includes the outcomes {2, 4, 6}.

What is a random variable?

A random variable is a function that assigns a numerical value to each outcome in the sample space of a random experiment. There are two main types: discrete random variables, which have countable values, and continuous random variables, which have values in a continuous range. For example, the number of heads in three coin tosses is a discrete random variable.
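The three-coin-toss example can be made concrete with a short enumeration (my own sketch): list all 8 equally likely outcomes, apply the random variable "number of heads" to each, and read off its distribution.

```python
from itertools import product
from collections import Counter

# All 8 equally likely outcomes of three coin tosses.
outcomes = list(product("HT", repeat=3))

# Random variable: the number of heads in an outcome.
heads = Counter(o.count("H") for o in outcomes)

# Distribution of the random variable:
for k in sorted(heads):
    print(k, heads[k] / 8)  # 0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8
```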

What is probability and how is it calculated?

Probability is a measure of the likelihood that a particular event will occur. It is calculated by dividing the number of favorable outcomes by the total number of possible outcomes in the sample space. For example, the probability of rolling a 3 on a six-sided die is 1/6, as there is one favorable outcome and six possible outcomes.
