Examples of Measures in context of probability

  • Thread starter: woundedtiger4
  • Tags: Probability
In summary, probability measures are functions that assign a number between 0 and 1 to each set in a sigma algebra of events. Probability mass functions and probability density functions can be used to compute probability measures, but they are not probability measures themselves. Examples of measures in probability include "ordinary" probability, Baire measure, and Wiener measure, which are all variations on the theme of probability measure.
  • #1
woundedtiger4
Hi all,
Can someone please give me a few examples of a measure in the context of probability? For example, my text says:
A probability measure P over a discrete set of events is basically what is known as a probability mass function. For example, given a probability measure P and two sets A, B ∈ ℬ, we can familiarly write
P(B|A) = P(A ∩ B)/P(A).
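For a concrete instance of this formula: roll a fair die and take A = {2, 4, 6} (an even roll) and B = {4, 5, 6}. Then P(B|A) = P(A ∩ B)/P(A) = P({4, 6})/P({2, 4, 6}) = (2/6)/(3/6) = 2/3.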

Is the probability mass function the only measure that we deal with in measure theory (in the context of probability), or do there exist other measures in probability?
Thanks in advance.
 
  • #2
In probability theory, continuous distributions are also studied. The subject involves anything concerning a measure whose total mass is one.
 
  • #3
Woundedtiger, I don't know your background in measure theory, but essentially, modern probability is wholly measure-oriented (thanks to Kolmogorov). One exploits counting measures (e.g. for probabilities of discrete random variables, pmfs, ...) and Lebesgue measure (for probabilities of continuous random variables). Radon measures also represent probabilities under some special requirements. If you know the Radon-Nikodym theorem, then probability density (or mass) functions are Radon-Nikodym derivatives of distributions with respect to dominating measures (usually Lebesgue or counting measure).
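To spell out the Radon-Nikodym picture in symbols (a standard fact, in the notation above): if [itex]\mu[/itex] is the dominating measure, then [itex]P(A) = \int_A f \, d\mu[/itex] with [itex]f = dP/d\mu[/itex]; here f is the pdf when [itex]\mu[/itex] is Lebesgue measure and the pmf when [itex]\mu[/itex] is counting measure.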
 
  • #4
Every "probability mass function" (the anti-derivative of the probabilty density function) can be interpreted as a measure and (after "normalizing" to get the requirement that the total probability be 1) every measure can be interpreted as a probability mass function.
 
  • #5
HallsofIvy said:
Every "probability mass function" (the anti-derivative of the probabilty density function) can be interpreted as a measure and (after "normalizing" to get the requirement that the total probability be 1) every measure can be interpreted as a probability mass function.

Shouldn't that be 'cumulative distribution function' (or 'distribution'), if we are talking about the antiderivative of the pdf? The pmf is (in a weak sense) the discrete counterpart of the pdf, and if one thinks of the pdf in the Radon-Nikodym sense with respect to a dominating measure, the pdf and pmf coincide, I think.
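In the usual notation: [itex]F(x) = \int_{-\infty}^{x} f(t)\,dt[/itex] in the continuous case, with discrete analogue [itex]F(x) = \sum_{t \le x} f(t)[/itex]; it is the cdf F, not the pdf f, that is an antiderivative.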
 
  • #6
My understanding was that what woundedtiger4 was calling the "probability mass function" was what I (and you, apparently) would call the "cumulative distribution function".
 
  • #7
Well, I would expect that the conditional probability which woundedtiger4 expresses by his P(B|A) = ... is in terms of pdfs (pmfs) in their proper meaning, not cdfs. However, the OP is a bit unclear.

Generally, I'm wary of nomenclature abuse, and if this is the case, I'd suggest woundedtiger review what pdf/cdf/pmf mean.
 
  • #8
HallsofIvy said:
Every "probability mass function" (the anti-derivative of the probabilty density function) can be interpreted as a measure and (after "normalizing" to get the requirement that the total probability be 1) every measure can be interpreted as a probability mass function.

Do you mean that the probability density function is an antiderivative? I studied it in the context of integration:
[attached equation image: eq0005MP.gif]

Please correct me if I am wrong.

camillio said:
Woundedtiger, I don't know your background in measure theory, but essentially, modern probability is wholly measure-oriented (thanks to Kolmogorov). One exploits counting measures (e.g. for probabilities of discrete random variables, pmfs, ...) and Lebesgue measure (for probabilities of continuous random variables). Radon measures also represent probabilities under some special requirements. If you know the Radon-Nikodym theorem, then probability density (or mass) functions are Radon-Nikodym derivatives of distributions with respect to dominating measures (usually Lebesgue or counting measure).
Hi, I am self-taught and have no formal guidance, except that I discuss concepts here on this great website or at openstudy. I am reading Kolmogorov's intro to real analysis at the moment, and I have only two chapters left to finish the book; I haven't studied the Radon-Nikodym theorem yet, but hopefully I will reach that topic within a week. Actually, I am trying to study measure theory in the context of probability. I am also studying a book on probability by Henk Tijms, but I am reading it quite slowly and am still on the first chapter after four days. After reading about PMFs I found them quite interesting, and then I found some summary notes on measure theory on Google in which it was written that a PMF is a measure; that gave me a much better understanding of the Lebesgue integral (its domain, how it works, etc.). Therefore I was wondering if someone could share more measures (of probability) so that I can strengthen my basic grasp of this extremely interesting subject.
 
  • #9
Hi woundedtiger4, I like your approach :-)

Well, there are several measures in probability theory and related theories, e.g.:
  • "ordinary" probability, i.e. a measure defined on a probability space equipped with a sigma algebra of events;
  • one often speaks of a Baire measure when P is a regular, finitely additive measure on the Borel sigma algebra B(E) of a topological space E; it is of interest, for instance, in stochastic processes, and it coincides with the Borel measure on metric spaces;
  • the Wiener measure, the measure associated with Brownian motion (aka the Wiener process); see the sketch after this list.

Some other measures were given in previous posts. Those listed in this post are often variations on the theme of probability measure.
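As a loose illustration of the Wiener measure item, here is a minimal sketch (assuming NumPy is available) that samples one discretized path of standard Brownian motion; the law of the full random path is the Wiener measure on path space:
[code]
import numpy as np

# Sample one discretized Brownian path on [0, T]:
# W_0 = 0, plus independent Gaussian increments of variance dt.
rng = np.random.default_rng(0)
n, T = 1000, 1.0
dt = T / n
increments = rng.normal(loc=0.0, scale=np.sqrt(dt), size=n)
path = np.concatenate([[0.0], np.cumsum(increments)])
[/code]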
 
  • #10
woundedtiger4 said:
Hi all,
Can someone please give me a few examples of a measure in the context of probability? For example, my text says:
A probability measure P over a discrete set of events is basically what is known as a probability mass function.

We can let the text slide by since it says "is basically" instead of "is". Neither the probability density function nor the cumulative distribution function of a random variable is a probability measure. They can be used to compute probability measures, and perhaps it isn't much of an exaggeration to say that the probability density function "is basically" a measure.

Let U be the uniform distribution on the interval [0, 1/2], so the probability density function of U is f(x) = 2 on that interval. A probability measure is defined as a function that assigns a number between 0 and 1 to each set in a sigma algebra of sets. I won't try to get into the technicalities of a sigma algebra; let's just look at a particular set. Let s be the union of the intervals [0, 1/8] and [3/8, 1/2]. We can compute the probability that a realization of U falls in the set s by computing [itex] \mu(s) = \int_0^{1/8} f(x)\,dx + \int_{3/8}^{1/2} f(x)\,dx [/itex]. This process assigns a number (a probability) to the set s.

For many other types of sets s, we can apply the same sort of process by integrating the probability density function over them. The "probability measure" is the function defined by this process. It is a function [itex] \mu(s) [/itex] whose argument [itex] s [/itex] is a set, not a number. The function defined by this process is more general than the cumulative distribution function: the cumulative distribution function F(x) only assigns probabilities to sets of the form [itex] (-\infty, x] [/itex]. The process [itex] \mu(s) [/itex] involves using the probability density function, but it is not the same function as the probability density function.
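Here is a minimal computational sketch of this example (assuming SciPy is available, and representing s as a finite union of disjoint intervals):
[code]
from scipy.integrate import quad

def f(x):
    # density of the uniform distribution on [0, 1/2]
    return 2.0 if 0.0 <= x <= 0.5 else 0.0

def mu(intervals):
    # probability measure of a finite union of disjoint intervals:
    # integrate the density over each piece and add the results
    return sum(quad(f, a, b)[0] for (a, b) in intervals)

s = [(0.0, 1/8), (3/8, 1/2)]
print(mu(s))  # 2*(1/8) + 2*(1/8) = 0.5
[/code]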



For example, given a probability measure P and two sets A, B ∈ ℬ, we can familiarly write
P(B|A) = P(A ∩ B)/P(A).

I don't know what that example has to do with the original question. It's actually very hard to define conditional probabilities in the context of measure theory.

Is the probability mass function the only measure that we deal with in measure theory (in the context of probability), or do there exist other measures in probability?
Thanks in advance.

There exist more general measures.

In elementary probability texts, you encounter two types of random variables. Those that take discrete values have "probability mass functions". Those that take a continuum of values have a "probability density function". Neither type of function "is" a probability measure, but both types can be used to define probability measures.

In the case of a discrete random variable, you define the process [itex] \mu(s) [/itex] by summation instead of integration. For example, let X be the random variable with PMF given by f(0) = 1/3, f(1) = 2/3, and f(x) = 0 otherwise. Let s be the interval [-1, 1/2]. To compute [itex] \mu(s) [/itex] you add up all the non-zero values of f that occur in that interval (getting an answer of 1/3 in this example, since only the point 0 lies in s).
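The same discrete example as a minimal sketch (plain Python, representing the set s by its indicator function):
[code]
pmf = {0: 1/3, 1: 2/3}

def mu(indicator):
    # measure of the set s: sum the masses at the support points
    # of the pmf that the indicator says belong to s
    return sum(p for x, p in pmf.items() if indicator(x))

print(mu(lambda x: -1 <= x <= 0.5))  # 1/3: only x = 0 lies in [-1, 1/2]
[/code]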

There are situations in elementary probability that involve random variables that are not purely discrete and not purely continuous. For example, define the random variable X as follows. Flip a fair coin. If the result is heads then X = 1/2. If the result is tails then set X equal to the realization of a uniform random variable on the interval [0,1]. If you want to know the probability that X falls in the interval s = [1/3, 2/3], you can't get the right answer by doing the type of integral used in introductory calculus and you can't get the right answer by a summation. You need a combination of both methods. A commonsense person would compute the answer as [itex] \mu(s) = \frac{1}{2}(1) + \frac{1}{2} \int_{1/3}^{2/3} 1 \, dx = \frac{2}{3} [/itex], the first term because the atom at 1/2 certainly lies in s. If we grant that a commonsense person can find a procedure for every such set s in a sigma algebra of sets, then his method defines a probability measure.
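A minimal sketch of this mixed case (plain Python, taking s to be an interval [a, b]):
[code]
def mu(a, b):
    # discrete part: the atom at 1/2 carries probability 1/2 (the heads branch)
    atom = 1.0 if a <= 0.5 <= b else 0.0
    # continuous part: uniform on [0, 1], so the measure of [a, b]
    # is the length of its overlap with [0, 1]
    cont = max(0.0, min(b, 1.0) - max(a, 0.0))
    return 0.5 * atom + 0.5 * cont

print(mu(1/3, 2/3))  # 0.5*1 + 0.5*(1/3) = 2/3
[/code]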
 
  • #11
Stephen Tashi said:
We can let the text slide by since it says "is basically" instead of "is". …

thanks a tonne for the very detailed explanation.
 

