Statistics: E(X) = Integral(0 to infinity) of (1-F(x))dx

In summary: the source states that if X is nonnegative, then E(X) = Integral(0 to infinity) of (1-F(x))dx, where F(x) is the cumulative distribution function of X, but gives no proof. The thread asks for a proof and whether the result holds for discrete as well as continuous random variables; the replies below show it holds for any nonnegative random variable.
  • #1
kingwinner
"If X is non-negative, then E(X) = Integral(0 to infinity) of (1-F(x))dx, where F(x) is the cumulative distribution function of X."

============================

First of all, does X have to be a continuous random variable here? Or does the above result hold for both continuous and discrete random variables?

Secondly, the source that states this result gives no proof of it. I searched the internet but was unable to find a proof of it. I know that by definition, since X is non-negative, we have E(X) = Integral(0 to infinity) of x f(x)dx where f(x) is the density function of X. What's next?

Thanks for any help!
 
  • #2
kingwinner said:
"If X is non-negative, then E(X) = Integral(0 to infinity) of (1-F(x))dx, where F(x) is the cumulative distribution function of X."
...
E(X) = Integral(0 to infinity) of x f(x)dx where f(x) is the density function of X. What's next?
Well, the thing you know has an x in it, but the thing you're trying to get to doesn't... and the thing you're trying to get to has an F in it, but the thing you know has the derivative of F in it...
 
  • #3
This works for discrete, continuous, and mixed random variables, though the density statement E(X) = Integral(0 to infinity) of x f(x)dx applies only in the continuous case. Use integration by parts for the continuous case.
 
  • #4
But for a discrete random variable X, would it still make sense to talk about "integration" (i.e. INTEGRAL(0 to infinity) of (1-F(x))dx)? Or should it be replaced by a (sigma) sum?

Do you mean using integration by parts for the expression of the definition of E(X)? What should I let u and dv be?

Thanks!
 
  • #5
No, in the discrete case you would be using sums instead of integrals, since expectation is then defined in terms of a sum, not an integral. For the continuous case: if u = 1 - F(x), then du = -f(x) dx, and if dv = dx, then v = x. So now you have x*S(x) (evaluated between your limits) + integral(x*f(x) dx), where S(x) = 1 - F(x) is the survival function. The first term vanishes, since x*S(x) -> 0 as x -> "infinity" and x*S(x) = 0 at x = 0. And so you are left with the usual definition of E(X).

Note: this is a very handwavy proof as you would really want to be rigorous when talking about the limits that make the first term vanish.
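
For concreteness, here is the discrete analogue those sums give; a sketch, assuming X takes values in the nonnegative integers:

[tex]E(X)=\sum_{k=1}^{\infty}k\,P(X=k)=\sum_{k=1}^{\infty}\sum_{n=0}^{k-1}P(X=k)=\sum_{n=0}^{\infty}P(X>n)=\sum_{n=0}^{\infty}\bigl(1-F(n)\bigr)[/tex]

Swapping the order of summation here is the discrete counterpart of the integration by parts above.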
 
  • #6
kingwinner said:
But for a discrete random variable X, would it still make sense to talk about "integration" (i.e. INTEGRAL(0 to infinity) of (1-F(x))dx)? Or should it be replaced by a (sigma) sum?
It does when you learn measure theory. Until then, just replace it with a sum without thinking about it.
 
  • #7
kingwinner said:
What should I let u and dv be?
Did you think about that question at all? I practically told you what u and dv should be in post #2...
 
  • #8
NoMoreExams said:
No, in the discrete case you would be using sums instead of integrals, since expectation is then defined in terms of a sum, not an integral. For the continuous case: if u = 1 - F(x), then du = -f(x) dx, and if dv = dx, then v = x. So now you have x*S(x) (evaluated between your limits) + integral(x*f(x) dx), where S(x) = 1 - F(x) is the survival function. The first term vanishes, since x*S(x) -> 0 as x -> "infinity" and x*S(x) = 0 at x = 0. And so you are left with the usual definition of E(X).

Note: this is a very handwavy proof as you would really want to be rigorous when talking about the limits that make the first term vanish.
Just one point I am having trouble with:

[tex]\lim_{x\rightarrow\infty}x(1-F(x))[/tex]

This gives "infinity times 0", which is an indeterminate form. I tried many different ways but was still unable to figure out what the limit is going to be... how can we prove that the limit is equal to 0?

Thanks!
 
  • #9
Hurkyl, you don't "need" measure theory to write a sum as an integral. The Riemann-Stieltjes integral will do that.
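
In that notation (a unifying form, not spelled out in this post), the expectation reads the same for discrete and continuous X:

[tex]E(X)=\int_0^\infty x\,dF(x)[/tex]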
 
  • #10
Another way to do it is to write the expected value as

[tex]E[X]=\int_{0}^{\infty}sf(s)ds = \int_{s=0}^{\infty}\int_{x=0}^{s}f(s)dxds[/tex]

and then change the order of the integrals to get your formula. To see what the new bounds on the integrals would be, draw a picture of the region of integration. You can use this same approach to find that

[tex]E[X^2] = \int_{0}^{\infty}s^2f(s)ds = \int_{s=0}^{\infty}\int_{x=0}^{s}2xf(s)dxds=
\int_{0}^{\infty}2x(1-F(x))dx
[/tex]

which is also valid for X nonnegative.
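
The same change-of-order argument generalizes to higher moments; a sketch, assuming X is nonnegative and the moment exists:

[tex]E[X^n]=\int_0^\infty n\,x^{n-1}(1-F(x))\,dx[/tex]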
 
  • #11
NoMoreExams said:
No, in the discrete case you would be using sums instead of integrals since expectation is defined in terms of a sum not integrals.
Or, equivalently, write the probability distribution as a sum of delta functions.
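
Concretely (an illustration, assuming X takes values 0, 1, 2, ... with probabilities p_k):

[tex]f(x)=\sum_{k=0}^{\infty}p_k\,\delta(x-k),\qquad E(X)=\int_0^\infty x\,f(x)\,dx=\sum_{k=0}^{\infty}k\,p_k[/tex]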
 
  • #12
Integration by parts:

[tex]m_X=\int^{\infty}_{0} x f_X(x) dx = -\int^{\infty}_{0} x (-f_X(x) dx)[/tex] eq(1)

Let [tex] u=x [/tex] and [tex]dv = -f_X(x) dx [/tex]

Thus [tex] du=dx [/tex] and [tex] v = 1-F_X(x) [/tex]

Check that [tex] dv/dx = d/dx (1-F_X(x)) = -f_X(x) [/tex] o.k.

Then substitute in (1)

[tex]m_X=-[uv|^{\infty}_{0}-\int^{\infty}_{0}vdu] [/tex]

[tex]m_X=-[x[1-F_X(x)]|^{\infty}_{0}]+\int^{\infty}_{0}[1-F_X(x)]dx [/tex]

The first term is zero at x = 0. As [tex] x\rightarrow\infty[/tex], [tex]1-F_X(x)[/tex] tends to zero faster than [tex] x [/tex] grows, so [tex] x[1-F_X(x)]\rightarrow0 [/tex]; this is the step that uses the finiteness of the mean.
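
One way to make that step rigorous; a sketch, assuming [tex]m_X<\infty[/tex]:

[tex]x\,[1-F_X(x)]=x\int^{\infty}_{x}f_X(t)\,dt\le\int^{\infty}_{x}t\,f_X(t)\,dt\rightarrow0\quad\text{as }x\rightarrow\infty[/tex]

since the tail of the convergent integral defining [tex]m_X[/tex] must go to zero.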

Therefore

[tex]m_X=\int^{\infty}_{0}[1-F_X(x)]dx [/tex]

QED

Enjoy!
 
  • #13
[tex]
\begin{align*}
E[X] &= E\bigg[\int_0^X 1\,dx\bigg]\\
&= E\bigg[\int_0^\infty 1_{\{X>x\}}\,dx\bigg]\\
&= \int_0^\infty E[1_{\{X>x\}}]\,dx\\
&= \int_0^\infty P(X > x)\,dx\\
&= \int_0^\infty (1 - F(x))\,dx
\end{align*}
[/tex]

By the way, this formula is true no matter what kind of random variable X is, and we do not need anything more than freshman calculus to understand the integral on the right-hand side. (We need neither measure theory nor Stieltjes integrals.) Even when X is discrete, the function 1 - F(x) is still at least piecewise continuous, so the integral makes perfectly good sense, even when understood as a good old-fashioned Riemann integral.
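
As a quick numerical sanity check (my own illustration, not part of the proof), one can compare a Monte Carlo estimate of E(X) with a numerical integral of 1 - F(x), here for an exponential distribution with mean 2 using NumPy and SciPy:

[code]
import numpy as np
from scipy import stats, integrate

# Nonnegative random variable: exponential with mean 2
dist = stats.expon(scale=2.0)

# Monte Carlo estimate of E(X) from a large sample
samples = dist.rvs(size=1_000_000, random_state=np.random.default_rng(0))
print("Monte Carlo E(X):", samples.mean())    # approximately 2.0

# Numerical integral of the survival function 1 - F(x) over [0, infinity)
tail_area, _ = integrate.quad(dist.sf, 0, np.inf)
print("Integral of 1-F:", tail_area)          # 2.0
[/code]

The two numbers agree to Monte Carlo accuracy, as the identity predicts.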
 

FAQ: Statistics: E(X) = Integral(0 to infinity) of (1-F(x))dx

1. What is the significance of the equation E(X) = Integral(0 to infinity) of (1-F(x))dx in statistics?

The equation E(X) = Integral(0 to infinity) of (1-F(x))dx expresses the expected value (mean) of a nonnegative random variable X in terms of its cumulative distribution function F. It is useful in statistics because it computes the mean directly from the tail probabilities P(X > x) = 1 - F(x), without requiring a density.

2. How is the expected value calculated using the equation E(X) = Integral(0 to infinity) of (1-F(x))dx?

The right-hand side is the integral of the survival function 1 - F(x) = P(X > x) over the range of X, from 0 to infinity; that is, the area under the tail-probability curve. No weighting by a density is involved: for a nonnegative random variable, the area under 1 - F(x) is itself the expected value of X.
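
For example (a worked case, not part of the original FAQ), if X is exponential with rate λ, then F(x) = 1 - e^(-λx), and

[tex]E(X)=\int_0^\infty e^{-\lambda x}\,dx=\frac{1}{\lambda}[/tex]

which matches the familiar exponential mean.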

3. Can the equation E(X) = Integral(0 to infinity) of (1-F(x))dx be used for any type of data?

Yes, provided X is nonnegative: the formula holds whether X is continuous, discrete, or mixed, as the thread above shows. For a discrete random variable the integral can equivalently be written as a sum of tail probabilities.

4. How is the equation E(X) = Integral(0 to infinity) of (1-F(x))dx related to the concept of probability density functions?

For a continuous random variable, the probability density function (PDF) f is the derivative of the CDF F. In the equation, 1 - F(x) is the survival function P(X > x), not the PDF; the formula is obtained from the density-based definition E(X) = Integral(0 to infinity) of x f(x)dx either by integration by parts or by swapping the order of integration in a double integral.

5. What other statistical measures can be calculated using the equation E(X) = Integral(0 to infinity) of (1-F(x))dx?

Besides the expected value, tail-integral formulas of the same kind give higher moments: for nonnegative X, E(X^2) = Integral(0 to infinity) of 2x(1-F(x))dx, and from E(X) and E(X^2) one obtains the variance and standard deviation, which measure the spread of the data around the expected value.
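
Combining the two tail integrals (a direct consequence of the formulas above):

[tex]\operatorname{Var}(X)=E(X^2)-[E(X)]^2=\int_0^\infty 2x(1-F(x))\,dx-\left(\int_0^\infty (1-F(x))\,dx\right)^2[/tex]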
