Is the Sum of Exponential Random Variables a Gamma Distribution?

In summary, an exponential random variable follows a continuous probability distribution with a single rate parameter that models the time between events in a Poisson process. A Gamma distribution, with two parameters, is used to model the time to failure of a system, the waiting time until the $n$-th event in a Poisson process, and the sum of exponential random variables. The sum of independent exponential random variables with a common rate is exactly a Gamma random variable, which allows for easier computation and for modeling systems with multiple failure modes. However, the Gamma distribution does not represent the sum of exponential random variables in all cases, in particular if the variables being summed have different rates or are not independent.
  • #1
Chris L T521
Thanks again to those who participated in the second round of our POTW! Now, it's time for the third one! (Bigsmile)

This week's problem was proposed by yours truly.

-----

Problem: Let $X_i$ ($i=1,\ldots,n$) be (continuous) random variables with the exponential distribution $\text{Exp}(\lambda)$, whose probability density function (p.d.f.) is defined by

\[f(x) = \left\{\begin{array}{cl}\lambda e^{-\lambda x} & x\geq 0,\,\lambda >0\\ 0 & x<0\end{array}\right.\]

Show that $\sum_{i=1}^n X_i$ has the same distribution as a random variable with the Gamma distribution $\Gamma(n,\theta)$, where $\theta = 1/\lambda$ and the p.d.f. of the Gamma distribution is given by

\[f(x) = \left\{\begin{array}{cl}\frac{1}{\theta^n\Gamma(n)}x^{n-1}e^{-x/\theta} & x\geq 0,\,\theta>0,\, n\in\mathbb{Z}^+\\ 0 & x<0\end{array}\right.\]

-----

Here are two hints:

If $X$ is a continuous random variable, we define the moment generating function by

\[M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty}e^{tx}f(x)\,dx\]

where $f(x)$ is the p.d.f. of the random variable $X$. Use the fact that if $\{X_i\}_{i=1}^n$ is a collection of independent random variables, then

\[M_{\sum_{i=1}^n X_i}(t) = \prod_{i=1}^n M_{X_i}(t)\]

Recall that $\Gamma(x) = \displaystyle\int_0^{\infty}e^{-t}t^{x-1}\,dt$.
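
For readers who want to see the product hint in action before working the integrals, here is a minimal numerical sketch (not part of the original problem) that estimates both sides of the product identity by Monte Carlo. The rate, evaluation point, and sample size are illustrative assumptions, and the check requires $t<\lambda$ so that the MGF exists.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, t, trials = 2.0, 3, 0.5, 200_000   # illustrative values (assumed); need t < lam

# Independent Exp(lam) draws; numpy parametrizes the exponential by scale = 1/lam
samples = rng.exponential(scale=1.0 / lam, size=(trials, n))

# Left side: E[exp(t * sum_i X_i)], estimated from the sampled sums
lhs = np.exp(t * samples.sum(axis=1)).mean()

# Right side: product of the individual estimated MGFs E[exp(t * X_i)]
rhs = np.prod([np.exp(t * samples[:, i]).mean() for i in range(n)])

print(lhs, rhs, (lam / (lam - t)) ** n)    # all three should agree closely
```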

Remember to read the POTW Procedure and Guidelines (http://www.mathhelpboards.com/showthread.php?772-Problem-of-the-Week-(POTW)-Procedure-and-Guidelines) to find out how to submit your solution (http://www.mathhelpboards.com/forms.php?do=form&fid=2)!

EDIT: I forgot to mention that the $X_i$ are i.i.d. random variables. If they're not, the above result doesn't hold (thanks to girdav for pointing this out).
 
  • #2
Sadly, no one gave a correct solution to this week's question!

Here's the solution to the problem.

By a simple calculation, we can show that if $X_i\sim \text{Exp}(\lambda)$, then

\[\begin{aligned}M_{X_i}(t) &= \int_0^{\infty}e^{tx}\lambda e^{-\lambda x}\,dx\\ &= \lambda\int_0^{\infty}e^{-(\lambda-t)x}\,dx\\ &= \frac{\lambda}{\lambda-t}\int_0^{\infty}e^{-u}\,du\quad \text{(by making the substitution $u=(\lambda-t)x$, valid for $t<\lambda$)}\\ &= \frac{\lambda}{\lambda-t}=\frac{1}{1-(t/\lambda)}.\end{aligned}\]
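
As a quick sanity check on this integral (not part of the original solution), the sketch below evaluates $\int_0^\infty e^{tx}\lambda e^{-\lambda x}\,dx$ numerically for one assumed pair of values with $t<\lambda$ and compares it to $\lambda/(\lambda-t)$.

```python
import numpy as np
from scipy import integrate

lam, t = 2.0, 0.5   # illustrative values (assumed); the MGF exists only for t < lam

# Numerically integrate e^{tx} * lam * e^{-lam x} over [0, infinity)
mgf_num, _ = integrate.quad(lambda x: np.exp(t * x) * lam * np.exp(-lam * x), 0, np.inf)

print(mgf_num, lam / (lam - t))   # both should be about 1.3333
```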

Let $X\sim\Gamma(n,\theta)$. Then

\[\begin{aligned}M_X(t) &= \int_0^{\infty}e^{tx}\,\frac{1}{\theta^n\Gamma(n)}x^{n-1}e^{-x/\theta}\,dx\\ &= \frac{1}{\theta^n\Gamma(n)}\int_0^{\infty}x^{n-1}\exp\left(-\left(\frac{1}{\theta}-t\right)x\right)\,dx\\
&= \frac{1}{\theta^n\Gamma(n)} \int_0^{\infty}\frac{1}{(1/\theta - t)^n}u^{n-1}e^{-u}\,du\quad\text{(using the substitution $u=(1/\theta - t)x$, valid for $t<1/\theta$)}\\ &= \frac{1}{\theta^n\Gamma(n)(1/\theta- t)^n}\Gamma(n)\quad\text{(by definition of the Gamma function)}\\ &=\frac{1}{\theta^n(1/\theta - t)^n}=\frac{1}{(1-t\theta)^n}.\end{aligned}\]
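
The Gamma MGF can be checked the same way; the sketch below (with assumed values of $n$, $\theta$, and a point $t<1/\theta$) integrates the defining expression numerically and compares it to $(1-t\theta)^{-n}$.

```python
import numpy as np
from scipy import integrate
from scipy.special import gamma as gamma_fn

n, theta, t = 3, 0.5, 0.8   # illustrative values (assumed); the MGF exists only for t < 1/theta

# Gamma(n, theta) density from the problem statement
pdf = lambda x: x ** (n - 1) * np.exp(-x / theta) / (theta ** n * gamma_fn(n))

# Numerically integrate e^{tx} * pdf(x) over [0, infinity)
mgf_num, _ = integrate.quad(lambda x: np.exp(t * x) * pdf(x), 0, np.inf)

print(mgf_num, (1 - t * theta) ** (-n))   # both should be about 4.63
```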

Thus,

\[\begin{aligned}M_{\sum_{i=1}^n X_i}(t) &= \prod_{i=1}^n M_{X_i}(t)\\ &= \prod_{i=1}^n \frac{1}{1 - (t/\lambda)}\\ &= \frac{1}{(1-(t/\lambda))^n}\end{aligned}\]

If we make the substitution $\frac{1}{\theta}=\lambda$, then we see that

\[M_{\sum_{i=1}^n X_i}(t) = \frac{1}{(1-t\theta)^n} = M_X(t)\]

where $X\sim\Gamma(n,\theta)$.

Since the moment generating function determines the distribution (where it exists on an interval around $0$), we conclude that $\sum_{i=1}^n X_i\sim\Gamma(n,\theta)$ with $\theta=1/\lambda.\qquad\blacksquare$
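
As an end-to-end check of this conclusion (again, not part of the original solution), the simulation sketch below draws sums of i.i.d. exponentials and compares them to $\Gamma(n,\theta)$ with $\theta=1/\lambda$ via a Kolmogorov-Smirnov test; the rate, shape, and sample size are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, lam, trials = 5, 2.0, 100_000   # illustrative values (assumed)

# Each row is one realization of X_1 + ... + X_n with X_i ~ Exp(lam) i.i.d.
sums = rng.exponential(scale=1.0 / lam, size=(trials, n)).sum(axis=1)

# Compare against Gamma(n, theta) with theta = 1/lam (scipy's `scale` is theta)
ks = stats.kstest(sums, stats.gamma(a=n, scale=1.0 / lam).cdf)

print(ks.statistic, ks.pvalue)     # a large p-value is consistent with the Gamma claim
```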
 

FAQ: Is the Sum of Exponential Random Variables a Gamma Distribution?

Can you explain what an exponential random variable is?

An exponential random variable is a continuous random variable that models the time between events in a Poisson process. Its distribution has a single parameter, lambda (λ), which represents the rate at which the events occur.

What is the definition of a Gamma distribution?

A Gamma distribution is a continuous probability distribution that is used to model the time to failure of a system, the waiting time until the $n$-th event in a Poisson process, and the sum of exponential random variables. It has two parameters, a shape parameter alpha (α) and a scale parameter beta (β); in this thread's notation these are $n$ and $\theta$.

How is the sum of exponential random variables related to a Gamma distribution?

The sum of $n$ independent exponential random variables with the same rate parameter λ follows a Gamma distribution with shape parameter $n$ and rate parameter λ (equivalently, scale parameter θ = 1/λ).
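
Parameterization conventions differ between texts and libraries, so here is a small sketch (with assumed values) showing that the p.d.f. from the problem statement, with shape $n$ and scale $\theta=1/\lambda$, matches a standard shape/scale Gamma implementation.

```python
import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_fn

n, lam, x = 3, 2.0, 1.7            # illustrative values (assumed)
theta = 1.0 / lam                  # scale parameter = 1 / rate

# Density from the thread's formula: x^{n-1} e^{-x/theta} / (theta^n Gamma(n))
pdf_formula = x ** (n - 1) * np.exp(-x / theta) / (theta ** n * gamma_fn(n))

# scipy's gamma is parametrized by shape `a` and `scale`
pdf_scipy = stats.gamma(a=n, scale=theta).pdf(x)

print(pdf_formula, pdf_scipy)      # the two densities should match
```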

What are the benefits of representing the sum of exponential random variables as a Gamma distribution?

Representing the sum of exponential random variables as a Gamma distribution allows for easier computation of probabilities and other statistical measures. It also provides a convenient way to model systems with multiple failure modes or events that occur at different rates.
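
To illustrate the computational convenience mentioned above, here is a hedged sketch (assumed values throughout): a tail probability of the sum computed in closed form via the Gamma survival function, versus a brute-force Monte Carlo estimate.

```python
import numpy as np
from scipy import stats

n, lam, x0 = 4, 1.5, 3.0           # illustrative values (assumed)

# Closed form: P(X_1 + ... + X_n > x0) from the Gamma(n, 1/lam) survival function
p_gamma = stats.gamma(a=n, scale=1.0 / lam).sf(x0)

# Brute force: simulate the exponential sums directly and count exceedances
rng = np.random.default_rng(1)
sums = rng.exponential(scale=1.0 / lam, size=(200_000, n)).sum(axis=1)
p_mc = (sums > x0).mean()

print(p_gamma, p_mc)               # the two estimates should agree closely
```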

Are there any limitations to using a Gamma distribution to represent the sum of exponential random variables?

The Gamma representation is exact only when the exponential random variables being summed are independent and share a common rate. If the variables have different rates or are correlated, their sum is no longer Gamma distributed, and alternative modeling methods may be more appropriate.
