Sums of independent random variables

In summary, the conversation discusses finding the probability distribution of Z, the sum of iid exponential random variables X1 to XN, where the number of summands N is geometric with parameter p and all variables are independent. The MGF of Z is obtained by composing the PGF of N with the MGF of X, and it simplifies to the MGF of an exponential distribution with parameter (1-p)λ, so the moments of Z are not needed to identify the distribution.
  • #1
Jason4
I have:

$Z=X_1+\ldots+X_N$, where:

$X_i \overset{\text{iid}}{\sim} \text{Exponential}(\lambda)$

$N\sim\,\text{Geometric}_1(p)$

For all $i,\,N$ and $X_i$ are independent.

I need to find the probability distribution of $Z$:

$G_N(t)=\frac{(1-p)t}{1-pt}$

$M_X(t)=\frac{\lambda}{\lambda-t}$

$M_Z(z)=G_N(M_X(z))=\frac{(1-p)\left(\frac{\lambda}{\lambda-z}\right)}{1-p\left(\frac{ \lambda}{\lambda-z}\right)}$

$\Rightarrow Z\sim\,\text{Geometric}_1\left(p \frac{ \lambda}{\lambda-z}\right)$

Is that even correct? Should I be looking for $E[Z]$ and $V[Z]$?
 
  • #2
Hello,

Think about it for a moment. You're summing exponential rv's, which are continuous, so the resulting distribution must be continuous as well; it can't be a geometric distribution, which is discrete. Your MGF is indeed correct, but you need to rearrange it to recognize a known MGF.

In particular, multiplying numerator and denominator by $\lambda - z$, we have:
$M_Z(z)=\frac{(1-p)\left(\tfrac{\lambda}{\lambda-z}\right)}{1-p\left(\tfrac{\lambda}{\lambda-z}\right)}=\frac{(1-p)\lambda}{(\lambda-z)-p\lambda}=\frac{(1-p)\lambda}{(1-p)\lambda-z}$
which is the MGF of an exponential distribution with parameter $(1-p)\lambda$.

And when you're asked for the distribution of a rv, there's no need for the moments (unless it's a normal distribution), because moments alone don't characterize the distribution in general. An MGF or a PGF determines the distribution of a rv, so it's fully sufficient.
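
As a quick numerical sanity check, here is a minimal Monte Carlo sketch in Python; the values p = 0.3 and λ = 2 are arbitrary illustrative choices, and it uses the fact that a sum of N iid Exponential(λ) draws is Gamma(N, 1/λ):

```python
import numpy as np

# Minimal Monte Carlo sketch: a Geometric_1(p) number of iid Exponential(lam)
# summands should give an Exponential((1 - p) * lam) total.
# p and lam below are arbitrary illustrative values.
rng = np.random.default_rng(0)
p, lam, n_sim = 0.3, 2.0, 200_000

# P(N = k) = (1 - p) * p**(k - 1) for k >= 1, matching G_N(t) = (1-p)t / (1-pt).
# NumPy's geometric(q) counts trials until the first success, so q = 1 - p.
N = rng.geometric(1 - p, size=n_sim)

# Z = X_1 + ... + X_N: a sum of N iid Exponential(lam) draws is Gamma(N, 1/lam).
Z = rng.gamma(shape=N, scale=1.0 / lam)

rate = (1 - p) * lam  # predicted exponential rate
print(f"mean:     sample {Z.mean():.4f}   predicted {1 / rate:.4f}")
print(f"variance: sample {Z.var():.4f}   predicted {1 / rate**2:.4f}")
```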
 

FAQ: Sums of independent random variables

What are "Sums of Independent Random Variables"?

"Sums of independent random variables" refer to the mathematical concept of adding together multiple random variables that are statistically independent of each other. This is commonly used in statistical analysis and probability theory.

What is the significance of studying "Sums of Independent Random Variables"?

Studying "Sums of Independent Random Variables" allows us to better understand the behavior and properties of complex systems and processes. It also has practical applications in fields such as economics, finance, and engineering.

How are "Sums of Independent Random Variables" calculated?

To find the distribution of a sum of independent random variables, you combine the individual distributions: the density or pmf of the sum is the convolution of the individual densities or pmfs, and equivalently the MGF of the sum is the product of the individual MGFs. The resulting distribution can look quite different from that of the individual summands, as the dice example below shows.
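
A small illustrative Python sketch of the convolution rule, using two fair six-sided dice as the example:

```python
import numpy as np

# The pmf of a sum of independent discrete variables is the convolution of
# the individual pmfs. Example: two fair six-sided dice.
die = np.full(6, 1 / 6)           # pmf of one die on the values 1..6
pmf_sum = np.convolve(die, die)   # pmf of the sum on the values 2..12

for total, prob in enumerate(pmf_sum, start=2):
    print(f"P(sum = {total:2d}) = {prob:.4f}")
```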

What is the difference between "Sums of Independent Random Variables" and "Sums of Dependent Random Variables"?

The difference is whether the summands are statistically independent. "Sums of Independent Random Variables" add variables that are independent, while "Sums of Dependent Random Variables" add variables that influence one another; in the dependent case the convolution rule no longer applies, and, for example, the variance of the sum picks up covariance terms.

What are some real-world examples of "Sums of Independent Random Variables"?

One example is the sum of independent coin flips, where each flip succeeds or fails independently of the others, so the total number of successes follows a binomial distribution (as sketched below). Another example is the sum of stock returns in a model where the returns of different stocks are assumed to be statistically independent.
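
A short Python sketch of the coin-flip example; the values n = 10 flips and 100,000 trials are arbitrary illustrative choices:

```python
import numpy as np
from math import comb

# The sum of n independent fair coin flips (0 = tails, 1 = heads) is
# Binomial(n, 0.5). Compare simulated frequencies to the exact pmf.
# n and trials are arbitrary illustrative values.
rng = np.random.default_rng(1)
n, trials = 10, 100_000

heads = rng.integers(0, 2, size=(trials, n)).sum(axis=1)  # one row per experiment
for k in range(n + 1):
    simulated = np.mean(heads == k)
    exact = comb(n, k) * 0.5**n
    print(f"k = {k:2d}: simulated {simulated:.4f}   exact {exact:.4f}")
```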
