Negative binomial transformation and mgf

In summary, the negative binomial distribution is a discrete probability distribution used to model overdispersed count data, while the transformation discussed here rescales such a variable as Y = 2pX so that, as p tends to 0, its distribution converges to a chi-squared distribution. The moment generating function (MGF) characterizes a distribution through its moments and is the main tool used in the derivation below. Some common applications of these concepts include modeling population data, predicting customer arrivals, and analyzing equipment failures.
  • #1
Mogarrr
Imo, this problem is crazy hard.

Homework Statement



Let X have the negative binomial distribution with pmf:

[itex] f_X(x) = \binom{r+x-1}{x}p^{r}(1-p)^{x}[/itex], x = 0, 1, 2, ...,

where [itex]0<p<1[/itex] and r is a positive integer.
(a) Calculate the mgf (moment generating function) of X.

(b) Define a new random variable by Y=2pX. Show that as [itex]p \downarrow 0[/itex], the mgf of Y converges to that of a chi squared random variable with 2r degrees of freedom by showing that

[itex] \lim_{p \to 0} M_Y(t) = (\frac 1{1-2t})^{r}, |t|< \frac 12[/itex].


Homework Equations



The moment generating function of a discrete random variable X, denoted [itex]M_X(t)[/itex], is equal to

[itex] \sum_x e^{tx}f_X(x) [/itex]

The Attempt at a Solution


So I think I've figured out part (a), but I'm stuck on part (b).

For part (a) the mgf is

[itex]\sum_{x=0}^{\infty} e^{tx} \binom{r+x-1}{x}p^{r}(1-p)^{x} = \sum_{x=0}^{\infty} \binom{r+x-1}{x}p^{r}(e^{t}(1-p))^{x} [/itex].

I would think that...

[itex]\sum_{x=0}^{\infty} \binom{r+x-1}{x}(1-e^{t}(1-p))^{r}(e^{t}(1-p))^{x} = 1[/itex],

since this is another negative binomial pmf (probability mass function), whose sum must be 1.
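The claim that this reindexed series is itself a negative binomial pmf, with "success probability" q = 1 - e^t(1-p) (a valid probability when e^t(1-p) < 1), can be checked numerically. A minimal Python sketch, with an illustrative function name:

```python
import math

def nb_series_sum(r, q, terms=500):
    # Truncated sum of the negative binomial pmf with success probability q;
    # the geometric factor (1-q)^x makes the tail negligible for 500 terms here.
    return sum(math.comb(r + x - 1, x) * q**r * (1 - q)**x for x in range(terms))

t, p, r = 0.2, 0.4, 3
q = 1 - math.exp(t) * (1 - p)   # ~0.267 for these values, so a valid probability
print(nb_series_sum(r, q))      # should be very close to 1
```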

So now I do my sneaky trick...

[itex] \frac {(1-(1-p)e^{t})^{r}}{(1-(1-p)e^{t})^{r}} \frac {p^{r}}{p^{r}} \sum_{x=0}^{\infty} \binom{r+x-1}{x}p^{r}(e^{t}(1-p))^{x} = \frac {p^{r}}{(1-(1-p)e^{t})^{r}} \cdot \sum_{x=0}^{\infty} \binom{r+x-1}{x}(1-e^{t}(1-p))^{r}(e^{t}(1-p))^{x} =
\frac {p^{r}}{(1-(1-p)e^{t})^{r}}[/itex].

So if this is correct, that takes care of part (a).
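One way to sanity-check the closed form is to compare it with a truncated version of the defining sum. A minimal Python sketch (function names like `mgf_direct` are illustrative, not from any library):

```python
import math

def nb_pmf(x, r, p):
    # P(X = x) for the negative binomial: x failures before the r-th success
    return math.comb(r + x - 1, x) * p**r * (1 - p)**x

def mgf_direct(t, r, p, terms=500):
    # Truncated sum of e^{tx} f_X(x); converges when e^t (1-p) < 1
    return sum(math.exp(t * x) * nb_pmf(x, r, p) for x in range(terms))

def mgf_closed(t, r, p):
    # Closed form derived above: (p / (1 - (1-p) e^t))^r
    return (p / (1 - (1 - p) * math.exp(t)))**r

r, p, t = 3, 0.4, 0.2   # e^t (1-p) ~ 0.733 < 1, so the series converges
print(mgf_direct(t, r, p), mgf_closed(t, r, p))
```

The two numbers should agree to many decimal places, which supports the derivation above.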

For part (b) I have [itex] Y = 2pX [/itex], so [itex] y = 2px \iff x = \frac {y}{2p} [/itex]. I think this is a bijection, so I have

[itex] f_Y(y) = \sum_{x \in g^{-1}(y)} f_X(x) = f_X(\frac {y}{2p}) [/itex].

I'm not too sure about this transformation, but continuing on, I have mgf of Y is

[itex] \sum_y e^{ty}f_Y(y) = \sum_y e^{yt} \binom {r + \frac {y}{2p} - 1}{\frac {y}{2p}}p^{r}(1-p)^{\frac {y}{2p}} [/itex]...

I could continue on, but the trick I used earlier doesn't seem to work on this summation. Please help.
 
  • #2
Mogarrr said:
I could continue on, but the trick I used earlier doesn't seem to work on this summation. Please help.

The mgf of ##Y = 2pX## is ##M_Y(t) = E e^{tY} = E e^{2pt X} = M_X(2pt)##.
 
  • #3
Ray Vickson said:
The mgf of ##Y = 2pX## is ##M_Y(t) = E e^{tY} = E e^{2pt X} = M_X(2pt)##.

That seems simple.

So [itex]M_X(2pt) = (\frac {p}{1-(1-p)e^{2pt}})^{r}[/itex], right?
 
  • #4
Ok, then taking the limit as p approaches 0, and ignoring the exponent r, for now, I have...

[itex] \lim_{p \to 0} \frac {p}{1-(1-p)e^{2pt}} = \lim_{p \to 0} \frac {p}{1-e^{2pt}+pe^{2pt}} = \frac 00 [/itex].

Applying L'Hospital's rule (differentiating numerator and denominator with respect to p), I have...

[itex] \lim_{p \to 0} \frac 1{-2te^{2pt}+e^{2pt}+2pte^{2pt}} = \frac 1{1-2t}[/itex].

Putting everything together I have...

[itex] \lim_{p \to 0} M_Y(t) = (\frac 1{1-2t})^{r} [/itex].

Is it legit to ignore the exponent while taking the limit?
 
  • #5
Yes, why not? The function f(w) = w^r is continuous in w for fixed integer r > 0, so you can take the limit of the base first and then apply the power.
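As a numerical sanity check of the limit (a Python sketch, with `mgf_Y` as an illustrative name): for small p, M_Y(t) = M_X(2pt) should approach the chi-squared(2r) mgf (1/(1-2t))^r.

```python
import math

def mgf_Y(t, r, p):
    # M_Y(t) = M_X(2pt) = (p / (1 - (1-p) e^{2pt}))^r
    return (p / (1 - (1 - p) * math.exp(2 * p * t)))**r

r, t = 4, 0.3                     # |t| < 1/2 as required
target = (1 / (1 - 2 * t))**r     # chi-squared mgf with 2r degrees of freedom
for p in [0.1, 0.01, 0.001]:
    print(p, mgf_Y(t, r, p))      # approaches the target as p shrinks
print("limit:", target)
```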
 

FAQ: Negative binomial transformation and mgf

What is the negative binomial distribution?

The negative binomial distribution is a discrete probability distribution that describes the number of failures in a sequence of independent and identically distributed Bernoulli trials (i.e. trials with a binary outcome of success or failure) before a specified number of successes, r, is reached. It is often used to model overdispersed count data, where the variance is larger than the mean.
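Under this parameterization, a short simulation, sketched below in Python with illustrative names, reproduces the pmf by counting failures before the r-th success:

```python
import math
import random

def sample_nb(r, p, rng):
    # Count failures before the r-th success in Bernoulli(p) trials
    failures = successes = 0
    while successes < r:
        if rng.random() < p:
            successes += 1
        else:
            failures += 1
    return failures

rng = random.Random(0)
r, p, n = 3, 0.4, 100_000
draws = [sample_nb(r, p, rng) for _ in range(n)]
freq2 = sum(d == 2 for d in draws) / n                # empirical P(X = 2)
pmf2 = math.comb(r + 2 - 1, 2) * p**r * (1 - p)**2    # f_X(2) from the pmf
print(freq2, pmf2)
```

The empirical frequency should match the pmf value to within Monte Carlo error.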

What is the negative binomial transformation?

The transformation considered in this thread is the rescaling Y = 2pX of a negative binomial random variable X. Its interest is asymptotic: as the success probability p tends to 0, the distribution of Y converges to a chi-squared distribution with 2r degrees of freedom, which is shown by comparing moment generating functions.

What is the moment generating function (MGF) of the negative binomial distribution?

The moment generating function (MGF) of the negative binomial distribution is a mathematical function that characterizes the distribution of a random variable through its moments (i.e. mean, variance, skewness, etc.). For the parameterization above, where X counts failures before the r-th success, the MGF is M(t) = (p / (1 - (1-p)e^t))^r, valid for e^t(1-p) < 1, where p is the probability of success and r is the required number of successes.

How is the negative binomial transformation related to the moment generating function (MGF)?

The transformation and the moment generating function (MGF) are closely related. For a linear rescaling such as Y = cX, the MGF of the transformed variable satisfies M_Y(t) = M_X(ct); that is, the argument of the original MGF is scaled by the constant used in the transformation. This relationship allows the MGF of the transformed variable to be computed directly, which can then be used to derive statistical properties such as the mean and variance.

What are some common applications of the negative binomial distribution and transformation?

The negative binomial distribution and transformation have many applications in various fields, including biology, economics, and engineering. Some common applications include modeling the number of organisms in a population, predicting the number of customer arrivals at a service desk, and analyzing the number of equipment failures in a manufacturing process. The transformation is also used in statistical tests to compare groups of data with overdispersed count data, such as in a negative binomial regression analysis.
