Moment Generating Function (proof of definition)

In summary: The expected value of any function u(x) is defined to be [tex]E(u(x))= \int_{-\infty}^\infty u(x)f(x)dx[/tex]
  • #1
Oxymoron

Homework Statement


Prove that, for a random variable [tex]X[/tex] with continuous probability density function [tex]f_X(x)[/tex], the Moment Generating Function, defined as

[tex]
M_X(t) := E[e^{tX}]
[/tex]

is

[tex]
M_X(t) = \int_{-\infty}^{\infty}e^{tx}f_X(x)dx
[/tex]

Homework Equations



Above and

[tex]
E[X] = \int_{-\infty}^{\infty}xf_X(x)dx
[/tex]
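To see both integrals in action, here is a small numeric sketch. The exponential distribution with rate 1 is an assumed example (not part of the problem); its mean is 1 and its MGF has the closed form 1/(1 − t) for t < 1.

```python
import math

# Assumed example: exponential distribution with rate 1,
# whose density is f(x) = e^{-x} for x >= 0.
def f(x):
    return math.exp(-x) if x >= 0 else 0.0

def integrate(g, a, b, n=100_000):
    # Simple trapezoidal rule over [a, b].
    h = (b - a) / n
    total = 0.5 * (g(a) + g(b))
    for i in range(1, n):
        total += g(a + i * h)
    return total * h

# E[X] = integral of x f(x); for exponential(1) this is 1.
mean = integrate(lambda x: x * f(x), 0.0, 50.0)

# M_X(t) = integral of e^{tx} f(x); for exponential(1) and t = 0.5
# the closed form is 1 / (1 - t) = 2.
t = 0.5
mgf = integrate(lambda x: math.exp(t * x) * f(x), 0.0, 50.0)

print(round(mean, 4))  # close to 1.0
print(round(mgf, 4))   # close to 2.0
```

The integration is truncated at x = 50 because the density's tail beyond that is negligible.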

The Attempt at a Solution



This expression is given in so many textbooks, and the ones that I have read all skip over this derivation. I want to be able to prove this formula to myself.

Proof:
Write the exponential function as a Maclaurin series:

[tex]
M_X(t) = E[e^{tX}]
[/tex]

[tex]
= E[1+tX+\frac{t^2}{2!}X^2+\frac{t^3}{3!}X^3+...]
[/tex]

Since [tex]E[1] = 1[/tex] and [tex]E[t^n/n!]=t^n/n![/tex], because these are constants and the expectation of a constant is itself, you get:

[tex]
= 1+tE[X]+\frac{t^2}{2!}E[X^2]+\frac{t^3}{3!}E[X^3]+...
[/tex]

(also using the linearity of E). Now, writing the series as a sum:

[tex]
=\sum_{n=0}^{\infty}\frac{t^n}{n!}E[X^n]
[/tex]

And extracting the exponential:

[tex]
=e^t\sum_{n=0}^{\infty}E[X^n]
[/tex]

Now I am stuck! I know that I am meant to use

[tex]
E[X] = \int_{-\infty}^{\infty}xf_X(x)dx
[/tex]

but I have [tex]E[X^n][/tex] and I also have [tex]e^t[/tex] and not [tex]e^{tx}[/tex].
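As a sanity check on the series form, here is a quick numeric sketch using an assumed exponential(1) example, for which [tex]E[X^n] = n![/tex] in closed form. Since [tex]E[X^n][/tex] grows with n, each term of the sum must keep its own moment (factoring out [tex]e^t[/tex] doesn't look right):

```python
import math

# Assumed example: exponential(1) distribution, for which E[X^n] = n!
# and the closed-form MGF is M(t) = 1 / (1 - t) for t < 1.
t = 0.5
target = 1.0 / (1.0 - t)  # closed-form MGF at t = 0.5, equals 2

partial = 0.0
for n in range(60):
    moment = math.factorial(n)                    # E[X^n] for exponential(1)
    partial += t**n / math.factorial(n) * moment  # n-th term of the series

print(round(partial, 6))  # converges to 2.0
```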
 
  • #2
There isn't really much to prove. For any continuous probability distribution with density function f(x), the expected value of any function u(x) is defined to be
[tex]E(u(x))= \int_{-\infty}^\infty u(x)f(x)dx[/tex]

Replace [itex]u(x)[/itex] with [itex]e^{tx}[/itex] and you have it. It is true that the whole point of the "moment generating function" is that the coefficients of the powers of t in its power series expansion are (up to factors of n!) the "moments" of the probability distribution, but that doesn't seem to me to be relevant to this question. I see no reason to write its Taylor series.
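For instance (an assumed illustration, not part of the original question), substituting u(x) = e^{tx} into that definition for a uniform density on [0, 1] can be checked numerically against the known closed form (e^t − 1)/t:

```python
import math

# Assumed example: uniform density on [0, 1], whose MGF
# has the closed form (e^t - 1) / t.
def f(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def expectation(u, a=0.0, b=1.0, n=100_000):
    # Trapezoidal approximation of the integral of u(x) f(x) over the support.
    h = (b - a) / n
    total = 0.5 * (u(a) * f(a) + u(b) * f(b))
    for i in range(1, n):
        x = a + i * h
        total += u(x) * f(x)
    return total * h

t = 1.0
mgf = expectation(lambda x: math.exp(t * x))  # E[e^{tX}] with u(x) = e^{tx}
print(round(mgf, 5))  # close to (e - 1) / 1
```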
 
  • #3
Good, okay that makes sense.

Then I suppose all I had to do was prove

[tex]
E(u(x))= \int_{-\infty}^\infty u(x)f(x)dx
[/tex]

and then substitute [tex]u(x)[/tex] with [tex]e^{tx}[/tex] as you said and I'm done.

But once again there is nothing to prove because it is a definition.
 

FAQ: Moment Generating Function (proof of definition)

1. What is a moment generating function (MGF)?

A moment generating function is a mathematical function that is used to uniquely determine the probability distribution of a random variable. It is defined as the expected value of e^(tX), where t is a real number and X is the random variable.

2. What is the proof of the definition of a moment generating function?

The integral formula follows directly from the definition of expectation: substituting u(x) = e^(tx) into E[u(X)] = ∫ u(x)f(x)dx gives the integral form of the MGF. Expanding e^(tX) as a Taylor series and taking expectations term by term then shows that the series coefficients involve the moments of the random variable, which can be used to characterize the probability distribution.

3. Why is the moment generating function useful?

The moment generating function is useful because it allows for the characterization of a probability distribution in terms of its moments, which can provide valuable information about the distribution such as its mean, variance, and higher order moments. It also allows for the calculation of probabilities and moments for more complex distributions.
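As a sketch of this (using an assumed exponential(1) example, whose MGF is 1/(1 − t)), the mean and variance can be recovered by differentiating the MGF at t = 0:

```python
import math

# Assumed example: exponential(1), with M(t) = 1 / (1 - t),
# so E[X] = M'(0) = 1 and E[X^2] = M''(0) = 2.
def M(t):
    return 1.0 / (1.0 - t)

h = 1e-4
# Central finite differences approximate M'(0) and M''(0).
first = (M(h) - M(-h)) / (2 * h)             # approximates E[X] = 1
second = (M(h) - 2 * M(0.0) + M(-h)) / h**2  # approximates E[X^2] = 2
variance = second - first**2                 # approximates Var(X) = 1

print(round(first, 3), round(second, 3), round(variance, 3))
```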

4. What is the relationship between the moment generating function and the characteristic function?

The moment generating function and the characteristic function are both used to characterize probability distributions. The moment generating function is the expected value of e^(tX), while the characteristic function is the expected value of e^(itX), where i is the imaginary unit; it is the Fourier transform of the density. Where the moment generating function exists, the two are related by φ_X(t) = M_X(it), and unlike the MGF, the characteristic function exists for every distribution.

5. How is the moment generating function used in statistical inference?

The moment generating function is used in statistical inference to estimate parameters of a probability distribution and to prove distributional results. Because the MGF of a sum of independent random variables is the product of their individual MGFs, it is a standard tool for identifying the distribution of sums, which is useful in derivations, hypothesis testing, and modeling.
