Finding probability using moment-generating functions

In summary, the conversation is about a question from Schaum's Outline of Probability, Random Variables, and Random Processes, specifically question 4.60 part (b). The question involves finding P(X=0) and P(X=1) using the moment-generating function and the definition for discrete random variables. After some discussion, it is determined that the sample space for the random variable X is just 0 and 1, making it a Bernoulli trial.
  • #1
brogrammer
I'm working through Schaum's Outline of Probability, Random Variables, and Random Processes, and am stuck on a question about moment-generating functions. If anyone has the 2nd edition, it is question 4.60, part (b).

The question gives the following initial information: [itex]E[X^k]=0.8[/itex] for k = 1, 2, ... and the moment generating function is: [itex]0.2+0.8\sum_{k=0}^{\infty}\frac{t^k}{k!}=0.2+0.8e^t[/itex].

The question is asking to find [itex]P(X=0)[/itex] and [itex]P(X=1)[/itex]. I'm trying to do the first part and solve [itex]P(X=0)[/itex]. By the definition of a moment-generating function for discrete random variables, I know I can use the following equation:

[itex]\sum_{i}e^{tx_i}p_X(x_i)=0.2+0.8e^t[/itex]

For [itex]P(X=0)[/itex], the above equation becomes: [itex]e^{t(0)}p_X(0)=0.2+0.8e^t[/itex]. The LHS simplifies to [itex]p_X(0)[/itex] which means [itex]P(X=0)=0.2+0.8e^t[/itex]. But I know that is not the right answer. The right answer is [itex]P(X=0)=0.2[/itex].

Can someone please show me where I'm going wrong? Thanks in advance for your help.
 
  • #2
Hey brogrammer and welcome to the forums.

Hey brogrammer and welcome to the forums.

If you expand the sum over all values of x_i (not just x_i = 0) you will get e^(0t)P(X=0) + e^(1t)P(X=1) = 0.2 + 0.8e^t, i.e. P(X=0) + e^t*P(X=1) = 0.2 + 0.8e^t.

Now can you equate the coefficients of like terms?
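The coefficient-matching step can also be checked numerically (a sketch, not from the original thread): since M(t) = p0 + p1*e^t, evaluating the MGF at any two distinct values of t gives two linear equations in p0 = P(X=0) and p1 = P(X=1).

```python
# Sketch: recover P(X=0) and P(X=1) from the MGF in the problem,
# M(t) = 0.2 + 0.8*e^t, by solving a 2x2 linear system.
import numpy as np

def mgf(t):
    # MGF given in the problem statement
    return 0.2 + 0.8 * np.exp(t)

# M(t) = p0*e^(0*t) + p1*e^(1*t); pick t = 0 and t = 1 and solve for (p0, p1)
t0, t1 = 0.0, 1.0
A = np.array([[np.exp(0 * t0), np.exp(1 * t0)],
              [np.exp(0 * t1), np.exp(1 * t1)]])
b = np.array([mgf(t0), mgf(t1)])
p0, p1 = np.linalg.solve(A, b)
print(p0, p1)  # P(X=0) = 0.2, P(X=1) = 0.8
```

This is the same logic as equating coefficients by hand: the constant term must be P(X=0) and the coefficient of e^t must be P(X=1).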
 
  • #3
Chiro -

Thanks for the reply. That makes sense. My thick brain didn't realize that the question wants me to see that the sample space for the r.v. X is just 0 and 1, i.e. a Bernoulli trial. Now it makes sense.

Thanks man.
 

FAQ: Finding probability using moment-generating functions

1. What is a moment-generating function (MGF)?

A moment-generating function of a random variable X is defined as M(t) = E[e^(tX)]. Its name comes from the fact that its derivatives at t = 0 generate the moments of X: the k-th derivative satisfies M^(k)(0) = E[X^k]. It is a useful tool in probability theory because, when it exists in a neighborhood of t = 0, it determines the distribution of X and makes moment calculations routine.

2. How do you find the MGF of a random variable?

The MGF of a random variable X is found by taking the expected value of e^(tX), i.e. M(t) = E[e^(tX)]. For a discrete variable this is the sum of e^(t*x) * P(X=x) over the possible values x; for a continuous variable it is the corresponding integral. Expanding e^(tX) as a power series shows that the coefficient of t^k/k! in M(t) is the moment E[X^k].
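As an illustrative sketch (not part of the original FAQ), the definition M(t) = E[e^(tX)] can be estimated by simulation for the Bernoulli(0.8) variable from the thread and compared against its closed form 0.2 + 0.8*e^t:

```python
# Sketch: Monte Carlo estimate of the MGF of a Bernoulli(0.8) variable,
# compared with the exact MGF from the thread, M(t) = 0.2 + 0.8*e^t.
import numpy as np

rng = np.random.default_rng(0)
x = (rng.random(200_000) < 0.8).astype(float)  # Bernoulli(0.8) samples

for t in (0.0, 0.5, 1.0):
    estimate = np.mean(np.exp(t * x))  # sample mean of e^(tX)
    exact = 0.2 + 0.8 * np.exp(t)
    print(f"t={t}: estimate={estimate:.4f}, exact={exact:.4f}")
```

The estimates agree with the closed form to a few decimal places, which is exactly what E[e^(tX)] means operationally.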

3. Can the MGF be used to find probabilities for any type of random variable?

The MGF can be used for both discrete and continuous random variables. However, the MGF does not exist for certain distributions, such as those with heavy tails (the Cauchy distribution is a standard example), so it cannot be applied universally.

4. How does the MGF help us find probabilities?

The MGF allows us to find the moments of a random variable, which in turn can be used to find the mean, variance, and other important parameters of the probability distribution. These parameters can then be used to calculate probabilities for different events or outcomes related to the random variable.
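The moment-extraction step can be sketched symbolically (an illustration, not from the original FAQ): differentiating the thread's MGF at t = 0 yields the mean and second moment, from which the variance follows.

```python
# Sketch: moments from MGF derivatives at t = 0, using the MGF
# from the thread, M(t) = 0.2 + 0.8*e^t (a Bernoulli(0.8) variable).
import sympy as sp

t = sp.symbols('t')
M = sp.Rational(1, 5) + sp.Rational(4, 5) * sp.exp(t)

mean = sp.diff(M, t).subs(t, 0)       # E[X]   = M'(0)
second = sp.diff(M, t, 2).subs(t, 0)  # E[X^2] = M''(0)
variance = second - mean**2           # Var(X) = E[X^2] - E[X]^2
print(mean, second, variance)  # 4/5 4/5 4/25
```

This reproduces the Bernoulli(p) formulas: mean p = 0.8 and variance p(1-p) = 0.16.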

5. Are there any limitations to using MGFs to find probabilities?

While MGFs are a powerful tool for finding probabilities and moments, they are not always the most convenient method. In some cases it is easier to use other transforms, such as the characteristic function, which exists for every random variable. Additionally, MGFs may not exist for all types of random variables, as mentioned earlier.
