Calculating Expected Value using Probability Mass Function for Random Variable X

In summary, the conversation discusses a homework problem involving a random variable X and its probability mass function. The goal is to show that the expected value of X equals the sum from i = 1 to infinity of P(X ≥ i). The conversation includes various attempts at solving the problem, such as expanding the probability mass function and using the formula for E[X], and ends with a request for help in converting the resulting expression into the desired form.
  • #1
playboy
Hello...

Hmm... I am working on a homework problem, and I am kinda stuck.

The question reads: Suppose that X is a random variable which can take on any non-negative integer (including 0). Write P(X ≥ i) in terms of the probability mass function of X and hence show that

E[X] = ∑_{i=1}^∞ P(X ≥ i)

I tried to solve this problem by just expanding it out i times.

For example, suppose i = 0, 1, 2, 3, 4, ...

So the probability mass function would look like:

P(1) = P{X = 1}
P(2) = P{X = 2}
P(3) = P{X = 3}
P(4) = P{X = 4}

and so on, i times... etc.

but getting E[X] has got me completely lost :bugeye:

I thought perhaps that E[X] = 1·P(X = 1) + 2·P(X = 2) + 3·P(X = 3) + ... but what are the values of the mass function?

Anybody have an idea?
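As a quick sanity check of the identity being asked for, here is a minimal numerical sketch in Python, using a Poisson(2.0) mass function truncated at a large cutoff (the distribution and the cutoff are illustrative choices, not part of the problem):

    # Numerical check of E[X] = sum over i >= 1 of P(X >= i)
    # for a Poisson(2.0) pmf truncated at k < 60 (the tail beyond that is negligible)
    import math

    lam = 2.0
    N = 60

    pmf = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(N)]

    # Direct definition: E[X] = sum of k * P(X = k)
    direct = sum(k * p for k, p in enumerate(pmf))

    # Tail-sum form: sum over i >= 1 of P(X >= i), with P(X >= i) = sum of P(X = k) for k >= i
    tail_sum = sum(sum(pmf[i:]) for i in range(1, N))

    print(direct, tail_sum)  # both come out approximately lam = 2.0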
 
  • #2
Anyways, I got MUCH further than before, but still not quite there.

P(X ≥ i) = 1 - P(X=0) - P(X=1) - ... - P(X=i-1), for i = 1, 2, 3, ...

and so E[X] = 0·P(X=0) + 1·P(X=1) + 2·P(X=2) + ... + (i-1)·P(X=i-1) + ...

which is the sum from i = 1 to infinity of i·P(X = i), with P(X = i) the mass function.

This is where I get stuck... I don't know how to convert/show that "the sum from
i = 1 to infinity of i·P(X = i)" is equal to "the sum from
i = 1 to infinity of P(X ≥ i)"

I know the above is all messy... and I think I've almost got it...
but can anybody help me out?
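For reference, one standard route is to write i·P(X = i) as a sum of i copies of P(X = i) and then swap the order of summation; a sketch in LaTeX (the interchange is justified because all terms are non-negative):

    \begin{align*}
    E[X] &= \sum_{i=1}^{\infty} i \, P(X = i)
          = \sum_{i=1}^{\infty} \sum_{j=1}^{i} P(X = i) \\
         &= \sum_{j=1}^{\infty} \sum_{i=j}^{\infty} P(X = i)
          = \sum_{j=1}^{\infty} P(X \ge j).
    \end{align*}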
 

FAQ: Calculating Expected Value using Probability Mass Function for Random Variable X

What is a probability mass function (PMF)?

A probability mass function is a mathematical function that describes the probability distribution of a discrete random variable. It maps each possible outcome of the random variable to its corresponding probability.
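As a concrete illustration (the numbers below are made up for the example), a PMF for a variable taking the values 0 through 3 can be written as a mapping from outcomes to probabilities that are non-negative and sum to 1:

    # Hypothetical PMF for a discrete random variable X with values 0..3
    pmf = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2}

    # Sanity checks: probabilities are non-negative and sum to 1
    assert all(p >= 0 for p in pmf.values())
    assert abs(sum(pmf.values()) - 1.0) < 1e-12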

How is the expected value calculated using a PMF?

The expected value, or mean, of a random variable X is calculated by summing the product of each possible outcome of X and its corresponding probability, as given by the PMF. In mathematical notation, it is written as E(X) = ∑ x * P(X=x).
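A short sketch of that formula in Python, reusing a made-up PMF like the one above:

    # E(X) = sum over x of x * P(X = x)
    pmf = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2}
    expected_value = sum(x * p for x, p in pmf.items())
    print(expected_value)  # 0*0.1 + 1*0.4 + 2*0.3 + 3*0.2 = 1.6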

What information can be obtained from the expected value?

The expected value provides a measure of the central tendency of a random variable. It represents the average value that would be obtained if the random variable were observed repeatedly over a large number of trials. It can also be used to make predictions about the outcomes of future trials.
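A quick simulation sketch of that long-run interpretation, using the same made-up PMF as above (random.choices draws samples according to the given weights):

    import random

    # Hypothetical PMF with expected value 0*0.1 + 1*0.4 + 2*0.3 + 3*0.2 = 1.6
    pmf = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2}
    values, probs = zip(*pmf.items())

    # Draw many independent observations of X and compare the sample mean to E(X)
    samples = random.choices(values, weights=probs, k=100_000)
    print(sum(samples) / len(samples))  # close to 1.6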

How does the expected value change with different distributions of the PMF?

The expected value depends on the shape of the PMF. For example, a symmetric PMF has an expected value at its center of symmetry, while a skewed PMF has an expected value pulled towards its longer tail. The expected value is also influenced by the range of possible outcomes and the probability assigned to each.

Can the PMF be used to calculate the expected value for continuous random variables?

No, the PMF is only applicable to discrete random variables. For continuous random variables, the expected value is calculated using the probability density function (PDF) instead. The PDF is the continuous equivalent of the PMF and describes the probability distribution of a continuous random variable.
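For comparison, a minimal sketch of the continuous case, numerically integrating x·f(x) for an exponential density with rate 1.5 (an illustrative choice; the exact mean is 1/1.5):

    import math
    from scipy.integrate import quad

    RATE = 1.5

    def pdf(x):
        # Exponential density with rate 1.5 on [0, infinity)
        return RATE * math.exp(-RATE * x)

    # E(X) = integral of x * f(x) dx over the support of X
    mean, _ = quad(lambda x: x * pdf(x), 0, math.inf)
    print(mean)  # approximately 1 / 1.5 = 0.6667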
