Finding E[X^2] from a given random variable with distinct probabilities

In summary, to find the variance of a random variable, first compute the expected value of X^2, which is the sum of each squared value of X multiplied by its corresponding probability. Then compute the square of the expected value of X, where E(X) is the sum of each value of X multiplied by its corresponding probability. The variance is the expected value of X^2 minus the square of the expected value of X.
  • #1
blah900

Homework Statement


X is a random variable.
P(X=a) = p1
P(X=b) = p2
P(X=c) = p3
P(X=d) = p4

Find the variance.

Homework Equations


Var(X) = E(X^2) - E(X)^2

The Attempt at a Solution



Okay so for the E(X^2), I am currently very confused.
My professor gave us this formula where
[tex]E(X^2) = \sum^{n}_{i=1}E(X_i^2) + \sum^{n}_{i{\neq}j}E(X_iX_j) [/tex]

The way I know it would just be the first portion and thus would become:
[tex]a^2p_1+b^2p_2+c^2p_3+d^2p_4[/tex]
However, doing it the professor's way, it would become
[tex]a^2p_1+b^2p_2+c^2p_3+d^2p_4 + ab(2p_1p_2) + ac(2p_1p_3) + ad(2p_1p_4) + bc(2p_2p_3) + bd(2p_2p_4) + cd(2p_3p_4)[/tex]

Which one is right? And when do I know when to use which case?
 
  • #2
My professor gave us this formula where
[tex]E(X^2) = \sum^{n}_{i=1}X_i^2 + \sum^{n}_{i{\neq}j}X_iX_j [/tex]

Are you sure that you wrote that formula correctly?

I think two different concepts have been discussed. There is one kind of formula for the variance of a sample. We think of samples being created when a random variable takes on specific values and produces a set of data. When you compute the variance (or mean, or mode, or any other statistic) from sample values, you don't multiply the values by any probabilities. The number you compute for the sample variance is often used to estimate the variance of the random variable, but it is not always equal to it. It's just a matter of chance what specific numbers are in the data and what their variance is. To distinguish between the two types of variances, one is called a "sample variance" and the other can be called "the variance of the random variable" or "the population variance".

What you call "the professor's formula" looks like it computes something from a sample. I don't think the formula as you wrote it is correct.




The way I know it would just be the first portion and thus would become:
[tex]a^2p_1+b^2p_2+c^2p_3+d^2p_4[/tex]

We are dealing with a random variable instead of a sample, so you are correct to multiply values times a probability. You correctly computed the expected value of [itex] X^2 [/itex].

To compute the variance you must also compute the second term in [itex] Var(X) = E(X^2) - E(X)^2 [/itex] (which is a correct formula). The second term is the square of the expected value of [itex] X [/itex].
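As a quick numerical check of the two terms (the values and probabilities below are made up, not from the problem):

```python
# Sketch of Var(X) = E(X^2) - E(X)^2 for a four-point distribution.
# The values a..d and probabilities p1..p4 here are hypothetical;
# any probabilities summing to 1 work the same way.
values = [1.0, 2.0, 3.0, 4.0]   # a, b, c, d
probs  = [0.1, 0.2, 0.3, 0.4]   # p1, p2, p3, p4 (sum to 1)

e_x2 = sum(x**2 * p for x, p in zip(values, probs))  # E(X^2) = a^2*p1 + ...
e_x  = sum(x * p for x, p in zip(values, probs))     # E(X)   = a*p1 + ...
var  = e_x2 - e_x**2
print(var)  # approximately 1.0 for these numbers
```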



However, doing it the professor's way, it would become
[tex]a^2p_1+b^2p_2+c^2p_3+d^2p_4 + ab(2p_1p_2) + ac(2p_1p_3) + ad(2p_1p_4) + bc(2p_2p_3) + bd(2p_2p_4) + cd(2p_3p_4)[/tex]

It wouldn't be that way because what you called the professor's way didn't say to multiply the probabilities times the X values.
 
  • #3
Stephen Tashi said:
Are you sure that you wrote that formula correctly?

It wouldn't be that way because what you called the professor's way didn't say to multiply the probabilities times the X values.

Oops my bad. It was supposed to be E of the following values. But, yeah, I guess the way I know would work right?
 
  • #4
Yes, it would work. How did you compute [itex] E(X)^2 [/itex]?
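(For reference, [itex] E(X)^2 [/itex] means: take the expectation first, then square the result. A minimal sketch with hypothetical numbers:)

```python
# E(X)^2: compute E(X) = a*p1 + b*p2 + c*p3 + d*p4 first, then square it.
# Values and probabilities here are hypothetical.
values = [1.0, 2.0, 3.0, 4.0]
probs  = [0.25, 0.25, 0.25, 0.25]

e_x    = sum(x * p for x, p in zip(values, probs))  # E(X) = 2.5
e_x_sq = e_x ** 2                                   # E(X)^2 = 6.25
```

Note this is different from E(X^2), which squares the values *before* averaging.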
 

FAQ: Finding E[X^2] from a given random variable with distinct probabilities

How do you find the expected value of X^2 from a given random variable?

The expected value of X^2 can be found by multiplying each possible value of X^2 by its corresponding probability and then summing all of these values together. This can be represented by the formula E[X^2] = Σ(x^2 * P(X=x)).
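The formula translates directly into code; here is a minimal sketch over a hypothetical distribution:

```python
# E[X^2] = sum over the support of x^2 * P(X = x).
# The distribution below is hypothetical.
dist = {0: 0.5, 1: 0.25, 2: 0.25}              # value -> probability
e_x2 = sum(x**2 * p for x, p in dist.items())  # 0*0.5 + 1*0.25 + 4*0.25 = 1.25
```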

What is the purpose of finding the expected value of X^2?

The expected value of X^2 is the average of the squared values of X that we would expect to see over many repetitions of the experiment. It is an important tool in probability and statistics, as it helps us understand the spread and behavior of a random variable.

Can the expected value of X^2 be negative?

No. Since X^2 is never negative, every term x^2 * P(X=x) in the sum is nonnegative, so E[X^2] is always greater than or equal to zero, regardless of whether X itself takes negative values (squaring removes the sign). It equals zero only when X is 0 with probability 1.

How is finding the expected value of X^2 different from finding the expected value of X?

The expected value of X is the sum of all possible values of X multiplied by their corresponding probabilities, while the expected value of X^2 is the same sum taken over the squared values. In other words, E[X^2] applies the same averaging procedure to x^2 instead of x; it is the extra ingredient, beyond E[X], needed to compute the variance.

What is the relationship between the variance and the expected value of X^2?

The variance (Var(X)) is equal to the expected value of X^2 minus the squared expected value of X (E[X]^2). This relationship is represented by the formula Var(X) = E[X^2] - E[X]^2. Essentially, the variance is a measure of how much the values of X vary from its expected value, which can be calculated using the expected value of X^2.
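As a sanity check, Var(X) = E[X^2] - E[X]^2 agrees with the defining formula E[(X - E[X])^2]; a sketch with hypothetical numbers:

```python
# Verify the two equivalent variance formulas on a hypothetical
# three-point distribution.
dist = {-1: 0.25, 0: 0.5, 2: 0.25}  # value -> probability

mu    = sum(x * p for x, p in dist.items())            # E[X]
e_x2  = sum(x**2 * p for x, p in dist.items())         # E[X^2]
var_a = e_x2 - mu**2                                   # E[X^2] - E[X]^2
var_b = sum((x - mu)**2 * p for x, p in dist.items())  # E[(X - mu)^2]
# var_a and var_b are equal, as the identity guarantees
```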
