Simple Expectation Value Question

In summary: This last part I don't understand: P(f(xi)) = P(xi). I don't see how this can be true for anything other than f(xi) = xi or P(xi) = (a constant). Can someone please justify this to me?
  • #1
Harrisonized
I was told that given a probability distribution p(x) dx, the expected value for x is given by:

<x> = Ʃ xi P(xi) = ∫ x P(x) dx

This part makes sense to me. It was justified to me through the use of weighted averages. However, my teacher then made a hand-wavy move to generalize the above formula. I quote:

This way of calculating the average can be easily generalized, since it depends neither on the number of different events nor on the total number of events; it only depends on the probabilities of all the different possibilities. So we can consider an experiment where we are measuring some quantity x, and all the possible outcomes are x1, x2, ... , xn. If we denote the probability of the outcome xi by P(xi), then we can write the average of x as

<x> = Ʃ xi P(xi) = ∫ x P(x) dx (17)

We may also be interested in calculating the average of some given function of x, call it f(x). The different possible values of f(x) are f(x1), f(x2), ... , f(xn), and the probability P(f(xi)) of the value f(xi) is, of course, the same as the probability for x to have the value xi, i.e. P(xi):

P(f(xi)) = P(xi)

We can now use the rule (17) to find the average of f(x)

<f> = Ʃ f(xi) P(f(xi)) = ∫ f(x) P(f(x)) dx
<f> = Ʃ f(xi) P(xi) = ∫ f(x) P(x) dx

-end of quote-

It's this last part I don't understand:

P(f(xi)) = P(xi)

I don't see how this can be true for anything other than f(xi)=xi or P(xi)=(a constant). Can someone please justify this to me?

-----

Edit: after extensive searching, I finally came across this:

http://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician

I've been searching for a long time now, and I still haven't found justification for why this works.
 
Last edited:
  • #2
Yeah, that's a really stupid way of putting it.

The expectation value of a function is just
<f> = Ʃ f(xi) P(xi),
right? That is, it's the average of the function's values weighted by the probabilities of the individual outcomes.

Now, your teacher is using the horrible notation P(f(xi)) for the weight of each f(xi) in the sum; taken literally it's nonsense, but the underlying point is correct.
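
As a quick numerical check of that weighted-average reading, here is a minimal Python sketch (an illustration added here, not part of the original post); the fair die and f(x) = x^2 are assumed examples. It compares the sum Ʃ f(xi) P(xi) with a Monte Carlo average of f over samples drawn from the same distribution.

[code]
import random

# Assumed example: a fair six-sided die and f(x) = x**2.
outcomes = [1, 2, 3, 4, 5, 6]
P = {x: 1 / 6 for x in outcomes}        # P(xi) for each outcome xi
f = lambda x: x ** 2

# Weighted sum: <f> = sum_i f(xi) * P(xi)
weighted = sum(f(x) * P[x] for x in outcomes)

# Monte Carlo estimate: average of f over samples drawn from the distribution
samples = [random.choice(outcomes) for _ in range(200_000)]
monte_carlo = sum(f(x) for x in samples) / len(samples)

print(f"weighted sum: {weighted:.4f}")     # 91/6 = 15.1667
print(f"Monte Carlo:  {monte_carlo:.4f}")  # close to 15.17
[/code]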
 
  • #3
Harrisonized said:
It's this last part I don't understand:

P(f(xi)) = P(xi)

I don't see how this can be true for anything other than f(xi) = xi or P(xi) = (a constant). Can someone please justify this to me?

Edit: after extensive searching, I finally came across this:

http://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician

I've been searching for a long time now, and I still haven't found justification for why this works.

As you have already realized, the result [itex] E f(X) = \sum P(X=x) f(x)[/itex] or [itex] E f(X) = \int f(x) dP(X=x)[/itex] is true, and is known as the "law of the unconscious statistician", because it was reputedly used indiscriminately by statisticians without realizing that it is a *theorem* rather than a definition. It can be found in Section 7 of
http://www.math.harvard.edu/~lauren/154/Outline8-DRV-Expect2009.pdf .
There the discussion is fairly elementary but quite detailed. After reading it you will understand why your teacher's explanation is only partly correct. This link is much more informative than the one you gave above.

RGV
 
Last edited by a moderator:
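
To see why the "law of the unconscious statistician" mentioned in the reply above is a theorem rather than a relabelling, here is a small Python sketch (an illustration added here, not taken from the thread or the linked notes; the distribution and f(x) = x^2 are assumed examples). It computes E[f(X)] two ways: by building the distribution of Y = f(X) explicitly and summing y P(Y=y), and directly via Ʃ f(xi) P(xi). The sums agree even though P(Y = f(xi)) is not equal to P(X = xi) when several xi map to the same value, which is exactly the point the teacher's notation glosses over.

[code]
from collections import defaultdict

# Assumed example: X takes values -2..2 with the probabilities below,
# and f(x) = x**2 is non-injective (f(-1) = f(1), f(-2) = f(2)).
P_X = {-2: 0.1, -1: 0.2, 0: 0.4, 1: 0.2, 2: 0.1}
f = lambda x: x ** 2

# Way 1: build the distribution of Y = f(X) explicitly, then E[Y] = sum_y y * P(Y=y).
P_Y = defaultdict(float)
for x, p in P_X.items():
    P_Y[f(x)] += p                       # group every x that maps to the same f(x)
E_via_Y = sum(y * p for y, p in P_Y.items())

# Way 2: the "unconscious statistician" sum, E[f(X)] = sum_x f(x) * P(X=x).
E_via_X = sum(f(x) * p for x, p in P_X.items())

print(dict(P_Y))          # {4: 0.2, 1: 0.4, 0: 0.4} -- note P(Y=1) != P(X=1)
print(E_via_Y, E_via_X)   # both 1.2
[/code]

Grouping the terms of the second sum by the value of f(x) reproduces the first sum term by term, which is the whole content of the theorem.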

FAQ: Simple Expectation Value Question

What is a Simple Expectation Value Question?

A Simple Expectation Value Question is a type of question that asks for the average (expected) value of a quantity measured in a given experiment or system. It is commonly used in physics, statistics, and other fields of science to calculate the expected results of a particular scenario.

How do you calculate the Simple Expectation Value?

The Simple Expectation Value is calculated by multiplying each possible outcome of a given experiment by its corresponding probability and then summing all the products. It is represented by the symbol E and can be written as E = ∑ xi P(xi), where xi represents each possible outcome and P(xi) represents the probability of that outcome.
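
For instance, a minimal Python sketch of this sum, assuming a fair six-sided die as the example:

[code]
# Assumed example: the expectation value of a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]
P = {x: 1 / 6 for x in outcomes}   # P(xi) for each outcome

# E = sum_i xi * P(xi)
E = sum(x * P[x] for x in outcomes)
print(E)   # 3.5 (up to floating-point rounding)
[/code]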

What is the significance of the Simple Expectation Value in scientific experiments?

The Simple Expectation Value is a useful tool in scientific experiments as it allows researchers to estimate the most likely outcome of a particular scenario. It helps in making predictions and analyzing the results of experiments.

What are some examples of Simple Expectation Value questions?

Some examples of Simple Expectation Value questions include: What is the average height of students in a class? What is the expected value of a single roll of a fair die? What is the expected value of a stock over the next year? These types of questions involve determining the average outcome of a particular event or scenario, weighted by how likely each outcome is.

How is the Simple Expectation Value different from the Mean or Average?

The Simple Expectation Value and the Mean or Average are closely related, but they are not the same. The Mean or Average of a set of observed values is their sum divided by the number of values, while the Simple Expectation Value weights each possible value by its probability. In other words, the Simple Expectation Value is a probability-weighted average, while the Mean is an unweighted average; the two coincide when all outcomes are equally likely.
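
As an illustrative sketch (the loaded-die probabilities below are assumed, not taken from the text), the difference shows up as soon as the outcomes are not equally likely:

[code]
# Assumed example: a loaded die that favours the high faces.
faces = [1, 2, 3, 4, 5, 6]
P = {1: 0.05, 2: 0.05, 3: 0.10, 4: 0.10, 5: 0.30, 6: 0.40}  # probabilities sum to 1

plain_mean = sum(faces) / len(faces)        # unweighted average of the possible values
expectation = sum(x * P[x] for x in faces)  # probability-weighted average

print(plain_mean)    # 3.5
print(expectation)   # 4.75
[/code]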
