[probability theory] prove E(X|X) = X

In summary: the author wants to show that for continuous random variables, E(Xh(Y)|Y) = h(Y) E(X|Y). The equality seems obvious enough, but they are unsure how to actually prove it.
  • #1
nonequilibrium

Homework Statement


Well, not really, but in essence that's the part I'm having trouble with. The actual question is
Show that for continuous random variables, [itex]E(Xh(Y)|Y) = h(Y) E(X|Y)[/itex].
The equality seems obvious enough, but I'm unsure how to actually prove it...

Homework Equations


N/A

The Attempt at a Solution


So it seems I have to prove that [itex]P( \{ E(Xh(Y)|Y) = h(Y) E(X|Y) \} ) = 1[/itex].

Can anybody tell me how they would start?

I shall say how I would proceed if pressed:

As I see it, E(Xh(Y)|Y) is actually a function with possible values [itex]E(X h(Y) | Y = y_0)[/itex], and we can write [itex]E(X h(Y) | Y = y_0) = \iint x h(y) f_{X,Y|Y=y_0}(x,y) \,\mathrm d x \,\mathrm d y[/itex]. Since [itex]f_{X,Y|Y=y_0}(x,y) = f_{X|Y=y_0}(x)\, \delta(y-y_0)[/itex] (or perhaps this needs to be proven too?) we get [itex]E(X h(Y) | Y = y_0) = h(y_0) \int x f_{X|Y=y_0}(x) \,\mathrm d x = h(y_0) E(X|Y=y_0)[/itex], and hence E(Xh(Y)|Y) is the same random variable as h(Y) E(X|Y).
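
Spelling out the integration steps (taking the delta factorization above for granted):
[tex]
\begin{aligned}
E(X h(Y) \,|\, Y = y_0) &= \iint x\, h(y)\, f_{X,Y|Y=y_0}(x,y)\, \mathrm{d}x\, \mathrm{d}y \\
&= \iint x\, h(y)\, f_{X|Y=y_0}(x)\, \delta(y - y_0)\, \mathrm{d}x\, \mathrm{d}y \\
&= h(y_0) \int x\, f_{X|Y=y_0}(x)\, \mathrm{d}x \\
&= h(y_0)\, E(X \,|\, Y = y_0).
\end{aligned}
[/tex]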

Is this acceptable?
 
  • #2
looks like you're on the right track, but careful as you're mixing notation a little

also an expectation is NOT a random variable

As X=x_0, the conditional probability distribution of X becomes a delta function
[tex] p_{X}(x|X=x_0)=\delta(x-x_0) [/tex]

Also, you can write the joint distribution in terms of a single PDF and a conditional PDF
[tex] p_{X,Y}(x,y)=p_{Y}(y|X=x)p_X(x) [/tex]

and the conditional case becomes
[tex] p_{X,Y}(x,y|X=x_0)=p_{Y}(y|X=x_0)p_X(x|X=x_0) [/tex]
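
Swapping the roles of X and Y, the same two facts give the factorization used in the original post:
[tex] p_{X,Y}(x,y|Y=y_0) = p_X(x|Y=y_0)\, p_Y(y|Y=y_0) = p_X(x|Y=y_0)\, \delta(y - y_0) [/tex]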
 
  • #3
Thanks for posting. I'm not sure what you're getting at in your post, though; is it a way of proving the delta-distribution equality? (In which case I get what you're saying and how it helps.)

And as for:
also an expectation is NOT a random variable
There I must disagree: E(X h(Y)|Y) certainly is a random variable, or at least that is what is meant; perhaps the book's notation is confusing and you're accustomed to writing it differently. It denotes the random variable [itex]y_0 \mapsto E(X h(Y) | Y = y_0)[/itex]. It is like how E(X|Y) is also called a random variable; after all, otherwise the fundamental equality E(E(X|Y))=E(X) could have no meaning.
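
(For the continuous case, that equality unwinds as
[tex] E\bigl(E(X|Y)\bigr) = \int E(X|Y=y)\, f_Y(y)\, \mathrm{d}y = \iint x\, f_{X|Y=y}(x)\, f_Y(y)\, \mathrm{d}x\, \mathrm{d}y = \iint x\, f_{X,Y}(x,y)\, \mathrm{d}x\, \mathrm{d}y = E(X), [/tex]
which only makes sense if E(X|Y) is read as a function of the random variable Y.)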
 
  • #4
Ok, yeah, I think I get what you mean about E(X|Y); I missed it at first and don't think I've dealt with these a heap. So the way I read it, effectively E(X|Y) = f(Y): the value will depend on the distribution of X, but the stochastic element comes from the variable Y. Re-reading, I see you pretty much had the delta part pegged, but that should back it up.

wouldn't it just be sufficient to show that
[tex]E(Xh(Y)|Y=y_0)= h(y_0)E(X|Y=y_0) \quad \forall y_0 [/tex]

this seems to be the direction you've headed in so far, and it looks acceptable

otherwise do you have a different definition for E(X|Y) to start from?
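
As a sanity check, here's a quick simulation of that pointwise identity (a minimal sketch; the joint distribution and the function h are arbitrary illustrative choices):
[code]
import random

# Check E(X h(Y) | Y = y0) = h(y0) E(X | Y = y0) on a toy example:
# Y uniform on {0, 1}; given Y = y, X is uniform on {y, y+1, y+2}.
random.seed(0)

def h(y):
    return 2 * y + 1  # an arbitrary function of Y

samples = [(y + random.randint(0, 2), y)
           for y in (random.randint(0, 1) for _ in range(200_000))]

for y0 in (0, 1):
    sub = [(x, y) for (x, y) in samples if y == y0]
    lhs = sum(x * h(y) for x, y in sub) / len(sub)   # E(X h(Y) | Y = y0)
    rhs = h(y0) * sum(x for x, _ in sub) / len(sub)  # h(y0) E(X | Y = y0)
    print(y0, lhs, rhs)  # equal: h(Y) is constant on the event {Y = y0}
[/code]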
 
  • #5
So yeah, I think what you have done is reasonable if you show the integration steps.
 
  • #6
lanedance said:
also an expectation is NOT a random variable

A conditional expectation such as E(X|Y) _is_ a random variable, whose value on the event {Y = y} is E(X|Y = y). (Such matters typically do not occur in Probability 101, but do appear very much in senior or graduate level probability.)
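
For reference, one standard formulation at that level (not necessarily the textbook's exact definition): E(X|Y) is the σ(Y)-measurable random variable satisfying
[tex] E\bigl[E(X|Y)\,\mathbf{1}_A\bigr] = E\bigl[X\,\mathbf{1}_A\bigr] \quad \text{for every } A \in \sigma(Y). [/tex]
Since h(Y)E(X|Y) is σ(Y)-measurable and satisfies this defining property with Xh(Y) in place of X (approximate h(Y) by simple functions), it equals E(Xh(Y)|Y) almost surely, which is exactly the identity being proved here.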

RGV
 
  • #7
Yeah, it's been a while... but we got there.
 

Related to [probability theory] prove E(X|X) = X

1. What is the meaning of E(X|X) in probability theory?

E(X|X) is the conditional expectation of a random variable X given X itself, i.e., given full knowledge of the value of X. Once the value of X is known there is no uncertainty left, so the expected value is simply that value.

2. How is E(X|X) calculated?

For a discrete random variable, E(X|X=x) = ∑x' P(X=x' | X=x) * x'. Since P(X=x' | X=x) equals 1 when x' = x and 0 otherwise, the sum collapses to x; hence E(X|X) = X.

3. Can you provide an example of calculating E(X|X)?

Suppose we roll a fair six-sided die and let X be the number rolled. Unconditionally, E(X) = (1/6)*1 + (1/6)*2 + (1/6)*3 + (1/6)*4 + (1/6)*5 + (1/6)*6 = 3.5. But conditionally on the roll, say X = 4, the expected value is E(X|X=4) = 4, not 3.5: once we know the number rolled, the expectation equals that number. This is exactly what E(X|X) = X asserts.
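
A quick simulation makes the contrast concrete (a minimal sketch; the variable names are illustrative):
[code]
import random

# Estimate E(X | X = k) empirically: conditioning on X = k leaves no
# randomness, so each conditional mean comes out exactly k.
random.seed(0)
rolls = [random.randint(1, 6) for _ in range(60_000)]

for k in range(1, 7):
    matching = [x for x in rolls if x == k]
    print(k, sum(matching) / len(matching))  # prints k and k (as a float)

print("E(X) =", sum(rolls) / len(rolls))  # about 3.5, the unconditional mean
[/code]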

4. Why is E(X|X) equal to X?

Given X, the value of X is completely determined, so there is no remaining randomness to average over: the conditional distribution of X given X = x is a point mass at x, whose mean is x. This is sometimes called the "taking out what is known" property of conditional expectation. It is consistent with the law of iterated expectations, E(E(X|X)) = E(X), but the two statements are distinct.

5. What is the significance of E(X|X) in probability theory?

E(X|X) itself is mostly a consistency check, but it is the simplest instance of conditional expectation E(X|Y), which allows us to compute the expected value of one random variable from the partial information carried by another. Conditional expectation has many applications in statistics and decision-making, such as Bayesian analysis and predictive modeling.
