Independence of RVs with same distribution

In summary, the conversation discusses the independence and the expectation of the product of two Gaussian random variables with zero mean and unit variance. It is concluded that sharing the same distribution does not by itself make the variables independent: if they are independent, the expectation of their product factorises into the product of their individual expectations, whereas if they are equal it reduces to E[X^2]. These are the two extremes, with partial dependence possible in between.
  • #1
nexp
[Solved] Independence of RVs with same distribution

Hey all,

Let's say we have two Gaussian random variables X, Y, each with zero mean and unit variance. Is it correct to say that [tex]P(X|Y) = P(X)[/tex]?

In other words, suppose that we want to compute the expectation of their product [tex]\operatorname{E}[XY][/tex]. Is the following correct? I.e. does their joint distribution factorise?

[tex]E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}x y\: p(x) p(y)\: dx dy[/tex]
[tex]= \int_{-\infty}^{\infty}x \:p(x)\: \operatorname{d} x \int_{-\infty}^{\infty} y \: p(y) \: \operatorname{d} y[/tex]
[tex]= \operatorname{E}[X]\operatorname{E}[Y] \nonumber[/tex]
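
If that factorisation were valid, then since both variables have zero mean, each integral would vanish and

[tex]\operatorname{E}[XY] = \operatorname{E}[X]\operatorname{E}[Y] = 0.[/tex]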

Many Thanks.

Update

I have now figured out the answer to the above questions. I'll post it here for anyone who is interested.

If X and Y have the same distribution, then we can write [tex]P(X|X) = 1 \neq P(X)[/tex].

Now looking again at expectations. From the above, we have that

[tex]E[XY]=E[X^2][/tex]
[tex]=\int_{-\infty}^{\infty}x^2\, p(x) \: dx[/tex]

which for zero mean and unit variance equals Var(X) = 1. The factorisation into E[X]E[Y] = 0 therefore fails, giving a negative answer to the original question.
 
  • #2
It looks like you are confusing two different things. X and Y are assumed to be normal with the same distribution.

However, whether or not they are independent is a completely separate question. If they are independent, then your analysis before the update is correct. On the other hand, if X = Y, then the comment after the update is valid.

These are the two extreme possibilities (partial dependence in between).
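
As a quick numerical illustration of these cases (a sketch, not part of the original thread, assuming NumPy is available): the Monte Carlo estimate of E[XY] comes out roughly 0 when the variables are independent, 1 when Y = X, and rho for a correlated pair with correlation rho.

[code]
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Case 1: X and Y independent standard normals -> E[XY] = E[X]E[Y] = 0
x = rng.standard_normal(n)
y_indep = rng.standard_normal(n)
print("independent :", np.mean(x * y_indep))          # ~ 0

# Case 2: Y = X (the other extreme) -> E[XY] = E[X^2] = Var(X) = 1
print("Y equals X  :", np.mean(x * x))                # ~ 1

# Case 3: partial dependence, corr(X, Y) = 0.5 -> E[XY] = 0.5
cov = [[1.0, 0.5], [0.5, 1.0]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=n)
print("rho = 0.5   :", np.mean(xy[:, 0] * xy[:, 1]))  # ~ 0.5
[/code]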
 
  • #3
Thanks very much. I understand what's going on better now.
 

FAQ: Independence of RVs with same distribution

What is independence of RVs with same distribution?

Independence of RVs with the same distribution refers to two or more random variables that have the same probability distribution and are not affected by each other's outcomes: the outcome of one variable carries no information about the outcomes of the others.
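
Formally, X and Y are independent exactly when their joint distribution factorises, i.e. for all x and y

[tex]P(X \le x,\, Y \le y) = P(X \le x)\,P(Y \le y).[/tex]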

Why is independence of RVs with same distribution important?

Independence of RVs with same distribution is important because it allows us to make certain assumptions and simplifications in statistical analysis and modeling. It also allows us to use certain statistical tests and methods that rely on the independence assumption.

How do you determine if two RVs have the same distribution?

To determine if two RVs have the same distribution, we can compare their probability density functions or cumulative distribution functions. If they are identical, then the RVs have the same distribution.
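
In practice, when only samples are available, one can compare the empirical distributions, for example with a two-sample Kolmogorov-Smirnov test. A rough sketch (not from the thread), assuming NumPy and SciPy:

[code]
# Compare two empirical samples with a two-sample Kolmogorov-Smirnov test.
# A large p-value is consistent with (but does not prove) equal distributions.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)   # sample from N(0, 1)
y = rng.standard_normal(10_000)   # a second sample from N(0, 1)

stat, p_value = ks_2samp(x, y)
print(f"KS statistic = {stat:.4f}, p-value = {p_value:.4f}")
[/code]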

Can two RVs with the same distribution be correlated?

Yes, it is possible for two RVs with the same distribution to be correlated. This means that there is a relationship between the two variables, but it does not necessarily mean that one variable is causing the other.
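
For a concrete example (a sketch, not from the thread): take X standard normal and set Y = rho*X + sqrt(1 - rho^2)*Z with Z an independent standard normal. Then Y is also standard normal, yet corr(X, Y) = rho.

[code]
# X and Y share the same N(0, 1) distribution but are correlated.
import numpy as np

rng = np.random.default_rng(1)
rho = 0.7
x = rng.standard_normal(100_000)
z = rng.standard_normal(100_000)
y = rho * x + np.sqrt(1 - rho**2) * z            # still N(0, 1)

print("mean(Y)  :", y.mean())                    # ~ 0
print("var(Y)   :", y.var())                     # ~ 1
print("corr(X,Y):", np.corrcoef(x, y)[0, 1])     # ~ rho = 0.7
[/code]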

What is the significance of independence of RVs with same distribution in real-world applications?

Independence of RVs with same distribution is important in many real-world applications, such as in finance, epidemiology, and quality control. It allows us to make accurate predictions and decisions based on statistical models and data analysis.
