Some considerations about conditional normal distribution....

In summary: the question raised on the Basic Probability and Statistics forum was whether, for two random variables X and Y that are not independent but whose marginal distributions (and one conditional distribution) are normal, the other conditional distribution must also be normal. After the definitions of conditional probability density functions were clarified, it was concluded that the answer is yes, on account of the symmetry of those definitions. This was supported by explicit expressions for $f_{Y|X}$ and $f_{X|Y}$, both of which have the form of normal densities.
  • #1
chisigma
The scope of this thread is to give as complete an answer as possible to the question posed two days ago by the user simon11 on the Basic Probability and Statistics forum...

Assume two random variables X and Y are not independent. If P(X), P(Y) and P(Y|X) are all normal, must P(X|Y) also be normal, or not necessarily?...

My 'almost automatic' answer was 'yes!... P(X|Y) must necessarily be a normal distribution too...', but other members of MHB expressed criticism or doubts about that, so I intend to clarify the aspects of the problem that are not sufficiently clear. The first step is to recall the definition of the conditional distribution function. According to...

Conditional probability distribution - Wikipedia, the free encyclopedia

... if the r.v. X has p.d.f. $f_{X}(x)$, the r.v. Y has p.d.f. $f_{Y}(y)$, and X and Y have joint density function $f_{X,Y}(x,y)$, then the conditional probability density functions of X and Y, each conditioned on the other, are...

$\displaystyle f_{Y} (y|X=x) = f_{Y|X} (x,y) = \frac{f_{X,Y} (x,y)}{f_{X}(x)}$ (1)

$\displaystyle f_{X} (x|Y=y) = f_{X|Y} (x,y) = \frac{f_{X,Y} (x,y)}{f_{Y}(y)}$ (2)

Very well!... now the basic definitions (1) and (2) already give the answer to simon11's question... why?... observing (1) and (2), their intrinsic symmetry with respect to X and Y is fully evident, so it is always possible to swap the roles of X and Y: if $f_{X}$, $f_{Y}$ and $f_{Y|X}$ share a property, whatever that property is, then $f_{X|Y}$ shares it too. After some marginal clarifications, simon11 seemed satisfied with this answer. One member of the MHB staff, however, was not, and asked for a 'formal proof'. Well!... to provide one, the first step is to find, under the assumption that X and Y are normal r.v., explicit expressions for $f_{X}$, $f_{Y}$ and $f_{X,Y}$, and then use (1) and (2) to obtain $f_{X|Y}$ and $f_{Y|X}$. The first two present no difficulty...
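For concreteness, the 'swap' can be written explicitly: eliminating $f_{X,Y}$ between (1) and (2) gives a density form of Bayes' theorem,

$\displaystyle f_{X|Y} (x,y) = \frac{f_{Y|X} (x,y)\ f_{X}(x)}{f_{Y}(y)}$

... so $f_{X|Y}$ is entirely determined by $f_{X}$, $f_{Y}$ and $f_{Y|X}$, the three densities that are assumed normal.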

$\displaystyle f_{X}(x)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{X}}\ e^{- \frac{(x-\mu_{X})^{2}}{2\ \sigma^{2}_{X}}}$ (3)

$\displaystyle f_{Y}(y)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{Y}}\ e^{- \frac{(y-\mu_{Y})^{2}}{2\ \sigma^{2}_{Y}}}$ (4)

... but what about $f_{X,Y}$?... 'Monster Wolfram' helps us...

Bivariate Normal Distribution -- from Wolfram MathWorld

$\displaystyle f_{X,Y} (x,y)= \frac{1}{2\ \pi\ \sigma_{X}\ \sigma_{Y}\ \sqrt{1-\rho^{2}}}\ e^{- \frac{z}{2\ (1-\rho^{2})}}$ (5)

... where...

$\displaystyle z= \frac{(x-\mu_{X})^{2}}{\sigma^{2}_{X}} - 2\ \frac{\rho\ (x-\mu_{X})\ (y-\mu_{Y})}{\sigma_{X}\ \sigma_{Y}} + \frac{(y-\mu_{Y})^{2}}{\sigma^{2}_{Y}}$ (6)

$\displaystyle \rho= \text{cor}\ (X,Y)= \frac{V_{X,Y}}{\sigma_{X}\ \sigma_{Y}}$ (7)
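
A quick check: setting $\rho = 0$ in (5) and (6) makes the cross term vanish, and the joint density factors into the product of the marginals (3) and (4)...

$\displaystyle \rho = 0\ \implies\ f_{X,Y}(x,y)= \frac{1}{2\ \pi\ \sigma_{X}\ \sigma_{Y}}\ e^{- \frac{(x-\mu_{X})^{2}}{2\ \sigma^{2}_{X}} - \frac{(y-\mu_{Y})^{2}}{2\ \sigma^{2}_{Y}}} = f_{X}(x)\ f_{Y}(y)$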

Usually $\rho$ is called the 'correlation' of X and Y and $V_{X,Y}$ the 'covariance' of X and Y. Formulas (6) and (7) are very interesting and 'suggestive' because of the presence of the term $\rho$. If X and Y are independent [or, more precisely, uncorrelated...], then $\rho=0$; if not [and that is the case proposed by simon11...], an 'extra term' must be taken into account. Now we are able, using (1) and (2), to compute $f_{Y|X} (x,y)$ and $f_{X|Y} (x,y)$ with a simple division...

$\displaystyle f_{Y|X} (x,y)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{Y}\ \sqrt{1-\rho^{2}}}\ e^{- \frac{u}{2\ (1-\rho^{2})}}$ (8)

... where...

$\displaystyle u= \frac{(y-\mu_{Y})^{2}}{\sigma^{2}_{Y}} -2\ \rho\ \frac{(y-\mu_{Y})\ (x-\mu_{X})}{\sigma_{Y}\ \sigma_{X}} + \rho^{2}\ \frac{(x-\mu_{X})^{2}}{\sigma^{2}_{X}}$ (9)

$\displaystyle f_{X|Y} (x,y)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{X}\ \sqrt{1-\rho^{2}}}\ e^{- \frac{v}{2\ (1-\rho^{2})}}$ (10)

... where...

$\displaystyle v= \frac{(x-\mu_{X})^{2}}{\sigma^{2}_{X}} -2\ \rho\ \frac{(x-\mu_{X})\ (y-\mu_{Y})}{\sigma_{X}\ \sigma_{Y}} + \rho^{2}\ \frac{(y-\mu_{Y})^{2}}{\sigma^{2}_{Y}}$ (11)
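
It is worth noting that the exponents (9) and (11) are perfect squares, which makes the normality of the conditional densities explicit. For (9), in fact,

$\displaystyle u = \left( \frac{y-\mu_{Y}}{\sigma_{Y}} - \rho\ \frac{x-\mu_{X}}{\sigma_{X}} \right)^{2}$

... so that (8) can be rewritten as

$\displaystyle f_{Y|X} (x,y)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{Y}\ \sqrt{1-\rho^{2}}}\ e^{- \frac{\left( y-\mu_{Y} - \rho\ \frac{\sigma_{Y}}{\sigma_{X}}\ (x-\mu_{X}) \right)^{2}}{2\ \sigma^{2}_{Y}\ (1-\rho^{2})}}$

... i.e., conditionally on $X=x$, Y is normal with mean $\mu_{Y|X} = \mu_{Y} + \rho\ \frac{\sigma_{Y}}{\sigma_{X}}\ (x-\mu_{X})$ and variance $\sigma^{2}_{Y|X} = \sigma^{2}_{Y}\ (1-\rho^{2})$; the same holds for (10) and (11) with the roles of X and Y swapped.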

So 'finally' we have arrived at explicit expressions for $f_{Y|X}$ and $f_{X|Y}$ in the general case where X and Y are not independent. As I said before, $f_{Y|X}$ is obtained from $f_{X|Y}$ by swapping the roles of X and Y... of course!... now by integration one can compute, if desired, $\mu_{Y|X}$, $\mu_{X|Y}$, $\sigma^{2}_{Y|X}$, $\sigma^{2}_{X|Y}$ and other interesting parameters... now I'm a little tired and that will be done, if needed, in a later post... Kind regards $\chi$ $\sigma$
 
  • #2
chisigma said:

You seem to be assuming that normal marginals imply joint normality. This is false.

If you could prove that both marginals being normal plus one conditional being normal implies that the joint distribution is normal, you would be done. But this requires that the variance of the conditional be independent of the conditioning value, which would have to be justified.

CB
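
A standard counterexample illustrates the first point: let $X \sim N(0,1)$, let S be an independent random sign with $P(S=1)=P(S=-1)=\frac{1}{2}$, and set $Y = S\ X$. Then Y is also $N(0,1)$, but

$\displaystyle P(X+Y=0) = P(S=-1) = \frac{1}{2}$

... so $X+Y$ is not normally distributed and $(X,Y)$ cannot be jointly normal, even though both marginals are. Note that in this example $f_{Y|X}$ is not normal either [given $X=x$, Y takes only the values $\pm x$], so it does not settle the question as posed by simon11; it only shows that normal marginals alone do not imply joint normality.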
 

FAQ: Some considerations about conditional normal distribution....

What is a conditional normal distribution?

A conditional normal distribution is the distribution of a random variable (or set of variables) given the value of another variable, in the case where that conditional distribution is normal. Once the conditioning value is fixed, the remaining variable follows a normal distribution whose parameters may depend on that value.

How is a conditional normal distribution different from a standard normal distribution?

An ordinary (unconditional) normal distribution describes the probability of a variable without any conditions or restrictions. In contrast, a conditional normal distribution describes a variable after conditioning on a specific value of another variable, and its mean and variance can change with that conditioning value.

What are some practical applications of conditional normal distribution?

The conditional normal distribution is commonly used in finance, economics, and other fields where variables depend on certain conditions. It also appears in predictive modeling, risk analysis, and decision-making processes.

How is a conditional normal distribution calculated?

For jointly normal variables the conditional distribution has a closed form: given $X = x$, Y is normal with mean $\mu_{Y} + \rho\ \frac{\sigma_{Y}}{\sigma_{X}}\ (x-\mu_{X})$ and variance $\sigma^{2}_{Y}\ (1-\rho^{2})$, as derived above. The calculation can be done by hand or with statistical software.
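
As a minimal illustration of such a software check [assuming SciPy is available; the parameter values below are arbitrary], one can compare the conditional density obtained by dividing the joint density by the marginal, as in definition (1), with the closed-form conditional normal:

import numpy as np
from scipy import stats

# Arbitrary (illustrative) bivariate-normal parameters
mu_x, mu_y = 1.0, -2.0
sigma_x, sigma_y = 2.0, 0.5
rho = 0.6

x_obs = 1.5   # conditioning value X = x
y = 0.0       # point at which the conditional density is evaluated

# Closed-form parameters of Y | X = x for a bivariate normal
cond_mean = mu_y + rho * (sigma_y / sigma_x) * (x_obs - mu_x)
cond_var = sigma_y**2 * (1.0 - rho**2)

# Definition (1): f_{Y|X}(y|x) = f_{X,Y}(x,y) / f_X(x)
cov = [[sigma_x**2, rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2]]
joint = stats.multivariate_normal(mean=[mu_x, mu_y], cov=cov)
marginal_x = stats.norm(mu_x, sigma_x)

by_definition = joint.pdf([x_obs, y]) / marginal_x.pdf(x_obs)
closed_form = stats.norm(cond_mean, np.sqrt(cond_var)).pdf(y)

print(by_definition, closed_form)   # the two values agree to machine precision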

What are some limitations of using conditional normal distribution?

One limitation is that, in the jointly normal case, the conditional mean is a linear function of the conditioning value, so the model assumes a linear relationship between the variables, which may not hold in real-world scenarios. Accurately estimating the distribution parameters also requires a reasonably large sample size. Finally, it may not be suitable when the data are clearly non-normal or contain significant outliers.
