Bivariate normal distribution from normal linear combination

In summary, the proposition states that linear combinations of independent normal random variables are again normal, and that a pair of such combinations can realize any prescribed bivariate normal distribution. The first claim follows from the moment-generating function: the MGF of a sum factors into the individual MGFs, yielding a normal distribution whose mean is the sum of the individual means and whose variance is the sum of the individual variances. The change-of-variables theory of section 5.4 bears on the converse, which amounts to finding coefficients that reproduce the prescribed means and variance-covariance matrix. That part can be settled by solving a small system of equations; doing so by hand would involve heavy algebra, which a computer algebra package such as Maple makes manageable.
  • #1
fisher garry
[Attachment: upload_2019-1-8_17-53-23.png (statement of the proposition)]

I can't prove this proposition. I have, however, managed to prove that the sum of independent normal RVs is also normal by looking at its MGF.

By independence,
$$E(e^{t(X_1+X_2+\cdots+X_n)})=E(e^{tX_1})E(e^{tX_2})\cdots E(e^{tX_n})$$
The MGF of a normal distribution with mean ##\mu## and variance ##\sigma^2## is $$e^{\mu t}e^{\frac{t^2 \sigma^2}{2}}$$ so
$$E(e^{t(X_1+X_2+\cdots+X_n)})=e^{\mu_1 t}e^{\frac{t^2 \sigma_1^2}{2}}e^{\mu_2 t}e^{\frac{t^2 \sigma_2^2}{2}}\cdots e^{\mu_n t}e^{\frac{t^2 \sigma_n^2}{2}}=e^{(\mu_1+\mu_2+\cdots+\mu_n) t}e^{\frac{t^2 (\sigma_1^2+\sigma_2^2+\cdots+\sigma_n^2)}{2}}$$

This is the MGF of a normal distribution with mean $$\mu_1+\mu_2+\cdots+\mu_n$$ and variance $$\sigma_1^2+\sigma_2^2+\cdots+\sigma_n^2$$ so the sum is normal.
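The same product argument should extend to a general linear combination, since scaling a normal RV only rescales its MGF: ##M_{aX}(t)=E(e^{taX})=M_X(at)##. A sketch of that step:
$$E(e^{t(a_1X_1+a_2X_2+\cdots+a_nX_n)})=\prod_{i=1}^{n}M_{X_i}(a_it)=e^{(a_1\mu_1+a_2\mu_2+\cdots+a_n\mu_n)t}e^{\frac{t^2(a_1^2\sigma_1^2+a_2^2\sigma_2^2+\cdots+a_n^2\sigma_n^2)}{2}}$$
which is the MGF of ##N\big(\sum_i a_i\mu_i,\ \sum_i a_i^2\sigma_i^2\big)##.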

I know about the theory in section 5.4. Some of it is presented here:
$$g(y_1,y_2)=f(x_1,x_2)\left|\frac{\partial(x_1,x_2)}{\partial(y_1,y_2)}\right|$$
Can anyone show how the proof they refer to, using section 5.4 and matrix theory, goes?
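(For a linear change of variables the Jacobian in that formula is constant, which is presumably where the matrix theory enters. Assuming ##(Y_1,Y_2)^T=A(X_1,X_2)^T## with ##A## an invertible ##2\times 2## matrix,
$$g(y_1,y_2)=f\big(A^{-1}(y_1,y_2)^T\big)\frac{1}{|\det A|}$$
since ##\frac{\partial(x_1,x_2)}{\partial(y_1,y_2)}=\det(A^{-1})=1/\det A##.)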
 

  • #2

I think you are looking at the wrong problem. The description implies that
$$
\begin{array}{rcl}
U&=& a_1 X_1 + a_2 X_2 + \cdots + a_n X_n \\
V&=& b_1 X_1 + b_2 X_2 + \cdots + b_n X_n
\end{array}
$$
Here the ##a_i## and ##b_i## are constants.

You can work out ##EU, EV, \text{Var}(U), \text{Var}(V)## and ##\text{Cov}(U,V)## in terms of the ##a_i, b_i## and the original ##\mu, \sigma## of the ##X_i##. Thus, you have a formula for the means and variance-covariance matrix in terms of ##a_i , b_i, i=1,2,\ldots, n##.
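Spelling that out (assuming the ##X_i## are independent with ##EX_i=\mu_i## and ##\text{Var}(X_i)=\sigma_i^2##):
$$EU=\sum_i a_i\mu_i,\qquad EV=\sum_i b_i\mu_i,$$
$$\text{Var}(U)=\sum_i a_i^2\sigma_i^2,\qquad \text{Var}(V)=\sum_i b_i^2\sigma_i^2,\qquad \text{Cov}(U,V)=\sum_i a_ib_i\sigma_i^2,$$
the last equality using ##\text{Cov}(X_i,X_j)=0## for ##i\neq j##.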

I think that what the question is asking for is the converse: given ##\mu, \sigma##, the means and the variance-covariance matrix of ##U,V##, it wants you to either find appropriate ##a_i, b_i## that will work (that is, give the right ##U,V##), or at least to show that such ##a_i, b_i## exist, even if not easy to find. Using moment-generating functions (as you did above) should be very helpful for this purpose.
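One way the MGF helps: the joint MGF of ##(U,V)## factors over the independent ##X_i##, and (a sketch)
$$M_{U,V}(s,t)=E\big(e^{sU+tV}\big)=\prod_{i=1}^{n}M_{X_i}(a_is+b_it)=\exp\!\Big(s\,EU+t\,EV+\tfrac{1}{2}\big(s^2\,\text{Var}(U)+2st\,\text{Cov}(U,V)+t^2\,\text{Var}(V)\big)\Big)$$
which is exactly the MGF of a bivariate normal with the given means and variance-covariance matrix.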
 
  • #3

I won't give a lot of details, but will expand a bit on my previous post.

Presumably, if you are given the means ##EU, EV## and the variance-covariance matrix of the pair ##(U,V)##, you are allowed to specify just how to make up ##U,V## in terms of some iid random variables ##X_1, X_2, \ldots, X_n##. That is, we are allowed to choose an ##n## in an attempt to prove the result.

When we are given the above information about ##U## and ##V##, we are given five items of data: two means, two variances, and one covariance. So, we will need at least five "coefficients" altogether.

I tried ##U = a_1 X_1 + a_2 X_2## and ##V = b_1 X_1 + b_2 X_2 + b_3 X_3##. The means, variances and covariance of ##(U,V)## can be expressed algebraically in terms of the ##a_i, b_j## and the underlying ##\mu, \sigma## of the ##X_k.## Thus, we obtain five equations in the five variables ##a_1, a_2, b_1, b_2,b_3##.
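Writing the system out (with ##m_U, m_V, v_U, v_V, c## denoting the prescribed means, variances and covariance; the ##X_k## are iid with common ##\mu, \sigma##):
$$(a_1+a_2)\mu=m_U,\qquad (b_1+b_2+b_3)\mu=m_V,$$
$$(a_1^2+a_2^2)\sigma^2=v_U,\qquad (b_1^2+b_2^2+b_3^2)\sigma^2=v_V,\qquad (a_1b_1+a_2b_2)\sigma^2=c.$$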

I managed to get a solution, thus proving the result asked for; however, it would possibly take many hours (perhaps days) of algebraic work to do it manually, so I saved myself a lot of grief by using the computer algebra package Maple to do the heavy lifting.
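For anyone without Maple, a rough free equivalent of that computation in Python with SymPy (a sketch only: the target moments ##m_U=1, m_V=2, v_U=2, v_V=5, c=1## and ##\mu=\sigma=1## are made-up numbers, and the symbolic solve can be slow):

Code:
import sympy as sp

# Coefficients of U = a1*X1 + a2*X2 and V = b1*X1 + b2*X2 + b3*X3,
# where X1, X2, X3 are iid N(mu, sigma^2); here mu = sigma = 1.
a1, a2, b1, b2, b3 = sp.symbols('a1 a2 b1 b2 b3', real=True)

# Made-up target moments: EU, EV, Var(U), Var(V), Cov(U, V).
mU, mV, vU, vV, c = 1, 2, 2, 5, 1

eqs = [
    sp.Eq(a1 + a2, mU),                # EU       = (a1 + a2)*mu
    sp.Eq(b1 + b2 + b3, mV),           # EV       = (b1 + b2 + b3)*mu
    sp.Eq(a1**2 + a2**2, vU),          # Var(U)   = (a1^2 + a2^2)*sigma^2
    sp.Eq(b1**2 + b2**2 + b3**2, vV),  # Var(V)   = (b1^2 + b2^2 + b3^2)*sigma^2
    sp.Eq(a1*b1 + a2*b2, c),           # Cov(U,V) = (a1*b1 + a2*b2)*sigma^2
]

# Five polynomial equations in five unknowns; any real solution
# gives coefficients realizing the prescribed bivariate moments.
for sol in sp.solve(eqs, [a1, a2, b1, b2, b3], dict=True):
    print(sol)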

I don't see how matrix properties would be useful here, but maybe the person setting the problem had another method in mind.
 

FAQ: Bivariate normal distribution from normal linear combination

What is a bivariate normal distribution?

A bivariate normal distribution is a probability distribution that describes the joint behavior of two variables, each of which is normally distributed. Its density is a bell-shaped surface, and it has five parameters: a mean and a variance for each variable, plus the correlation (or covariance) between them.

How is a bivariate normal distribution different from a univariate normal distribution?

A univariate normal distribution describes the probability distribution of a single variable, while a bivariate normal distribution describes the joint probability distribution of two variables. In general the two variables may be correlated; they are independent exactly when their correlation coefficient is zero.

What is a normal linear combination?

A normal linear combination is a new variable formed by adding together multiples of two or more normally distributed variables. Provided the original variables are independent (or, more generally, jointly normal), the resulting variable is also normally distributed, and its mean and variance can be calculated from the means, variances, and covariances of the original variables.
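For example, for independent ##X\sim N(\mu_X,\sigma_X^2)## and ##Y\sim N(\mu_Y,\sigma_Y^2)## and constants ##a,b##:
$$aX+bY\sim N\big(a\mu_X+b\mu_Y,\ a^2\sigma_X^2+b^2\sigma_Y^2\big)$$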

How is a bivariate normal distribution related to a normal linear combination?

A bivariate normal distribution arises naturally from normal linear combinations: if ##U## and ##V## are two linear combinations of the same collection of independent normal variables, the pair ##(U,V)## is bivariate normal. By choosing the coefficients appropriately, one can realize any desired means, variances, and correlation coefficient between ##-1## and ##1##; proving this is exactly the problem discussed in the thread above.

What are some applications of the bivariate normal distribution and normal linear combination?

The bivariate normal distribution and normal linear combination are commonly used in statistical analysis and modeling, particularly in fields such as finance, economics, and psychology. They are also used in machine learning algorithms, such as principal component analysis, to reduce the dimensionality of data and identify underlying patterns and relationships between variables.
