Distribution of R from Sum of Squared RVs

In summary, the distribution of the random variable R, the square root of the sum of squares of two independent Gaussian random variables, can be studied via the moment-generating function or characteristic function of X² + Y² followed by an inverse transform. The resulting distribution may not have a well-known name in the general unequal-variance case, but it is a valid distribution that can be used for further calculations and analyses.
  • #1
TranscendArcu

Homework Statement


Let X, Y be two independent Gaussian random variables that do not necessarily share the same mean or the same variance, and neither of which is necessarily standard. If I were to construct an RV of the form
R = √(X² + Y²), how would this RV be distributed?

The Attempt at a Solution



So I've given this a fair amount of thought. At first, I thought it might be distributed as a Rician, but that seems to require that both RVs have the same variance. Eventually I concluded that it might be Chi distributed. However, since I never divide by the standard deviation of either RV (as would be required to construct either a noncentral Chi or a noncentral Chi-Squared), this also fails to work. Does anyone have any ideas for finding the distribution of two RVs combined in this way?
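Since the equal-variance case should indeed be Rician, that special case at least can be checked numerically. Below is a Monte Carlo sketch (the parameter values and variable names are my own arbitrary choices, not from the thread): for σ_X = σ_Y = σ, the candidate density is f(r) = (r/σ²)·exp(−(r² + ν²)/(2σ²))·I₀(rν/σ²) with ν = √(μ_X² + μ_Y²).

```python
import numpy as np

rng = np.random.default_rng(0)
mu_x, mu_y, sigma = 1.0, 2.0, 1.0          # equal variances: the Rician case
n = 1_000_000

x = rng.normal(mu_x, sigma, n)
y = rng.normal(mu_y, sigma, n)
r = np.sqrt(x**2 + y**2)

# Rician density with nu = sqrt(mu_x^2 + mu_y^2); np.i0 is the modified
# Bessel function of the first kind, order zero.
nu = np.hypot(mu_x, mu_y)
grid = np.linspace(0.0, 10.0, 2001)
pdf = (grid / sigma**2) * np.exp(-(grid**2 + nu**2) / (2 * sigma**2)) \
      * np.i0(grid * nu / sigma**2)

def trapezoid(f, xs):
    """Simple trapezoidal rule (avoids numpy version differences)."""
    return float(np.sum((f[:-1] + f[1:]) * np.diff(xs)) / 2.0)

mean_mc = r.mean()                      # empirical mean of R
mean_pdf = trapezoid(grid * pdf, grid)  # mean of the candidate Rician density
print(mean_mc, mean_pdf)                # should agree to a couple of decimals
```

If the two means (and, with a histogram, the whole shapes) agree, the Rician guess holds for equal variances; rerunning with unequal sigmas shows the fit break down.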
 
  • #2


I would approach this problem by first considering the properties of the RV R that we are trying to find the distribution for. We know that R is the square root of the sum of the squares of two Gaussian random variables, X and Y. This means that R is a non-negative continuous random variable.

Next, I would consider the moment-generating function (MGF) of S = X² + Y², defined as E[e^(tS)]. Because X and Y are independent, so are X² and Y², and the MGF of S is the product of the MGFs of X² and Y² (note that it is the squared variables, not X and Y themselves, whose MGFs multiply). This gives an expression for the MGF of S in terms of the parameters of X and Y.

From there, we can recover the probability density function (PDF) of S by inverting the transform (the MGF evaluated at −s is the Laplace transform of the PDF of S), and then obtain the PDF of R = √S by the change of variables f_R(r) = 2r·f_S(r²). This can be a complex and lengthy process, but it ultimately gives us the distribution of R.
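For what it's worth, the MGF of a squared Gaussian is available in closed form: for X ~ N(μ, σ²), E[exp(tX²)] = exp(μ²t/(1 − 2σ²t)) / √(1 − 2σ²t), valid for t < 1/(2σ²). A quick Monte Carlo sketch of that formula (the parameter values are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, t = 1.0, 1.0, 0.1               # need t < 1/(2*sigma**2)

x = rng.normal(mu, sigma, 1_000_000)
mgf_mc = np.exp(t * x**2).mean()           # E[exp(t X^2)] by simulation

# Closed form for the MGF of a squared (noncentral) Gaussian.
mgf_exact = np.exp(mu**2 * t / (1 - 2 * sigma**2 * t)) \
            / np.sqrt(1 - 2 * sigma**2 * t)
print(mgf_mc, mgf_exact)
```

Multiplying two such factors (one per variable) gives the MGF of X² + Y² explicitly; it is the inversion step, not the MGF itself, that is hard.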

Another approach could be to use the characteristic function of S = X² + Y², which is the Fourier transform of its PDF. Again, we can use the independence of X and Y to write the characteristic function of S as a product, take the inverse Fourier transform to find the PDF of S, and then apply the same change of variables to get the PDF of R.

In either case, the resulting distribution of R may not have a well-known name: when the variances are equal it reduces to the Rician distribution you mentioned, and the general unequal-variance case is sometimes called the Beckmann distribution. Either way it is a valid distribution that describes the random variable R, and we could use it to calculate the mean, variance, and other properties of R, and in any further calculations or analyses.

Overall, finding the distribution of R would require some mathematical calculations and possibly some approximations, but it is certainly possible to determine the distribution using the properties of R and the techniques of probability and statistics.
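Even without a named distribution, the moments are accessible: E[R²] = E[X²] + E[Y²] = (μ_X² + σ_X²) + (μ_Y² + σ_Y²) holds exactly by linearity, and the mean and variance of R itself can be estimated by simulation. A sketch with arbitrary unequal parameters of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(2)
mu_x, sig_x = 1.0, 1.0
mu_y, sig_y = 2.0, 3.0                     # deliberately unequal variances
n = 1_000_000

r = np.hypot(rng.normal(mu_x, sig_x, n), rng.normal(mu_y, sig_y, n))

mean_r, var_r = r.mean(), r.var()
# E[R^2] = E[X^2] + E[Y^2] holds exactly, whatever the distribution of R.
er2_exact = (mu_x**2 + sig_x**2) + (mu_y**2 + sig_y**2)
print(mean_r, var_r, (r**2).mean(), er2_exact)
```

The identity E[R]² + Var(R) = E[R²] then pins down one moment from the other two, which is a useful consistency check on any candidate closed form.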
 

FAQ: Distribution of R from Sum of Squared RVs

What is the distribution of R if it is the sum of two independent squared random variables?

If X and Y are independent Gaussian random variables with unit variances and (possibly) nonzero means, then X² + Y² follows a noncentral chi-square distribution with 2 degrees of freedom. This is not itself a gamma distribution but a Poisson-weighted mixture of gamma distributions; in the zero-mean case it reduces to the central chi-square, which is a gamma (indeed exponential) distribution. With unequal, non-unit variances the sum of squares has no standard name.
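As a sanity check on the chi-square connection (a sketch, with my own parameter choices): for zero means and unit variances, X² + Y² is central chi-square with 2 degrees of freedom, i.e. exponential with mean 2 and variance 4.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000

q = rng.standard_normal(n)**2 + rng.standard_normal(n)**2

# Central chi-square with 2 df is Exponential(mean=2): mean 2, variance 4.
print(q.mean(), q.var())
```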

How do you calculate the mean of the distribution of R?

By linearity of expectation, the mean of a sum of squared random variables is the sum of their means: if R = X² + Y², then E(R) = E(X²) + E(Y²). For a non-central chi-square distribution with k degrees of freedom and non-centrality parameter λ, this works out to E(R) = k + λ.

Can the distribution of R be approximated by a normal distribution?

Yes. When the number of degrees of freedom (or the non-centrality parameter) is large, the non-central chi-square distribution can be approximated by a normal distribution with the same mean and standard deviation; this is a consequence of the central limit theorem. Note that the approximation improves as these parameters grow, not as R itself grows.

How is the variance of the distribution of R calculated?

The variance of the non-central chi-square distribution with k degrees of freedom and non-centrality parameter λ is Var(R) = 2(k + 2λ) = 2k + 4λ.
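The standard mean and variance formulas for the non-central chi-square, E(R) = k + λ and Var(R) = 2(k + 2λ), can be verified by simulation (a sketch; the unit-variance Gaussians and their means are my own example choices):

```python
import numpy as np

rng = np.random.default_rng(4)
a, b, n = 1.0, 2.0, 1_000_000              # X ~ N(a, 1), Y ~ N(b, 1)

q = rng.normal(a, 1.0, n)**2 + rng.normal(b, 1.0, n)**2

k, lam = 2.0, a**2 + b**2                  # degrees of freedom, noncentrality
mean_exact = k + lam                       # = 7
var_exact = 2 * (k + 2 * lam)              # = 24
print(q.mean(), mean_exact, q.var(), var_exact)
```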

What is the significance of the non-centrality parameter in the distribution of R?

The non-centrality parameter, denoted by λ, is the sum of the squared means of the underlying unit-variance Gaussians (λ = Σμᵢ²); it measures how far those means are from zero. It affects the shape and location of the non-central chi-square distribution, with larger values of λ shifting the distribution to the right and increasing its spread. It is often used in hypothesis testing and confidence interval calculations.
