Sum of random variables and Fourier transform

In summary, the thread asks whether, for independent random variables in \mathbb{R}^n, the Fourier transform of the probability density of X_1+X_2 equals the product of the Fourier transforms of the densities of X_1 and X_2. For n=1 this follows from writing the density of X_1+X_2 as the convolution of the densities of X_1 and X_2; for n>1 it is unclear to the poster which change of variables yields the corresponding convolution representation.
  • #1
jostpuur
If [itex]X_1[/itex] and [itex]X_2[/itex] are independent random variables in [itex]\mathbb{R}^n[/itex], and [itex]\rho_{X_1}[/itex] and [itex]\rho_{X_2}[/itex] are their probability densities, then let [itex]\rho_{X_1+X_2}[/itex] be the probability density of the random variable [itex]X_1+X_2[/itex]. Is it true that

[tex]
\hat{\rho}_{X_1+X_2}(\xi) = \hat{\rho}_{X_1}(\xi)\hat{\rho}_{X_2}(\xi),
[/tex]

where [itex]\hat{\rho}[/itex] denotes the Fourier transform

[tex]
\hat{\rho}(\xi) = \int\limits_{\mathbb{R}^n} \rho(x)e^{-2\pi ix\cdot\xi} d^nx?
[/tex]

I believe it is true, but am unable to prove it when [itex]n>1[/itex]. If [itex]n=1[/itex], then I can show that

[tex]
\rho_{X_1+X_2}(x) = \int\limits_{-\infty}^{\infty}\rho_{X_1}(x-y)\rho_{X_2}(y) dy.
[/tex]

This follows from the variable change [itex](x,y) = (x_1+x_2,\,x_2)[/itex], which has Jacobian 1:

[tex]
P(0\leq X_1+X_2\leq a) = \int\limits_{0\leq x_1+x_2\leq a} dx_1\; dx_2\; \rho_{X_1}(x_1)\rho_{X_2}(x_2)
= \int\limits_0^a\Big(\int\limits_{-\infty}^{\infty} \rho_{X_1}(x-y)\rho_{X_2}(y) dy\Big) dx
[/tex]

The result [itex]\hat{\rho}_{X_1+X_2} = \hat{\rho}_{X_1}\hat{\rho}_{X_2}[/itex] is then an easy result about convolutions and Fourier transforms.
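For reference, the computation behind that last step, assuming the order of integration may be exchanged (Fubini):

[tex]
\hat{\rho}_{X_1+X_2}(\xi) = \int\limits_{-\infty}^{\infty} e^{-2\pi ix\xi}\Big(\int\limits_{-\infty}^{\infty}\rho_{X_1}(x-y)\rho_{X_2}(y)\,dy\Big)dx
= \int\limits_{-\infty}^{\infty}\rho_{X_2}(y)e^{-2\pi iy\xi}\Big(\int\limits_{-\infty}^{\infty}\rho_{X_1}(x-y)e^{-2\pi i(x-y)\xi}\,dx\Big)dy
= \hat{\rho}_{X_1}(\xi)\,\hat{\rho}_{X_2}(\xi).
[/tex]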

If [itex]n>1[/itex], then a similar approach would start from the probability

[tex]
P(X_1+X_2\in [0,a_1]\times\cdots\times [0,a_n]) = \int\limits_{\mathbb{R}^n} \Big( \int\limits_{-x_1 + [0,a_1]\times\cdots\times[0,a_n]} \rho_{X_1}(x_1) \rho_{X_2}(x_2)\, d^nx_2\Big)d^nx_1
[/tex]

but now I'm not sure what kind of variable change would work.
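As a sanity check rather than a proof, the claimed identity can be tested numerically for [itex]n=2[/itex] with empirical characteristic functions. The sketch below assumes numpy and uses arbitrarily chosen distributions and a single frequency [itex]\xi[/itex]; it only verifies the identity up to Monte Carlo error.

[code]
import numpy as np

rng = np.random.default_rng(0)
N = 200_000                      # number of independent samples

# Two independent R^2-valued random variables with different distributions.
X1 = rng.normal(loc=[1.0, -0.5], scale=[1.0, 2.0], size=(N, 2))
X2 = rng.uniform(low=-1.0, high=3.0, size=(N, 2))

def ecf(samples, xi):
    """Empirical rho-hat(xi) = E[exp(-2*pi*i * xi.X)] estimated from samples."""
    return np.mean(np.exp(-2j * np.pi * samples @ xi))

xi = np.array([0.13, -0.07])
print(ecf(X1 + X2, xi))            # the two printed values agree
print(ecf(X1, xi) * ecf(X2, xi))   # up to Monte Carlo error
[/code]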
 

FAQ: Sum of random variables and Fourier transform

What is the sum of random variables?

The sum of random variables is itself a random variable: for each outcome, its value is obtained by adding the values taken by the individual variables. It is used to describe the combined effect of two or more random quantities.

How is the sum of random variables related to Fourier transform?

For independent random variables, the probability density of the sum is the convolution of the individual densities. Since the Fourier transform turns convolution into multiplication, the transform of the density of the sum is the product of the transforms of the individual densities, and the density of the sum can be recovered by taking the inverse Fourier transform of that product.
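As a minimal one-dimensional illustration of this (the grid and the two example densities below are arbitrary choices, and the Fourier transforms are approximated by Riemann sums rather than computed exactly):

[code]
import numpy as np

x = np.linspace(-30, 30, 6001)
dx = x[1] - x[0]

rho1 = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)     # standard normal density
rho2 = np.where((x >= 0) & (x < 2), 0.5, 0.0)     # uniform density on [0, 2)

def ft(rho, xi):
    """rho-hat(xi) = integral of rho(x) exp(-2*pi*i*x*xi) dx, as a Riemann sum."""
    return np.sum(rho * np.exp(-2j * np.pi * x * xi)) * dx

# Density of X1 + X2: the convolution of the two densities on the same grid.
rho_sum = np.convolve(rho1, rho2, mode="same") * dx

xi = 0.2
print(ft(rho_sum, xi))               # transform of the density of the sum ...
print(ft(rho1, xi) * ft(rho2, xi))   # ... matches the product of the transforms
[/code]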

Can the sum of random variables be used to model real-world scenarios?

Yes, the sum of random variables is commonly used in statistics and probability to model real-world scenarios. For example, it can be used to model the combined effect of multiple independent factors on a particular outcome.

How does the central limit theorem relate to the sum of random variables?

The central limit theorem states that the suitably normalized sum of a large number of independent, identically distributed random variables with finite variance tends towards a normal distribution. This is useful in statistics because it lets us approximate the distribution of such a sum even when the distribution of each individual variable is not known in detail.
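A small numerical illustration of this (the choice of uniform summands and the sample sizes below are arbitrary): standardizing the sum of 30 independent Uniform(0, 1) variables already gives quantiles close to those of a standard normal.

[code]
import numpy as np

rng = np.random.default_rng(0)
k, N = 30, 100_000

# N realizations of the sum of k independent Uniform(0, 1) variables.
sums = rng.uniform(size=(N, k)).sum(axis=1)

# Standardize using the exact mean (1/2) and variance (1/12) of each summand.
z = (sums - k * 0.5) / np.sqrt(k / 12)

# Empirical quantiles are close to the standard normal values -1.96, 0, 1.96.
print(np.quantile(z, [0.025, 0.5, 0.975]))
[/code]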

Can the sum of random variables be used to calculate probabilities?

Yes. Once the distribution of the sum is known, for example via the convolution of the individual densities, probabilities of events involving the sum can be computed from it, together with quantities such as its expected value and variance.
