jostpuur
If [itex]X_1[/itex] and [itex]X_2[/itex] are independent random variables in [itex]\mathbb{R}^n[/itex], and [itex]\rho_{X_1}[/itex] and [itex]\rho_{X_2}[/itex] are their probability densities, then let [itex]\rho_{X_1+X_2}[/itex] be the probability density of the random variable [itex]X_1+X_2[/itex]. Is it true that
[tex]
\hat{\rho}_{X_1+X_2}(\xi) = \hat{\rho}_{X_1}(\xi)\hat{\rho}_{X_2}(\xi),
[/tex]
when [itex]\hat{\rho}[/itex] is the Fourier transform
[tex]
\hat{\rho}(\xi) = \int\limits_{\mathbb{R}^n} \rho(x)e^{-2\pi ix\cdot\xi} d^nx?
[/tex]
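A quick Monte Carlo sanity check (just a numpy sketch for [itex]n=2[/itex]; the Gaussian and uniform distributions and the helper ft_estimate below are arbitrary choices for illustration) is consistent with the identity:
[code]
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Two independent random vectors in R^2 (arbitrary example distributions).
X1 = rng.normal(loc=[1.0, -0.5], scale=[1.0, 2.0], size=(N, 2))
X2 = rng.uniform(low=-1.0, high=3.0, size=(N, 2))

def ft_estimate(samples, xi):
    # Monte Carlo estimate of rho_hat(xi) = E[exp(-2*pi*i X.xi)].
    return np.mean(np.exp(-2j * np.pi * samples @ xi))

xi = np.array([0.3, -0.7])
lhs = ft_estimate(X1 + X2, xi)                   # rho_hat_{X1+X2}(xi)
rhs = ft_estimate(X1, xi) * ft_estimate(X2, xi)  # rho_hat_{X1}(xi) * rho_hat_{X2}(xi)
print(lhs, rhs)  # the two agree up to Monte Carlo error
[/code]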
I believe it is true, but am unable to prove it when [itex]n>1[/itex]. If [itex]n=1[/itex], then I can show that
[tex]
\rho_{X_1+X_2}(x) = \int\limits_{-\infty}^{\infty}\rho_{X_1}(x-y)\rho_{X_2}(y) dy.
[/tex]
This follows from the change of variables [itex]x = x_1 + x_2[/itex], [itex]y = x_2[/itex], which has Jacobian 1:
[tex]
P(0\leq X_1+X_2\leq a) = \int\limits_{0\leq x_1+x_2\leq a} dx_1\; dx_2\; \rho_{X_1}(x_1)\rho_{X_2}(x_2)
= \int\limits_0^a\Big(\int\limits_{-\infty}^{\infty} \rho_{X_1}(x-y)\rho_{X_2}(y) dy\Big) dx
[/tex]
The result [itex]\hat{\rho}_{X_1+X_2} = \hat{\rho}_{X_1}\hat{\rho}_{X_2}[/itex] then follows from the standard fact that the Fourier transform turns convolution into multiplication.
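For completeness, here is that step written out (assuming the order of integration may be interchanged):
[tex]
\hat{\rho}_{X_1+X_2}(\xi) = \int\limits_{-\infty}^{\infty}\Big(\int\limits_{-\infty}^{\infty}\rho_{X_1}(x-y)\rho_{X_2}(y)\, dy\Big)e^{-2\pi ix\xi}\, dx
= \int\limits_{-\infty}^{\infty}\rho_{X_2}(y)e^{-2\pi iy\xi}\Big(\int\limits_{-\infty}^{\infty}\rho_{X_1}(x-y)e^{-2\pi i(x-y)\xi}\, dx\Big) dy
= \hat{\rho}_{X_1}(\xi)\hat{\rho}_{X_2}(\xi).
[/tex]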
If [itex]n>1[/itex], then a similar approach would start from the probability
[tex]
P(X_1+X_2\in [0,a_1]\times\cdots\times [0,a_n]) = \int\limits_{\mathbb{R}^n} \Big( \int\limits_{-x_1 + [0,a_1]\times\cdots\times [0,a_n]} \rho_{X_1}(x_1) \rho_{X_2}(x_2)\, d^nx_2\Big)d^nx_1
[/tex]
but now I'm not sure what kind of variable change would work.
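A guess, which I have not checked in detail: the same substitution [itex]x = x_1 + x_2[/itex], [itex]y = x_2[/itex], applied componentwise, again has Jacobian 1, so perhaps
[tex]
P(X_1+X_2\in [0,a_1]\times\cdots\times [0,a_n]) = \int\limits_{[0,a_1]\times\cdots\times [0,a_n]}\Big(\int\limits_{\mathbb{R}^n}\rho_{X_1}(x-y)\rho_{X_2}(y)\, d^ny\Big) d^nx,
[/tex]
which would give [itex]\rho_{X_1+X_2}(x) = \int_{\mathbb{R}^n}\rho_{X_1}(x-y)\rho_{X_2}(y)\, d^ny[/itex] and then the same convolution argument as in one dimension, but I would appreciate confirmation that this step is legitimate.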