Characteristic function (Probability)

In summary, the characteristic function of the sum of two independent random variables is the product of their characteristic functions.
  • #1
Zaare
How can I show that if [tex]\phi(t)[/tex] is a characteristic function for some distribution, then [tex]|\phi(t)|^2[/tex] is also a characteristic function?
 
  • #2
Maybe you could directly work out the distribution for it?

Do you have some sort of theorem on what functions are characteristic functions?

(My text on this stuff is at work. :frown:)
 
  • #3
It is given that [tex] \phi_{X} (t) = E(e^{itX}) [/tex], and I know that [tex] E(e^{itX}) = \int_{-\infty}^{\infty}{e^{itx}f_X (x) dx} [/tex]
[tex]f_X (x)[/tex] is the probability density function.
I'm trying to find the distribution for it as you said, but I haven't succeeded yet.
 
  • #4
Right, here are the useful results Google gave me in the first hit.

Let Y be another r.v. with the same distribution as -X; then its characteristic function ought to be the complex conjugate of phi, I think. At least that seems plausible.

Given two random variables X and Y, the characteristic function of their sum is the product of their characteristic functions.

That much is for sure, though I'm not sure about the first bit - you should try the integration.
 
  • #5
matt grime said:
Let Y be another r.v. with the same distribution as -X; then its characteristic function ought to be the complex conjugate of phi, I think. At least that seems plausible.
I think I can show this, but how does that help me?

matt grime said:
Given two random variables X and Y, the characteristic function of their sum is the product of their characteristic functions.
X and Y have to be independent for this to be true, right? And I have to show that this is true for X and X, even though X and X are not independent.
 
  • #6
Yes, so what's the problem with having Y independent? I didn't say it was -X; I said its distribution was that of -X.
 
  • #7
Ok, if Y has the same distribution as -X and X and Y are independent, I can show this:
[tex]
\left| {\phi _X \left( t \right)} \right|^2 = \phi _X \left( t \right) \times \overline {\phi _X \left( t \right)} = \phi _X \left( t \right) \times \phi _Y \left( t \right) = \phi _{X + Y} \left( t \right)
[/tex]
But is it enough? I don't know how to interpret this.
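
(Aside: the identity can be sanity-checked numerically. Below is a minimal Python sketch; the distribution of X, the `char_fn` helper, and the test values of t are arbitrary choices for illustration, not anything from the thread.)

```python
import numpy as np

# An arbitrary example distribution for X on the values 0, 1, 2.
vals = np.array([0, 1, 2])
probs = np.array([0.5, 0.3, 0.2])

def char_fn(values, weights, t):
    """phi(t) = E[e^{itX}] for a discrete random variable."""
    return np.sum(weights * np.exp(1j * t * values))

for t in [0.0, 0.5, 1.3, -2.7]:
    phi_X = char_fn(vals, probs, t)
    # Y has the distribution of -X, independent of X.
    phi_Y = char_fn(-vals, probs, t)
    # Distribution of X + Y: all pairwise sums with product probabilities.
    sum_vals = np.add.outer(vals, -vals).ravel()
    sum_probs = np.outer(probs, probs).ravel()
    phi_sum = char_fn(sum_vals, sum_probs, t)
    assert np.isclose(phi_Y, np.conj(phi_X))     # phi_Y = conj(phi_X)
    assert np.isclose(abs(phi_X) ** 2, phi_sum)  # |phi_X|^2 = phi_{X+Y}
```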
 
  • #8
By the way, this shows your first statement, right?

[tex]
\left. \begin{array}{l}
\phi _Y \left( t \right) = E\left[ {e^{itY} } \right] = E\left[ {e^{ - itX} } \right] = E\left[ {\cos \left( {tX} \right) - i\sin \left( {tX} \right)} \right] = E\left[ {\cos \left( {tX} \right)} \right] - iE\left[ {\sin \left( {tX} \right)} \right] \\
\phi _X \left( t \right) = E\left[ {e^{itX} } \right] = E\left[ {\cos \left( {tX} \right) + i\sin \left( {tX} \right)} \right] = E\left[ {\cos \left( {tX} \right)} \right] + iE\left[ {\sin \left( {tX} \right)} \right] \\
\end{array} \right\} \Rightarrow \underline{\underline {\phi _Y \left( t \right) = \overline {\phi _X \left( t \right)} }}
[/tex]

matt grime said:
Let Y be another r.v. with the same distribution as -X; then its characteristic function ought to be the complex conjugate of phi, I think. At least that seems plausible.
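
(Aside: the conjugation identity can also be checked for a continuous distribution by doing the integral numerically. A small sketch; the Exp(1) density is just an arbitrary example with the known closed form phi(t) = 1/(1 - it), and the grid and helper are illustrative choices.)

```python
import numpy as np

# Grid for numerical integration of E[e^{itX}] with X ~ Exp(1).
x = np.linspace(0.0, 50.0, 500_001)
pdf = np.exp(-x)  # f_X(x) = e^{-x} on [0, inf)

def char_fn(t):
    integrand = np.exp(1j * t * x) * pdf
    # Trapezoidal rule by hand.
    return np.sum((integrand[:-1] + integrand[1:]) / 2) * (x[1] - x[0])

t = 0.8
phi_X = char_fn(t)      # closed form: 1 / (1 - i t)
phi_negX = char_fn(-t)  # char. fn of -X at t equals phi_X(-t)
assert np.isclose(phi_X, 1 / (1 - 1j * t), atol=1e-6)
assert np.isclose(phi_negX, np.conj(phi_X), atol=1e-6)
```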
 
  • #9
Yeah, that's it, though I was thinking of going for a change of variable in the integral, that was all.

If X is an r.v., so is Y, and so is X + Y... that's sufficient.
 
  • #10
Thank you for all the help. :)
 

FAQ: Characteristic function (Probability)

1. What is a characteristic function in probability?

A characteristic function is a mathematical function that describes the probability distribution of a random variable. It is defined as phi(t) = E[e^{itX}], the expected value of the complex exponential of the random variable. In simpler terms, it is a function that uniquely identifies a probability distribution.
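
As a concrete illustration (a fair coin flip, chosen here purely as an example): if X takes the values 0 and 1 with probability 1/2 each, then phi_X(t) = (1 + e^{it})/2. A minimal sketch:

```python
import numpy as np

def phi_coin(t):
    """Characteristic function of X with P(X=0) = P(X=1) = 1/2."""
    return 0.5 * (1 + np.exp(1j * t))

# E[e^{itX}] computed directly from the definition agrees:
t = 1.7
direct = 0.5 * np.exp(1j * t * 0) + 0.5 * np.exp(1j * t * 1)
assert np.isclose(phi_coin(t), direct)
assert np.isclose(phi_coin(0.0), 1.0)  # phi(0) = 1 for every distribution
```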

2. How is a characteristic function related to a probability density function?

A characteristic function and a probability density function (PDF) are closely related: the characteristic function is the Fourier transform of the PDF (up to sign conventions). This means the characteristic function contains all the information about the distribution; in particular, when the moments exist, the mean, variance, and higher moments can be recovered from its derivatives at zero.
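
For instance, the standard normal density transforms to the characteristic function e^{-t^2/2}, which a numerical integration confirms. A minimal sketch (grid and helper names are illustrative choices):

```python
import numpy as np

x = np.linspace(-12.0, 12.0, 240_001)
pdf = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard normal density

def char_fn(t):
    integrand = np.exp(1j * t * x) * pdf
    # Trapezoidal rule by hand.
    return np.sum((integrand[:-1] + integrand[1:]) / 2) * (x[1] - x[0])

for t in [0.0, 0.5, 2.0]:
    assert np.isclose(char_fn(t), np.exp(-t**2 / 2), atol=1e-6)
```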

3. What is the advantage of using characteristic functions in probability?

One advantage of using characteristic functions is that they can be used to find the probability distribution of a sum of independent random variables. This is known as the characteristic function of a sum property, and it makes calculations involving multiple random variables much easier.
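
A quick numerical check of this sum property, using two arbitrary independent discrete distributions (all values and probabilities below are made up for illustration):

```python
import numpy as np

def char_fn(values, weights, t):
    """phi(t) = E[e^{itX}] for a discrete random variable."""
    return np.sum(weights * np.exp(1j * t * np.asarray(values)))

# Two arbitrary independent discrete random variables.
xv, xp = np.array([0, 1, 3]), np.array([0.2, 0.5, 0.3])
yv, yp = np.array([-1, 2]), np.array([0.6, 0.4])

# Distribution of X + Y under independence: pairwise sums,
# product probabilities.
sv = np.add.outer(xv, yv).ravel()
sp = np.outer(xp, yp).ravel()

for t in [0.3, 1.1, -2.0]:
    assert np.isclose(char_fn(sv, sp, t),
                      char_fn(xv, xp, t) * char_fn(yv, yp, t))
```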

4. Can a characteristic function be used to identify the type of probability distribution?

Yes, the shape of a characteristic function can help identify the type of probability distribution. For example, a Gaussian distribution has a Gaussian-shaped characteristic function, exp(i*mu*t - sigma^2*t^2/2), while the uniform distribution on [-a, a] has the oscillating, sinc-type characteristic function sin(at)/(at).
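
For example, the sin(t)/t form for the uniform distribution on [-1, 1] can be confirmed by numerical integration. A minimal sketch (grid and helper names are illustrative choices):

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 200_001)
pdf = np.full_like(x, 0.5)  # density of Uniform(-1, 1)

def char_fn(t):
    integrand = np.exp(1j * t * x) * pdf
    # Trapezoidal rule by hand.
    return np.sum((integrand[:-1] + integrand[1:]) / 2) * (x[1] - x[0])

for t in [0.4, 1.0, 3.0]:
    assert np.isclose(char_fn(t), np.sin(t) / t, atol=1e-8)
```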

5. Are there any limitations of using characteristic functions in probability?

A common misconception is that characteristic functions may fail to exist; in fact, every probability distribution has a characteristic function (unlike the moment generating function, which may not). The Cauchy distribution, for example, has no moment generating function and no finite moments, yet its characteristic function is simply exp(-|t|). A genuine limitation is that the characteristic function may not have a simple closed form, and computing or inverting it can be computationally intensive for complex distributions.
