Finding the Probability Density Function for the Sum of Two Random Variables

In summary, the conversation discusses finding the probability density function of a random variable Z, the sum of two other random variables X and Y. X is uniformly distributed on a circle of radius a, while Y is a constant. The original poster tries taking the Fourier transforms of X and Y and multiplying them, but it doesn't seem to work. They also mention that plotting Z in MATLAB makes it look like an upside-down Gaussian distribution. The conversation ends with confusion about Z being complex instead of real.
  • #1
jmckennon
Hi,

I've been working on this problem, but I feel like I'm overcomplicating it. Suppose X = a*e^(j*phi) is a random variable, where phi is uniform on the interval [0, 2*pi) and a is some constant, and Y = b is another random variable, where b is a constant. I'm looking to find the probability density function of the random variable Z = X + Y.

This is probably really simple, but from what I've been trying to do, I should be able to take the Fourier transform of the density of X and the Fourier transform of the density of Y, multiply them, and then take the inverse Fourier transform of the product, but it doesn't seem to work. How can I do this?
 
  • #2
You haven't defined j. If I can assume you mean i (sqrt(-1)), then X is a complex random variable uniformly distributed on the circle of radius a centered at 0. Z is then uniformly distributed on the circle of radius a centered at b.
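A quick numerical check of this claim (a sketch in Python rather than the thread's MATLAB; the values of a, b, and M are arbitrary choices):

```python
import cmath
import math
import random

random.seed(0)
a, b, M = 2.0, 5.0, 100_000

# Draw M samples of Z = b + a*exp(j*phi), with phi ~ Uniform[0, 2*pi).
zs = [b + a * cmath.exp(1j * random.uniform(0.0, 2.0 * math.pi)) for _ in range(M)]

# Every sample satisfies |Z - b| = a (up to floating-point error) ...
max_err = max(abs(abs(z - b) - a) for z in zs)

# ... and the angle of Z - b, mapped to [0, 2*pi), is uniform, so its
# sample mean should be close to pi.
angles = [cmath.phase(z - b) % (2.0 * math.pi) for z in zs]
mean_angle = sum(angles) / M
```

So Z carries no new randomness beyond phi: it is the same uniform angle on a circle, just translated by b.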
 
  • #3
Yes, I apologize, j is sqrt(-1). After defining phi = rand(1,M).*2*pi in MATLAB, where M = 1000, I plotted Z = b + a.*exp(j.*phi) for various values of a and b, and it looked kind of like an upside Gaussian distribution centered about pi. Is this right?
 
  • #4
*upside-down Gaussian distribution
 
  • #5
I'm confused about what you did, since Z is complex, not real.
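One plausible reading (an assumption, not stated in the thread) is that the MATLAB plot was effectively a histogram of the real part of Z. Re(Z) = b + a*cos(phi) follows the arcsine law, whose density 1/(pi*sqrt(a^2 - (x-b)^2)) dips in the middle and spikes at the endpoints b - a and b + a, which matches the "upside-down Gaussian" shape described above. A sketch in Python:

```python
import math
import random

random.seed(0)
a, b, M = 1.0, 0.0, 200_000

# Samples of Re(Z) = b + a*cos(phi), with phi ~ Uniform[0, 2*pi).
xs = [b + a * math.cos(random.uniform(0.0, 2.0 * math.pi)) for _ in range(M)]

# Arcsine-law check: P(|Re(Z) - b| <= a/2) = (2/pi)*asin(1/2) = 1/3,
# much less than the ~0.38 a Gaussian of matching variance would give,
# because the arcsine density piles mass near the endpoints.
p_hat = sum(1 for x in xs if abs(x - b) <= a / 2) / M
p_exact = (2.0 / math.pi) * math.asin(0.5)  # exactly 1/3
```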
 

FAQ: Finding the Probability Density Function for the Sum of Two Random Variables

What is the definition of a "sum of two random variables"?

The sum of two random variables is a new random variable that represents the combined result of two individual random variables. It is calculated by adding the values of the two random variables together.

How do you calculate the mean of a sum of two random variables?

The mean of a sum of two random variables is equal to the sum of the individual means of the two random variables. This can be represented mathematically as E(X+Y) = E(X) + E(Y).
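A minimal numeric illustration of this identity (hypothetical variables, in Python); note that linearity holds even when Y depends on X:

```python
import random

random.seed(0)
M = 50_000

xs = [random.uniform(0.0, 1.0) for _ in range(M)]
ys = [2.0 * x + random.uniform(0.0, 1.0) for x in xs]  # Y deliberately depends on X

mean_x = sum(xs) / M
mean_y = sum(ys) / M
mean_sum = sum(x + y for x, y in zip(xs, ys)) / M

# E(X+Y) = E(X) + E(Y) holds sample-by-sample, dependence or not.
```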

Can the sum of two random variables be negative?

Yes, the sum of two random variables can be negative if one or both of the individual random variables have negative values. This is because the sum of two random variables is just another random variable, and it can take on any value within its range.

What is the difference between a sum of two independent random variables and a sum of two dependent random variables?

In a sum of two independent random variables, the two individual random variables do not influence each other, and their outcomes are completely independent. In a sum of two dependent random variables, the individual variables influence each other, and their outcomes are correlated in some way.
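One place the distinction shows up is the variance of the sum: Var(X+Y) = Var(X) + Var(Y) + 2*Cov(X,Y), and the covariance term vanishes only when X and Y are uncorrelated (in particular, when independent). A sketch with hypothetical variables in Python:

```python
import random

random.seed(1)
M = 200_000

def var(vs):
    # Population variance of a list of samples.
    m = sum(vs) / len(vs)
    return sum((v - m) ** 2 for v in vs) / len(vs)

xs = [random.gauss(0.0, 1.0) for _ in range(M)]
ys_indep = [random.gauss(0.0, 1.0) for _ in range(M)]

# Independent case: Var(X+Y) = Var(X) + Var(Y) = 2.
v_indep = var([x + y for x, y in zip(xs, ys_indep)])

# Fully dependent case (Y = X): Var(X+X) = 4*Var(X) = 4, not 2.
v_dep = var([2.0 * x for x in xs])
```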

Can the sum of two random variables be used to model real-world phenomena?

Yes, the sum of two random variables is a common way to model the combined effect of multiple variables on an outcome in real-world phenomena. For example, the sum of two dice rolls models a player's move in a board game such as Monopoly.
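For the dice example, the distribution of the sum is the discrete convolution of the two individual distributions, the same multiply-transforms/convolve-densities idea raised in the thread, in its simplest discrete form. A short Python sketch:

```python
from fractions import Fraction

# PMF of one fair die.
die = {k: Fraction(1, 6) for k in range(1, 7)}

# Discrete convolution: PMF of the sum of two independent dice.
pmf_sum = {}
for i, pi in die.items():
    for j, pj in die.items():
        pmf_sum[i + j] = pmf_sum.get(i + j, Fraction(0)) + pi * pj

# pmf_sum now maps each total 2..12 to its exact probability,
# peaking at 7 with probability 6/36 = 1/6.
```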
