Probability: What are p.d.f.'s of x+y and x/y?

In summary, the thread discusses the joint probability density function of random variables X and Y, given by an exponential density when both x and y are greater than 0. The p.d.f.'s of X + Y and X/Y are derived using Jacobian transformations and marginalization, and it is found that the expectation of X/Y does not exist.
  • #1
sanctifier

Homework Statement



The probability density function (p.d.f.) of the joint distribution of random variables X and Y is given as

[itex] f(x,y) = \begin{cases} e^{-(x + y)} \;\; when \;\; x > 0 \\ 0 \;\; \;\;\;\;\;\;\;\;\;\;otherwise \end{cases} [/itex]

Question 1: What are the p.d.f.'s of X + Y and X/Y ?

Question 2: Does the expectation of X/Y exist ?

Homework Equations



Nothing special.

The Attempt at a Solution



Answer 1:

[itex] \begin{cases} u = x \\ v = x + y \end{cases} [/itex]

[itex] \begin{cases} x = u \\ y = v - u \end{cases} [/itex]

[itex] Jacobian = \begin{bmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ -1 & 1 \end{bmatrix} [/itex]

[itex] g(u,v)=f(u,v-u)|Jacobian|= e^{-v} [/itex]

[itex] h(x+y)=h(v) = \int_0^v g(u,v)du = e^{-v} u |_{u=0}^{u=v} = ve^{-v} [/itex]
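As a sanity check (not part of the original solution), the derived density [itex]ve^{-v}[/itex] can be verified by Monte Carlo: for independent Exp(1) variables X and Y, the CDF of V = X + Y is [itex]1 - e^{-v}(1+v)[/itex], and the simulated proportion should match it. This is a minimal sketch; the function names are illustrative.

```python
import math
import random

random.seed(0)

def empirical_cdf_sum(v, n_samples=200_000):
    """Fraction of simulated X + Y values (X, Y ~ Exp(1), independent) that are <= v."""
    hits = sum(
        1 for _ in range(n_samples)
        if random.expovariate(1.0) + random.expovariate(1.0) <= v
    )
    return hits / n_samples

def exact_cdf_sum(v):
    """CDF corresponding to the density v*e^{-v}: integral of t*e^{-t} from 0 to v."""
    return 1.0 - math.exp(-v) * (1.0 + v)
```

With 200,000 samples the empirical and exact CDF values agree to within about a percentage point at any fixed v.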

[itex]\begin{cases}z = x\\ w = \frac{x}{y} \end{cases} [/itex]

[itex]\begin{cases} x = z\\ y = \frac{z}{w} \end{cases} [/itex]

[itex]Jacobian2 = \begin{bmatrix} \frac{\partial x}{\partial z} & \frac{\partial x}{\partial w} \\ \frac{\partial y}{\partial z} & \frac{\partial y}{\partial w} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ \frac{1}{w} & -\frac{z}{w^{2}} \end{bmatrix}[/itex]

[itex]g2(z,w)=f(z, \frac{z}{w} )|Jacobian2|= e^{-z- \frac{z}{w} } \frac{z}{ w^{2} } [/itex]

[itex]h2(w)= \int_0^ \infty g2(z,w)dz = \int_0^ \infty e^{-z- \frac{z}{w} } \frac{z}{ w^{2} } dz = -\int_0^ \infty \frac{z}{ w^{2} } {(1+ \frac{1}{w} )}^{-1} d e^{-z(1+ \frac{1}{w})} = 0 + \int_0^\infty \frac{e^{-z(1+ \frac{1}{w})}}{ w^{2} \frac{1}{w}} dz = - \frac{1}{ {w+1}^{2} } e^{-z(1+ \frac{1}{w})} |_{z=0}^{z= \infty }= \frac{1}{{w+1}^{2}} [/itex]

Answer 2:

[itex]E( \frac{x}{y} )=E(w)= \int_0^\infty \frac{1}{{w+1}^{2}} dw = -\frac{1}{w+1}|_{w=0}^{w=\infty} = 1[/itex]

Are the two answers correct? Thank you in advance.
 
  • #2
There is something wrong with the question as stated: your joint density ##f(x,y)## blows up as ##y \to -\infty##, and does not have a finite integral over ##y \in (-\infty,\infty)##. You restrict x but not y.

You never define for us what is meant by ##h(w)## and ##h2(w)##, so we end up having to guess; that is a good way to lose marks on an assignment.

Also, when you write
[tex] \frac{1}{w+1^2}[/tex]
you are writing
[tex] \frac{1}{w+1}[/tex]
If you really mean
[tex] \frac{1}{(w+1)^2}[/tex]
then use parentheses. BTW: that last form is the correct density of ##X/Y## at ##w \geq 0##.

Finally, the integral you need for ##E(X/Y)## is not the integral you wrote.
 
  • #3
Thank you very much for your reply, Ray.

The p.d.f. should be

[itex] f(x,y) = \begin{cases} e^{-(x + y)} \;\; when \;\; x > 0, \; y > 0\\ 0 \;\; \;\;\;\;\;\;\;\;\;\;otherwise \end{cases} [/itex]

I forgot to write the part [itex]y > 0[/itex], my mistake.

[itex] h(v)=h(x+y) [/itex] actually is the marginal p.d.f. of [itex] v [/itex]

Similarly, [itex] h2(w) =h2( \frac{x}{y} ) [/itex] is the marginal p.d.f. of [itex] w [/itex]

The derivation of h2(w) should be

[itex]h2(w)= \int_0^ \infty g2(z,w)dz = \int_0^ \infty e^{-z- \frac{z}{w} } \frac{z}{ w^{2} } dz = -\int_0^ \infty \frac{z}{ w^{2} } {(1+ \frac{1}{w} )}^{-1} d e^{-z(1+ \frac{1}{w})} = 0 + \int_0^\infty \frac{e^{-z(1+ \frac{1}{w})}}{ w^{2}(1+ \frac{1}{w})} dz = - \frac{1}{ {(w+1)}^{2} } e^{-z(1+ \frac{1}{w})} |_{z=0}^{z= \infty }= \frac{1}{{(w+1)}^{2}} [/itex]
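The corrected density [itex]\frac{1}{(w+1)^2}[/itex] can likewise be checked by simulation (a sketch, with illustrative names): for independent Exp(1) variables, the CDF of W = X/Y is [itex]\frac{w}{1+w}[/itex], which the empirical fraction should reproduce.

```python
import random

random.seed(1)

def empirical_cdf_ratio(w, n_samples=200_000):
    """Fraction of simulated X / Y values (X, Y ~ Exp(1), independent) that are <= w."""
    hits = sum(
        1 for _ in range(n_samples)
        if random.expovariate(1.0) / random.expovariate(1.0) <= w
    )
    return hits / n_samples

def exact_cdf_ratio(w):
    """Integral of 1/(1+t)^2 from 0 to w, i.e. w/(1+w)."""
    return w / (1.0 + w)
```

For example, the density is symmetric about w = 1 in the sense that P(X/Y <= 1) = 1/2, which the simulation confirms.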

Yes, as you said, I lost the parentheses when writing [itex] \frac{1}{{(w+1)}^{2}} [/itex]

The last integral is wrong, it should be

[itex] E( \frac{x}{y} )=E(w)=\int_0^ \infty w \frac{1}{{(w+1)}^{2}} dw = \int_0^ \infty (\frac{w+1}{{(w+1)}^{2}} - \frac{1}{{(w+1)}^{2}} )dw = \int_0^ \infty \frac{1}{w+1} dw - \int_0^ \infty \frac{1}{{(w+1)}^{2}} dw = \ln(w+1)|_{w=0}^{w= \infty } + \frac{1}{w+1}|_{w=0}^{w= \infty } = \infty [/itex]

Hence, [itex] E( \frac{x}{y} ) [/itex] doesn’t exist.
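The divergence can also be seen from the truncated integral [itex]\int_0^M \frac{w}{(1+w)^2} dw = \ln(1+M) + \frac{1}{1+M} - 1[/itex], which grows without bound as M increases. The sketch below (function names are illustrative) compares a midpoint-rule quadrature against this closed form and shows the truncated mean keeps growing.

```python
import math

def truncated_mean(M, n_steps=100_000):
    """Midpoint-rule approximation of the truncated expectation: integral of w/(1+w)^2 over [0, M]."""
    h = M / n_steps
    total = 0.0
    for i in range(n_steps):
        w = (i + 0.5) * h
        total += w / (1.0 + w) ** 2
    return total * h

def truncated_mean_exact(M):
    """Closed form of the truncated expectation: ln(1+M) + 1/(1+M) - 1."""
    return math.log(1.0 + M) + 1.0 / (1.0 + M) - 1.0
```

Because the closed form grows like ln(M), no truncation level stabilizes the mean, which is exactly what "the expectation does not exist" means here.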
 
  • #4
You are correct now.
 
  • #5
Thank you again, Ray.
 

FAQ: Probability: What are p.d.f.'s of x+y and x/y?

1. What is a probability density function (p.d.f.)?

A probability density function (p.d.f.) is a mathematical function that describes the probability of a random variable falling within a particular range of values. It is used to model continuous probability distributions.

2. What is the difference between p.d.f.'s of x+y and x/y?

The p.d.f. of x+y describes the distribution of the sum of two random variables, while the p.d.f. of x/y describes the distribution of their quotient. In other words, one characterizes the total value of the two variables, the other their ratio.

3. How are p.d.f.'s of x+y and x/y calculated?

If X and Y are independent, the p.d.f. of X + Y is the convolution of the individual p.d.f.'s: [itex]f_{X+Y}(v) = \int f_X(u) f_Y(v-u) du[/itex]. The p.d.f. of the ratio X/Y is [itex]f_{X/Y}(w) = \int |y| f_X(wy) f_Y(y) dy[/itex]. More generally, both are obtained from the joint density by a change of variables (with the Jacobian) followed by marginalization, as in the thread above.
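For the Exp(1) case discussed in this thread, the convolution formula can be checked numerically: [itex]\int_0^v e^{-u} e^{-(v-u)} du = ve^{-v}[/itex]. This is a sketch with illustrative names, assuming independent Exp(1) marginals.

```python
import math

def conv_density(v, n_steps=10_000):
    """Midpoint-rule evaluation of the convolution integral of two Exp(1) densities at v."""
    h = v / n_steps
    total = 0.0
    for i in range(n_steps):
        u = (i + 0.5) * h
        # integrand e^{-u} * e^{-(v-u)} = e^{-v} is constant here, so the sum is exact
        total += math.exp(-u) * math.exp(-(v - u))
    return total * h
```

The result matches the density [itex]ve^{-v}[/itex] derived earlier via the Jacobian transformation.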

4. What does the shape of a p.d.f. tell us about the probability distribution?

The shape of a p.d.f. can tell us a lot about the probability distribution of a random variable. For example, a symmetrical p.d.f. with a single peak indicates that the random variable is likely to take on values near the peak with decreasing probability as we move away from the peak. A skewed p.d.f. with a longer tail on one side indicates that the probability of the random variable taking on higher values is higher than the probability of it taking on lower values.

5. What is the relationship between p.d.f.'s and cumulative distribution functions (c.d.f.'s)?

The p.d.f. and c.d.f. are two different ways of representing the same probability distribution. The p.d.f. gives the density (relative likelihood) of the random variable near a specific value, while the c.d.f. gives the probability of the random variable being less than or equal to a given value. The c.d.f. is obtained by integrating the p.d.f., and conversely the p.d.f. is the derivative of the c.d.f.
