Jointly continuous random variables

In summary: we are given a joint density function for two random variables, X and Y, and are asked to calculate the probability that their sum is less than 1. The probability is set up as a double integral of the density over the region where x + y < 1. The attempted solution used 0 to infinity as the limits of the outer integral; they should be 0 to 1, and with that correction the correct answer follows.
  • #1
DotKite

Homework Statement



Let X and Y be random losses with joint density function

##f(x,y) = e^{-(x+y)}## for ##x > 0## and ##y > 0##, and 0 elsewhere

An insurance policy is written to reimburse X + Y.
Calculate the probability that the reimbursement is less than 1.

Homework Equations



We have not yet learned about independence for jointly continuous random variables.



The Attempt at a Solution



p(X + Y < 1) = p(Y < 1 - X) = ##\int_{0}^{\infty}\int_{0}^{1-x} e^{-(x+y)} dydx##

When I go through solving this double integral I get the following

##-e^{-x} - xe^{-1}## evaluated from 0 to ∞.

However as x → ∞ the above function diverges. Maybe I calculated the integral wrong, I have done it over and over, and cannot seem to find where it could be wrong.
 
  • #2
Never mind, I figured out my mistake. The limits of the outer (x) integral should not be 0 to infinity; they should be 0 to 1, since the inner region 0 < y < 1 - x is empty once x ≥ 1.
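As a numerical sanity check (a Python sketch, not part of the original thread): with the corrected limits, the double integral evaluates in closed form to ##1 - 2e^{-1} \approx 0.264##. A Monte Carlo estimate reproduces this; the sampling below exploits the fact that this particular density factors as ##e^{-x} e^{-y}##, purely as a sampling convenience, so each coordinate can be drawn as an Exp(1) variate.

```python
import math
import random

# Exact value of the corrected integral:
# P(X + Y < 1) = ∫_0^1 ∫_0^{1-x} e^{-(x+y)} dy dx = 1 - 2/e
exact = 1 - 2 / math.e

# Monte Carlo check: draw each coordinate as an Exp(1) variate
# (random.expovariate takes the rate parameter lambda).
random.seed(0)
n = 200_000
hits = sum(1 for _ in range(n)
           if random.expovariate(1) + random.expovariate(1) < 1)

print(f"exact = {exact:.4f}, Monte Carlo estimate = {hits / n:.4f}")
```

With 200,000 samples the Monte Carlo standard error is about 0.001, so the estimate should agree with the exact value to roughly two decimal places.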
 
  • #3
DotKite said:
Never mind, I figured out my mistake. The limits of the outer (x) integral should not be 0 to infinity; they should be 0 to 1, since the inner region 0 < y < 1 - x is empty once x ≥ 1.

That was not your only mistake: you need to re-do the inner integral
[tex] \int_0^{1-x} e^{-(x+y)} \, dy = e^{-x} \int_0^{1-x} e^{-y} \, dy. [/tex]

Edit: Oh... maybe you wrote the indefinite x-integral of the inner y-integral; in that case, you are correct.
 

FAQ: Jointly continuous random variables

What is the definition of jointly continuous random variables?

Two or more random variables are jointly continuous if they admit a joint probability density function: a nonnegative function whose integral over a region of the common sample space gives the probability that the variables fall in that region.

How are jointly continuous random variables related to each other?

Their relationship is encoded in the joint density: quantities derived from it, such as conditional densities and covariances, describe how knowledge of one variable changes the distribution of the other(s). The variables may or may not be independent.

What is the difference between jointly continuous random variables and independent random variables?

Joint continuity by itself says nothing about dependence: jointly continuous random variables may be independent or dependent. They are independent exactly when the joint density factors as the product of the marginal densities, f(x,y) = f_X(x)f_Y(y); otherwise they are dependent, and the value of one variable carries information about the other.

Can jointly continuous random variables have different probability distributions?

Yes. Joint continuity only requires a common joint density; it does not force the marginal distributions to be identical, so each variable can have a different marginal distribution. In the problem above both marginals happen to be Exp(1), but that need not be the case.

How are the joint probability density function and the marginal probability density functions related for jointly continuous random variables?

The marginal probability density functions are obtained by integrating the joint density over the other variable(s), e.g. f_X(x) = ∫ f(x,y) dy. The reverse direction, multiplying the marginals to recover the joint density, is valid only when the variables are independent; in general the joint density carries dependence information that the marginals alone do not.
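This can be illustrated with the density from the thread (a Python sketch; the truncation of the infinite y-range at 50, where the tail is negligible, is an assumption for the numerics). Integrating out y recovers the marginal f_X(x) = e^{-x}, and since f(x,y) = e^{-x}e^{-y} = f_X(x)f_Y(y), this particular pair happens to be independent:

```python
import math

def marginal_x(x, y_max=50.0, n=100_000):
    """Approximate f_X(x) = ∫_0^∞ e^{-(x+y)} dy by the midpoint rule,
    truncating the range at y_max (the tail beyond is ~e^{-x-50})."""
    h = y_max / n
    return sum(math.exp(-(x + (k + 0.5) * h)) for k in range(n)) * h

# For this density the marginal is Exp(1): f_X(x) = e^{-x},
# and the joint density factors as f_X(x) * f_Y(y).
for x in (0.2, 1.0, 2.0):
    print(f"x={x}: marginal ≈ {marginal_x(x):.6f}, e^-x = {math.exp(-x):.6f}")
```

A density where the exponent mixes the variables, such as one proportional to e^{-(x+y+xy)}, would not factor this way, and multiplying its marginals would not reproduce the joint density.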
