#1 — MrFancy
I'm trying to solve a problem as part of my research and it's giving me fits. It seems like it should be simple, but I can't wrap my brain around how to do it. The problem is:
Suppose X ~ N(0, s), and Y is a random variable with a probability mass point at 0 that is otherwise uniformly distributed on (0, t], so that:
P(Y = 0) = k
f(y) = (1 - k)(1/t), 0 < y ≤ t
f(y) = 0 otherwise
What is
Pr(Y < A | X + Y > B)
where A and B are arbitrary constants?
I think I've calculated the convolution of X and Y, but I'm not sure how to get the density from there (and I'm not sure I have the convolution right either). Thanks for any help you can provide.
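One way to sanity-check any closed-form answer is to split the probability over the two parts of Y's mixed distribution (the point mass at 0 and the uniform part) and compare a numerical evaluation against a simulation. The sketch below does this in Python; the parameter values k, t, s, A, B are arbitrary illustrations, and it assumes s in N(0, s) denotes the standard deviation (the question leaves this ambiguous):

```python
import numpy as np
from scipy import stats, integrate

# Illustrative parameter values (not from the original question);
# s is interpreted as the standard deviation of X ~ N(0, s).
k, t, s = 0.3, 2.0, 1.0
A, B = 1.0, 0.5

# Numerator: P(Y < A, X + Y > B), decomposed over Y's mixed distribution.
# Point-mass part: Y = 0 with probability k (contributes only when A > 0).
num = k * (A > 0) * stats.norm.sf(B, scale=s)
# Continuous part: density (1 - k)/t on (0, t]; integrate P(X > B - y)
# over y in (0, min(A, t)).
upper = min(A, t)
if upper > 0:
    cont, _ = integrate.quad(lambda y: stats.norm.sf(B - y, scale=s), 0, upper)
    num += (1 - k) / t * cont

# Denominator: P(X + Y > B), same decomposition over the full range (0, t].
den = k * stats.norm.sf(B, scale=s)
cont, _ = integrate.quad(lambda y: stats.norm.sf(B - y, scale=s), 0, t)
den += (1 - k) / t * cont

answer = num / den
print("P(Y < A | X + Y > B) =", answer)

# Monte Carlo sanity check: draw Y from the mixture, X from the normal,
# condition on X + Y > B, and take the empirical frequency of Y < A.
rng = np.random.default_rng(0)
n = 1_000_000
y = np.where(rng.random(n) < k, 0.0, rng.uniform(0, t, n))
x = rng.normal(0, s, n)
mask = x + y > B
print("simulation:", np.mean(y[mask] < A))
```

The point of the decomposition is that the conditional probability never needs the convolution density itself, only P(X > B - y) weighted by the distribution of Y, which handles the point mass and the continuous part separately.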