Ant64
Hi all,
The question is: Given that X follows a geometric distribution, prove that
P(X>j+k | X>k) = P(X>j).
So this is all I have:
P(X>k+j) = (1-p)^(k+j), (a result proved in the text)
= (1-p)^k • (1-p)^j.
Now, using the definition of conditional probability and the fact that the event {X>k+j} is contained in {X>k},
P(X>k+j | X>k) = P(X>k+j) / P(X>k) = (1-p)^k • (1-p)^j / (1-p)^k = (1-p)^j = P(X>j).
But I don't know if I'm allowed to separate (1-p)^(k+j) into the product (1-p)^k • (1-p)^j like that... Thanks in advance!
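As a sanity check (not a proof), here is a short Monte Carlo sketch that compares the conditional frequency P(X>j+k | X>k) against P(X>j) by simulation. It assumes the convention where X counts Bernoulli(p) trials up to and including the first success, so P(X>k) = (1-p)^k; the function names and parameter values are illustrative, not from the text.

```python
import random


def sample_geometric(p, rng):
    """Number of Bernoulli(p) trials up to and including the first success."""
    x = 1
    while rng.random() >= p:  # each failure occurs with probability 1-p
        x += 1
    return x


def check_memoryless(p=0.3, j=2, k=3, trials=200_000, seed=42):
    """Estimate P(X>j) and P(X>j+k | X>k) from the same sample,
    and return both alongside the exact value (1-p)^j."""
    rng = random.Random(seed)
    xs = [sample_geometric(p, rng) for _ in range(trials)]
    p_gt_j = sum(x > j for x in xs) / trials          # P(X > j)
    n_gt_k = sum(x > k for x in xs)
    p_cond = sum(x > j + k for x in xs) / n_gt_k      # P(X > j+k | X > k)
    return p_gt_j, p_cond, (1 - p) ** j
```

With these parameters both estimates should land close to the exact tail (1-p)^j = 0.7^2 = 0.49, illustrating that conditioning on X>k leaves the tail distribution unchanged.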