How to show these random variables are independent?

  • #1
psie
TL;DR Summary
I am studying order statistics in An Intermediate Course in Probability by Gut. The author first treats only continuous distributions. In a section on the joint distribution of the extreme order variables ##X_{(n)}=\max\{X_1,\ldots,X_n\}## and ##X_{(1)}=\min\{X_1,\ldots,X_n\}##, he derives the density of the range ##R_n=X_{(n)}-X_{(1)}##. Then comes an exercise whose place in that section I simply do not understand.
The exercise that appears in the text is:

Exercise 2.5 The geometric distribution is a discrete analog of the exponential distribution in the sense of lack of memory. More precisely, show that if ##X_1## and ##X_2## are independent ##\text{Ge}(p)##-distributed random variables, then ##X_{(1)}## and ##X_{(2)}-X_{(1)}## are independent.
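
(Here lack of memory means ##P(X\ge m+n \mid X\ge m)=P(X\ge n)## for all ##m,n\ge 0##; with Gut's convention that ##\text{Ge}(p)## is supported on ##\{0,1,2,\ldots\}## with pmf ##P(X=k)=p(1-p)^k##, this follows from ##P(X\ge k)=(1-p)^k##.)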

What I find confusing about this exercise is that, up until now, the author has not derived any results for order statistics of discrete distributions. I know the formulas for the density of ##X_{(1)}## and of the range when the underlying distribution is continuous, but these do not apply to discrete distributions. I was thinking of going back to an earlier chapter where the author derives distributions of transformations of random variables: I could assume ##X_2## to be greater than ##X_1## and then compute the pmf of their difference. But this doesn't feel like a sensible assumption, since after all ##\max\{X_1,\ldots,X_n\}## is understood pointwise.

How would you go about solving this exercise?
 
  • #2
This has been solved. It's a bit of work writing it all down, but the idea is to compute the probabilities ##P\left(X_{(1)}=u\right)##, ##P\left(X_{(1)}=u, X_{(2)}=u+d \right)## and ##P\left(X_{(2)}-X_{(1)}=d \right)##, and then check that the joint pmf factors into the product of the two marginals. The first two are fairly straightforward, splitting up the probability using indicator functions. The third probability is a bit trickier; conditional expectation comes in handy (in addition to indicator functions), in particular the identity $$P(A)=E[\mathbf1_A]=E[E[\mathbf1_A\mid X]]=E[P(A\mid X)].$$
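
Since the computation isn't spelled out here, below is a brute-force numerical sanity check of the factorization (not the book's derivation). It assumes Gut's convention that ##\text{Ge}(p)## has pmf ##P(X=k)=p(1-p)^k## on ##\{0,1,2,\ldots\}##; the parameter value ##p=0.3## and the truncation point ##N## are arbitrary choices made for this sketch.

```python
# Numerical check that X_(1) and X_(2) - X_(1) are independent for
# X1, X2 i.i.d. Ge(p), assuming Ge(p) has pmf p*(1-p)**k on {0, 1, 2, ...}
# (Gut's convention). p and the truncation point N are arbitrary choices.

p = 0.3
q = 1 - p
N = 200  # truncate the support; q**N is negligible for this check

def pmf(k):
    """pmf of Ge(p) at k, for k = 0, 1, 2, ..."""
    return p * q**k

# Joint pmf of (X_(1), X_(2) - X_(1)), computed directly from the
# joint pmf of (X1, X2).
joint = {}
for x1 in range(N):
    for x2 in range(N):
        u, d = min(x1, x2), abs(x1 - x2)
        joint[(u, d)] = joint.get((u, d), 0.0) + pmf(x1) * pmf(x2)

# Marginal pmfs of X_(1) and of D = X_(2) - X_(1)
min_pmf, diff_pmf = {}, {}
for (u, d), prob in joint.items():
    min_pmf[u] = min_pmf.get(u, 0.0) + prob
    diff_pmf[d] = diff_pmf.get(d, 0.0) + prob

# Independence <=> the joint pmf factors into the product of the marginals
for (u, d), prob in joint.items():
    if u < 20 and d < 20:  # stay away from the truncation boundary
        assert abs(prob - min_pmf[u] * diff_pmf[d]) < 1e-12

print("joint pmf factorizes: X_(1) and X_(2) - X_(1) are independent")
```

If you carry the exact computation through, you should find ##P(X_{(1)}=u)=(1-q^2)q^{2u}## (so ##X_{(1)}\sim\text{Ge}(1-q^2)##), and for the difference ##P(X_{(2)}-X_{(1)}=0)=p/(1+q)## and ##P(X_{(2)}-X_{(1)}=d)=2pq^d/(1+q)## for ##d\ge 1##, where ##q=1-p##; the joint pmf is exactly the product of these two.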
 
