How to show these random variables are independent?

  • Thread starter: psie
  • Tags: Probability theory
AI Thread Summary
The discussion focuses on proving the independence of order statistics from independent Ge(p)-distributed random variables. The exercise requires demonstrating that if X1 and X2 are independent, then X(1) and X(2) - X(1) are also independent. Participants express confusion over the lack of derived results for order statistics in discrete distributions and consider using transformations of random variables to approach the problem. The solution involves calculating specific probabilities using indicator functions and conditional expectations. The discussion emphasizes the importance of these techniques in establishing the independence of the random variables.
psie
TL;DR Summary
I am studying order statistics in An Intermediate Course in Probability by Gut. At first the author treats only continuous distributions. In a section on the joint distribution of the extreme order variables ##X_{(n)}=\max\{X_1,\ldots,X_n\}## and ##X_{(1)}=\min\{X_1,\ldots,X_n\}##, the author derives the density of the range, that is, of ##R_n=X_{(n)}-X_{(1)}##. Then there's an exercise, and I simply do not understand why it is in that section.
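For reference, the continuous-case result I'm referring to is (if I've copied the standard formula correctly, for an i.i.d. sample with density ##f## and distribution function ##F##) $$f_{R_n}(r)=n(n-1)\int_{-\infty}^{\infty}\big(F(v)-F(v-r)\big)^{n-2}f(v-r)f(v)\,dv,\qquad r>0.$$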
The exercise that appears in the text is:

Exercise 2.5 The geometric distribution is a discrete analog of the exponential distribution in the sense of lack of memory. More precisely, show that if ##X_1## and ##X_2## are independent ##\text{Ge}(p)##-distributed random variables, then ##X_{(1)}## and ##X_{(2)}-X_{(1)}## are independent.
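For concreteness, the lack-of-memory property alluded to here is the following (with the convention ##P(X=k)=p(1-p)^k## for ##k=0,1,2,\ldots##, which I believe is the one Gut uses): $$P(X\ge m+n\mid X\ge m)=\frac{P(X\ge m+n)}{P(X\ge m)}=\frac{(1-p)^{m+n}}{(1-p)^{m}}=(1-p)^{n}=P(X\ge n).$$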

What I find confusing about this exercise is that the author has, up until now, not derived any results for order statistics of discrete distributions. I know the formulas for the density of ##X_{(1)}## and of the range when the underlying distribution is continuous, but these do not apply to discrete distributions. I was thinking of going back to an earlier chapter where the author derives distributions of transformations of random variables. I could assume ##X_2## to be greater than ##X_1## and then compute the pmf of their difference, but this doesn't feel like a sensible assumption, since, after all, ##\max\{X_1,\ldots,X_n\}## is understood pointwise.

How would you go about solving this exercise?
 
This has been solved. It's a bit of work to write it all down, but basically you want to compute the probabilities ##P\left(X_{(1)}=u\right)##, ##P\left(X_{(1)}=u,\ X_{(2)}=u+d \right)## and ##P\left(X_{(2)}-X_{(1)}=d \right)##. The first two are fairly straightforward, splitting up the probability using indicator functions. The third probability is a bit trickier, and conditional expectation comes in handy (in addition to indicator functions), in particular the identity $$P(A)=E[\mathbf1_A]=E[E[\mathbf1_A\mid X]]=E[P(A\mid X)].$$
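For anyone reading this later, here is a sketch of what those computations come out to, assuming the convention ##P(X_i=k)=pq^k## for ##k=0,1,2,\ldots## with ##q=1-p## (the argument is the same, up to a shift, for the support ##\{1,2,\ldots\}##), and splitting directly into the cases ##d=0## and ##d\ge 1## instead of writing out every indicator-function step:
$$P\left(X_{(1)}=u\right)=P(X_1\ge u,X_2\ge u)-P(X_1\ge u+1,X_2\ge u+1)=q^{2u}\left(1-q^{2}\right),$$
$$P\left(X_{(1)}=u,\ X_{(2)}-X_{(1)}=d\right)=\begin{cases} p^{2}q^{2u}, & d=0,\\ 2p^{2}q^{2u+d}, & d\ge 1,\end{cases}$$
$$P\left(X_{(2)}-X_{(1)}=d\right)=\sum_{u\ge 0}P\left(X_{(1)}=u,\ X_{(2)}-X_{(1)}=d\right)=\begin{cases}\dfrac{p}{1+q}, & d=0,\\ \dfrac{2pq^{d}}{1+q}, & d\ge 1.\end{cases}$$
Since ##q^{2u}(1-q^{2})=q^{2u}(1-q)(1+q)##, multiplying the two marginals reproduces the joint probability in both cases, which is exactly the independence of ##X_{(1)}## and ##X_{(2)}-X_{(1)}##.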
 
