On the expected value of a sum of a random number of r.v.s.

  • #1
psie
205
23
TL;DR Summary
I am confused about a proof concerning the expectation of a sum of a random number of random variables
There's a theorem in An Intermediate Course in Probability by Gut which says that if ##E|X|<\infty##, then ##EX=g_X'(1)##, where ##g_X## is the probability generating function of ##X##. Now consider the r.v. ##S_N##, the sum of a random number ##N## of terms of the i.i.d. r.v.s. ##X_1,X_2,\ldots## (everything is nonnegative integer-valued, and ##N## is independent of ##X_1,X_2,\ldots##). One can derive the probability generating function of ##S_N##, namely ##g_{S_N}(t)=g_N(g_X(t))##. I am now reading a theorem that states:

Theorem If ##EN<\infty## and ##E|X|<\infty##, then ##ES_N=EN\cdot EX##.

The author proves this using the theorem I stated at the beginning, namely that ##E|X|<\infty\implies EX=g_X'(1)##. What I don't understand is why we require ##EN<\infty## and ##E|X|<\infty##. For ##ES_N## to exist via generating functions, we require ##E|S_N|<\infty##, but I don't see why that forces us to require ##EN<\infty## and ##E|X|<\infty##.

One idea that comes to mind is the following, though I'm not sure it's correct: $$E|S_N|=E(|X_1+\ldots +X_N|)\leq E(|X_1|+\ldots +|X_N|)=E (N|X_1|)=EN\, E|X_1|,$$and so we see that ##E|S_N|## is finite if ##EN## and ##E|X_1|## are finite, as required by the theorem. But I'm doubting whether ##E(|X_1|+\ldots +|X_N|)=E (N|X_1|)## is correct. Grateful for any confirmation or help.
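The identity the theorem asserts, ##ES_N=EN\cdot EX##, is easy to check numerically. Below is a small Monte Carlo sanity check; the specific distributions (##N## uniform on ##\{0,\ldots,5\}##, ##X## uniform on ##\{0,\ldots,4\}##) are illustrative choices of mine, not from the book.

```python
import random

random.seed(42)

# Monte Carlo sanity check of E[S_N] = E[N] * E[X].
# Illustrative choices (not from the book):
#   N uniform on {0,...,5}, so E[N] = 2.5
#   X uniform on {0,...,4}, so E[X] = 2.0

def sample_S_N():
    n = random.randint(0, 5)                            # draw N
    return sum(random.randint(0, 4) for _ in range(n))  # sum of N i.i.d. X's

trials = 200_000
estimate = sum(sample_S_N() for _ in range(trials)) / trials
print(estimate)  # should be close to E[N] * E[X] = 2.5 * 2.0 = 5.0
```

With 200,000 trials the standard error of the estimate is well under 0.05, so the printed value should sit very close to 5.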
 
  • #2
You can start with
##E|S_N|=\sum_{k=1}^\infty P(N=k)\, E|X_1+\ldots+X_k|,##

and now you can apply the triangle inequality to a fixed number of terms.
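The conditioning step above can be verified exactly for small finite distributions. In the sketch below (my own illustrative distributions, computed with exact rationals), ##E[X_1+\ldots+X_k]## is obtained by brute-force enumeration of all ##k##-tuples, and the weighted sum over ##k## reproduces ##EN\cdot EX##.

```python
from fractions import Fraction
from itertools import product

# Exact check of the decomposition
#   E[S_N] = sum_k P(N = k) * E[X_1 + ... + X_k]
# using small finite distributions of my own choosing (not from the thread).

p_N = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}  # E[N] = 1
p_X = {1: Fraction(1, 2), 3: Fraction(1, 2)}                     # E[X] = 2

def E_sum_of_k_terms(k):
    """E[X_1 + ... + X_k] by enumerating all k-tuples of outcomes."""
    total = Fraction(0)
    for combo in product(p_X.items(), repeat=k):
        prob = Fraction(1)
        for _, p in combo:
            prob *= p
        total += prob * sum(x for x, _ in combo)
    return total

ES_N = sum(p * E_sum_of_k_terms(k) for k, p in p_N.items())
EN = sum(k * p for k, p in p_N.items())
EX = sum(x * p for x, p in p_X.items())
print(ES_N, EN * EX)  # both equal 2
```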
 
  • #3
Office_Shredder said:
You can start with
##E|S_N|=\sum_{k=1}^\infty P(N=k)\, E|X_1+\ldots+X_k|,##

and now you can apply the triangle inequality to a fixed number of terms.
Silly question maybe, but of which variables is ##S_N## (and consequently ##|S_N|##) a function? Certainly of ##N##, but is it correct to say it is also a function of ##X_1,\ldots,X_N##?
 
  • #4
I think we should be able to write $$S_N = \sum_{j = 1}^{\infty}X_j \mathbf1_{j \leq N},$$ so ##S_N## is ##\sigma((Y_n)_{n\in\mathbb N})##-measurable, where ##Y_1=N, Y_2=X_1, Y_3=X_2, \ldots##.
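The indicator representation can be checked pathwise: for every realization, truncating the infinite sum anywhere beyond the support of ##N## gives the same value as summing the first ##N## terms directly. A minimal sketch (the cap of 10 and the distributions are my own choices):

```python
import random

random.seed(1)

# Pathwise check that the two ways of writing S_N agree:
#   sum_{j=1}^{N} X_j  ==  sum_{j>=1} X_j * 1{j <= N}.
# The infinite sum is truncated at CAP, chosen beyond N's support.
CAP = 10

for _ in range(1_000):
    n = random.randint(0, CAP)                       # a realization of N
    xs = [random.randint(0, 4) for _ in range(CAP)]  # realizations of X_1..X_CAP
    direct = sum(xs[:n])
    via_indicators = sum(x * (1 if j + 1 <= n else 0) for j, x in enumerate(xs))
    assert direct == via_indicators
print("all sample paths agree")
```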
 
