Need help simplifying a summation with binomials

In summary: the product of the two Poisson probability-generating functions simplifies via the exponential series. Each factor reduces as ##\sum_{n=0}^\infty s^n e^{-\lambda} \frac{\lambda^n}{n!} = e^{\lambda(s-1)}##, so the product is ##e^{(\lambda+\mu)(s-1)} = \sum_{n=0}^\infty s^n e^{-(\lambda+\mu)} \frac{(\lambda+\mu)^n}{n!}##, which is the generating function of a Poisson distribution with parameter ##\lambda+\mu##.
  • #1
Eclair_de_XII

Homework Statement


"Prove that ##\sum_{n=0}^\infty s^n e^{-\lambda} \frac{\lambda^n}{n!}\sum_{m=0}^\infty s^m e^{-\mu}\frac{\mu^m}{m!}=\sum_{m+n=0}^\infty s^{n+m} e^{-(\lambda+\mu)} \frac{(\lambda + \mu)^{m+n}}{(m+n)}!##

Homework Equations


Binomial theorem: ##(x+y)^n=\sum_{k=0}^n x^ky^{n-k}##
Vandermonde's identity: ##\binom {n+m} m =\sum_{k=0}^m \binom n k \binom m {m-k}##

The Attempt at a Solution


##\sum_{n=0}^\infty s^n e^{-\lambda} \frac{\lambda^n}{n!}\sum_{m=0}^\infty s^m e^{-\mu}\frac{\mu^m}{m!}=e^{-(\lambda+\mu)}\sum_{n=0}^\infty \sum_{m=0}^\infty s^{n+m} \frac{\lambda^n \mu^m }{m!n!}=e^{-(\lambda+\mu)}\sum_{n=0}^\infty \sum_{m=0}^\infty s^{n+m} \binom {m+n}{n} \frac{\lambda^n \mu^m }{(m+n)!}##
##=e^{-(\lambda+\mu)}\sum_{n=0}^\infty \sum_{m=0}^\infty s^{n+m} \sum_{k=0}^n \binom m k \binom n k \frac{\lambda^n \mu^m }{(m+n)!}##

I'm afraid to go any further, because it won't get me my ##(\lambda + \mu)^{n+m}## term. If anyone has any pointers on what I should do next with this expression (or can give me another expression), or an alternate way to prove, using probability-generating functions, that the sum of two independent Poisson r.v.'s has mean equal to the sum of the individual means, that would be much appreciated.
 
  • #2
Binomial formula?
 
  • #3
Eclair_de_XII said:

Binomial theorem: ##(x+y)^n=\sum_{k=0}^n x^ky^{n-k}##

Does your formula ##(x+y)^n=\sum_{k=0}^n x^ky^{n-k}## work for ##n = 2## or ##n = 3?##
 
  • #4
Eclair_de_XII said:
or an alternate way to prove, using probability-generating functions, that the sum of two independent Poisson r.v.'s has mean equal to the sum of the individual means, that would be much appreciated.

Are you allowed to use the result that the moment generating function of a Poisson distribution with parameter ##\lambda## is ##M(t) = e^{\lambda (e^t -1)}##?
 
  • #5
Stephen Tashi said:
Are you allowed to use the result that the moment generating function of a Poisson distribution with parameter ##\lambda## is ##M(t) = e^{\lambda (e^t -1)}##?

That should be ##e^{\lambda (t-1)}.##
 
  • #7
Stephen Tashi said:

Sorry, no, you are correct. I meant the generating function of the probability mass function (i.e., the probability-generating function), while you meant the moment-generating function of the random variable. Of course, they are different. (Your terminology "generating function of a Poisson distribution" threw me: I have seen it used both ways in different sources.)

See, eg., https://web.ma.utexas.edu/users/gordanz/notes/lecture5.pdf
 
  • #8
It wasn't clear to me what ##s## is in the original post, though I now believe we're classically talking about ##s \in (-1,1)## -- though ##s \in (0,1)## is really what is of interest -- as the original post appears to already be using an ordinary generating function (OGF). Hence the identity to be proven follows from stochastic independence:

##\text{left hand side} = E\big[s^{X_1}\big]E\big[s^{X_2}\big] = E\big[s^{X_1}s^{X_2}\big] = E\big[s^{X_1 + X_2}\big] = \text{right hand side}##

OP just needs to confirm that ##g(X) = s^{X}## is a random variable and that applying it doesn't change the independence structure (the fact that generating functions are in principle invertible implies this).
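A quick numerical sanity check of this identity (a sketch with assumed parameter values, not part of the derivation):

```python
# Monte Carlo check (assumed values) that E[s^X1] E[s^X2] = E[s^(X1+X2)]
# for independent X1 ~ Poiss(lam), X2 ~ Poiss(mu) and s in (0, 1).
import numpy as np

rng = np.random.default_rng(0)
lam, mu, s, N = 1.3, 2.1, 0.6, 1_000_000  # hypothetical parameters

x1 = rng.poisson(lam, N)
x2 = rng.poisson(mu, N)

lhs = (s ** x1).mean() * (s ** x2).mean()   # E[s^X1] E[s^X2]
rhs = (s ** (x1 + x2)).mean()               # E[s^(X1+X2)]
exact = np.exp((lam + mu) * (s - 1))        # PGF of Poiss(lam + mu) at s

print(lhs, rhs, exact)  # all three should agree to Monte Carlo accuracy
```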

- - - -
Equivalently, OP's question (posed via OGFs) amounts to showing that the convolution of two Poissons with parameters ##\lambda## and ##\mu## is a Poisson with parameter ##\lambda + \mu##. There's a very elegant and probabilistic argument for this that uses memorylessness and the fact that there must be some constant ##\alpha \gt 0## where ##\mu \cdot t = (\alpha \lambda) \cdot t = \lambda \cdot (\alpha t)## ...
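For reference, the direct convolution computation is also short once the binomial theorem (with the binomial coefficient) is applied -- a sketch, assuming ##X \sim \text{Poiss}(\lambda)## and ##Y \sim \text{Poiss}(\mu)## are independent:

##P(X+Y=k)=\sum_{j=0}^k e^{-\lambda}\frac{\lambda^j}{j!}\, e^{-\mu}\frac{\mu^{k-j}}{(k-j)!}=\frac{e^{-(\lambda+\mu)}}{k!}\sum_{j=0}^k \binom{k}{j}\lambda^j \mu^{k-j}=e^{-(\lambda+\mu)}\frac{(\lambda+\mu)^k}{k!}##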

- - - -
a less probabilistic take would be to consider properties of the exponential function and simplify. E.g. for starters

##\sum_{n=0}^\infty s^n e^{-\lambda} \frac{\lambda^n}{n!} = e^{-\lambda} \big(\sum_{n=0}^\infty \frac{(s\lambda)^n}{n!}\big)=e^{-\lambda}\big(e^{s\lambda}\big) = e^{-\lambda + s\lambda}##

and apply this process to other parts of the original equation, then simplify.
 
  • #9
Ray Vickson said:
Does your formula ##(x+y)^n=\sum_{k=0}^n x^ky^{n-k}## work for ##n = 2## or ##n = 3##?

Oops, it should be ##(x+y)^n=\sum_{k=0}^n \binom n k x^ky^{n-k}##.
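For reference, with the binomial coefficient restored, the original double-sum route also closes by grouping terms by the total ##k = m+n## (a sketch, not strictly needed given the generating-function argument below):

##e^{-(\lambda+\mu)}\sum_{n=0}^\infty \sum_{m=0}^\infty s^{n+m} \frac{\lambda^n \mu^m}{n!m!}=e^{-(\lambda+\mu)}\sum_{k=0}^\infty \frac{s^k}{k!}\sum_{n=0}^{k} \binom{k}{n}\lambda^n \mu^{k-n}=e^{-(\lambda+\mu)}\sum_{k=0}^\infty s^k \frac{(\lambda+\mu)^k}{k!}##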

StoneTemplePython said:
a less probabilistic take would be to consider properties of the exponential function and simplify. E.g. for starters

##\sum_{n=0}^\infty s^n e^{-\lambda} \frac{\lambda^n}{n!} = e^{-\lambda} \big(\sum_{n=0}^\infty \frac{(s\lambda)^n}{n!}\big)=e^{-\lambda}\big(e^{s\lambda}\big) = e^{-\lambda + s\lambda}##

and apply this process to other parts of the original equation, then simplify.

Oh, so ##G_{X+Y}(s)=G_X(s)G_Y(s)=(e^{-\lambda + s\lambda})(e^{-\mu + s\mu})=e^{(\lambda+\mu)(s-1)}=\sum_{n=0}^\infty s^n e^{-(\lambda+\mu)} \frac{(\lambda+\mu)^n}{n!}## implies that ##X+Y## has a distribution ##\text{Poiss}(\lambda+\mu)##?
 
  • #10
Eclair_de_XII said:
Oh, so ##G_{X+Y}(s)=G_X(s)G_Y(s)=(e^{-\lambda + s\lambda})(e^{-\mu + s\mu})=e^{(\lambda+\mu)(s-1)}=\sum_{n=0}^\infty s^n e^{-(\lambda+\mu)} \frac{(\lambda+\mu)^n}{n!}## implies that ##X+Y## has a distribution ##\text{Poiss}(\lambda+\mu)##?

Yes. That's really all there is to it from the OGF standpoint. Since an OGF uses a power series in ##s## and the Poisson uses the power series for the exponential function, it should be an easy result.
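A short numerical illustration of this conclusion (a sketch using scipy, with assumed values for ##\lambda## and ##\mu##):

```python
# Check (assumed lam, mu) that the product of the two Poisson PGFs equals
# the PGF of Poiss(lam + mu), and that the pmf of X + Y (by convolution)
# matches the Poiss(lam + mu) pmf.
import numpy as np
from scipy.stats import poisson

lam, mu, s = 1.3, 2.1, 0.6  # hypothetical parameters

# PGF check at a single point s
print(np.isclose(np.exp(lam * (s - 1)) * np.exp(mu * (s - 1)),
                 np.exp((lam + mu) * (s - 1))))           # True

# pmf check: convolution of the two pmfs vs. the Poiss(lam + mu) pmf
k = np.arange(60)
pmf_sum = np.convolve(poisson.pmf(k, lam), poisson.pmf(k, mu))[:60]
print(np.allclose(pmf_sum, poisson.pmf(k, lam + mu)))     # True
```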
 
  • #11
Thank you, everyone.
 

FAQ: Need help simplifying a summation with binomials

What is a summation with binomials?

A summation with binomials is an expression that involves adding together a series of terms that contain binomial coefficients. These coefficients represent the number of ways that a certain number of objects can be chosen from a larger set.

How do I simplify a summation with binomials?

To simplify a summation with binomials, you can use the binomial theorem or Pascal's triangle to expand the binomial coefficients into simpler terms. Then, you can combine like terms and use algebraic rules to simplify the expression further.

What are some common mistakes when simplifying a summation with binomials?

Some common mistakes when simplifying a summation with binomials include forgetting to distribute the exponent to each term, incorrectly expanding the binomial coefficients, and making errors when combining like terms. It is important to carefully check each step and use algebraic rules correctly.

Can you provide an example of simplifying a summation with binomials?

Yes. For example, for the summation ##\sum_{k=0}^n \binom{n}{k}##, setting ##x = y = 1## in the binomial theorem gives ##\sum_{k=0}^n \binom{n}{k} = (1+1)^n = 2^n##.
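As a quick check of that example (a minimal sketch):

```python
# Verify that sum_{k=0}^{n} C(n, k) equals 2**n for a few small n.
import math

for n in range(8):
    assert sum(math.comb(n, k) for k in range(n + 1)) == 2 ** n
print("ok")
```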

Why is it important to learn how to simplify a summation with binomials?

Learning how to simplify a summation with binomials is important because it allows you to solve more complex math problems involving combinations and permutations. It also helps you develop your algebraic skills and understand the properties of binomial coefficients.
