Proving Convergence of a Sequence from a Convergent Series

In summary: if the infinite series a_1 + a_2 + ... + a_v converges to a value A and s_n = a_1 + a_2 + a_3 + ... + a_n, then the sequence of averages (s_1 + s_2 + ... + s_N)/N also converges to A, and the difference between the average and A can be made smaller than any given multiple of epsilon by taking N large enough.
  • #1
JG89

Homework Statement



Prove that if the infinite series a_1 + a_2 + ... + a_v converges to a value A and s_n = a_1 + a_2 + a_3 + ... + a_n, then the sequence:

(s_1 + s_2 + ... + s_N)/N also converges, and has the limit A.
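For concreteness, a quick numeric sketch of the claim (my own example, not part of the problem): for the geometric series a_n = (1/2)^(n-1) the sum is A = 2, and both the partial sums s_N and the averages (s_1 + ... + s_N)/N should approach 2, though the average converges more slowly.

```python
# Numeric illustration (example series chosen for convenience):
# a_n = (1/2)^(n-1) sums to A = 2. Compare the partial sum s_N with
# the average (s_1 + ... + s_N)/N; both should tend to A.
A = 2.0
N = 10000
partial = 0.0        # s_n
running_total = 0.0  # s_1 + s_2 + ... + s_n
for n in range(1, N + 1):
    partial += 0.5 ** (n - 1)
    running_total += partial
cesaro = running_total / N

print(abs(partial - A))  # essentially 0
print(abs(cesaro - A))   # small, but only about 2/N
```

The slower decay of the averaged error (roughly 2/N here) is exactly why the proof below needs to handle the early partial sums separately from the late ones.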


Homework Equations





The Attempt at a Solution



Since s_n represents the n'th partial sum, then
s_1 + s_2 + s_3 + ... + s_N = a_1 + (a_1 + a_2) + (a_1 + a_2 + a_3) + ... + (a_1 + ... + a_N)

=

N*a_1 + (N-1)*a_2 + (N-2)*a_3 + ... + a_N

So our sequence looks like


(N*a_1 + (N-1)*a_2 + (N-2)*a_3 + ... + a_N)/N

Notice that the terms in front of the a_i form a monotonic decreasing sequence which converges to 0: N/N = 1, (N-1)/N, (N-2)/N, ... , 1/N

So now I use Abel's Test: "Let a_1 + a_2 + ... be an infinite series whose partial sums are bounded independent of n. Let p_1, p_2, ... be a sequence of positive numbers decreasing monotonically to the value 0, then the infinite series p_1*a_1 + p_2*a_2 + ... converges"

Since my original series converges, its partial sums are bounded, and the coefficients in front of the a_i form a monotonically decreasing sequence going to 0, just as I said, so the infinite series converges...

I'm just having trouble proving that it converges to the value A. I have no idea where to start.
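As a sanity check on the rearrangement s_1 + ... + s_N = N*a_1 + (N-1)*a_2 + ... + a_N, here is a short numeric sketch (the sample terms are my own, chosen arbitrarily):

```python
# Check the identity s_1 + s_2 + ... + s_N = N*a_1 + (N-1)*a_2 + ... + 1*a_N
# on an arbitrary finite list of terms (illustration only).
a = [3.0, -1.5, 0.25, 7.0, 2.0]   # sample terms a_1 .. a_N
N = len(a)

# Left side: sum of the partial sums s_1, ..., s_N.
partial_sums = []
s = 0.0
for term in a:
    s += term
    partial_sums.append(s)
lhs = sum(partial_sums)

# Right side: a_i appears in every partial sum from s_i onward,
# i.e. (N - i + 1) times for i = 1..N (0-based: a[0] gets weight N).
rhs = sum((N - i) * a[i] for i in range(N))

print(lhs, rhs)  # the two sides agree
```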
 
  • #2
(N*a_1 + (N-1)*a_2 + (N-2)*a_3 + ... + a_N)/N

Notice that the terms in front of the a_i form a monotonic decreasing sequence which converges to 0: N/N = 1, (N-1)/N, (N-2)/N, ... , 1/N

So now I use Abel's Test: "Let a_1 + a_2 + ... be an infinite series whose partial sums are bounded independent of n. Let p_1, p_2, ... be a sequence of positive numbers decreasing monotonically to the value 0, then the infinite series p_1*a_1 + p_2*a_2 + ... converges"

That's not right. You don't have a fixed [itex]p_i[/itex] for each [itex]a_i[/itex]; instead the coefficients change with N, and for a fixed i they are actually increasing. Maybe you can make it work with some fiddling, but right now you don't actually have an infinite series being multiplied by a fixed decreasing sequence of numbers.

You might have better luck noticing that

[tex]\lim_{N \rightarrow \infty} \left( A - \sum_{n=1}^{N}a_n \right) = 0[/tex] and then try to compare that to [tex]A - \frac{s_1 + s_2 + ... + s_N}{N}[/tex]
 
  • #3
I just can't see where to go with that...

I've tried to bound [tex] A - \frac{s_1 + s_2 + ... + s_N}{N} [/tex] above by
[tex]
A - \sum_{n=1}^{N}a_n
[/tex] but have had no luck. Thinking about it more, whether it is bounded above by that probably depends on whether the series has both positive and negative terms, and how many of each, so it's better for me to abandon that idea...
 
  • #4
You know the s_i->A. So for every epsilon there is an M such that |s_i-A|<epsilon for all i>M, right? Let B=sum(s_i for i=1 to M). Look at S_N=(B+s_{M+1}+...+s_N)/N (which is just your original sum). Can you show that for N sufficiently large that |S_N-A|<2*epsilon?
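A numeric sketch of this splitting (my own illustration, using the geometric series a_n = (1/2)^(n-1) with A = 2 as a stand-in for the abstract series):

```python
# Illustrate the hint: fix epsilon, find M with |s_i - A| < eps for all
# i > M, let B = s_1 + ... + s_M, and look at S_N = (B + s_{M+1} + ... + s_N)/N.
# Example series (chosen for convenience): a_n = (1/2)^(n-1), so A = 2.
A = 2.0
eps = 0.01

s_vals = []
s = 0.0
for n in range(1, 100001):
    s += 0.5 ** (n - 1)
    s_vals.append(s)

# First index at which the partial sums come (and stay) within eps of A.
M = next(i for i, si in enumerate(s_vals, start=1) if abs(si - A) < eps)
B = sum(s_vals[:M])   # the "out of control" head; fixed once eps is fixed

N = len(s_vals)
S_N = (B + sum(s_vals[M:])) / N   # the original average, split as head + tail

print(abs(S_N - A) < 2 * eps)     # within 2*epsilon for N this large
```

Here B/N is tiny because B is a fixed number divided by a huge N, and the tail average sits within eps of (N-M)/N * A, which is nearly A.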
 
  • #5
I'll get back to this question tomorrow. I've been thinking about it the last two days, I need a break.
 
  • #6
I've been thinking about it, and this is what I have. For every positive epsilon there exists a positive integer M such that |s_i - A| < epsilon for all i > M. Now, for each i, let epsilon_i be an epsilon just barely larger than |s_i - A| (I know there is no smallest such epsilon, but take it close enough that epsilon_i - |s_i - A| = 0.000000001).

Now [tex] \frac{|s_1 - A| + |s_2 - A| + ... + |s_N - A|}{N} \ge \frac{|s_1 - A + s_2 - A + ... + s_N - A|}{N} = \frac{|s_1 + ... + s_N - NA|}{N} = | \frac{s_1 + ... + s_N}{N} - A| [/tex].

We also know that |s_1 - A| < epsilon_1, ..., |s_N - A| < epsilon_N, so, [tex] \frac{|s_1 - A| + |s_2 - A| + ... + |s_N - A|}{N} < \frac{ \epsilon_1 + \epsilon_2 + ... + \epsilon_N}{N} [/tex] and so we have:

[tex] | \frac{s_1 + ... + s_N}{N} - A| < \frac{ \epsilon_1 + \epsilon_2 + ... + \epsilon_N}{N} [/tex].

That's all I could come up with...
 
  • #7
That is a try, but not a very good one. Look back at my hint in post 4. Don't try to control the value of s_i for i<=M. They are completely out of control. They could be anything. But their sum is B. And you do know |s_i-A|<epsilon for i>M. Control each part separately. Hint: (N-M)/N goes to 1 as N->infinity and B/N goes to zero.
 
  • #8
It seems to me like you're saying that M is a fixed number. Surely M has to increase to infinity as epsilon gets smaller and smaller?
 
  • #9
Absolutely right. But first fix an epsilon and prove the difference between your sum and A can be made less than some multiple of epsilon. Sure, M will depend on epsilon in the end. But it gets really confusing if you take all of the limits at the same time. Do them one by one.
 
  • #10
Okay, here is what I've got:

Let epsilon be a fixed positive value, and let M be a positive integer such that |s_i - A| < epsilon whenever i > M.

[tex] | \frac{B + s_{M+1} + ... + s_N - (N-M)A}{N} | = | \frac{B + (s_{M+1} - A) + (s_{M+2} - A) + ... + (s_N - A)}{N} | \le \frac{|B| + |s_{M+1} - A| + |s_{M+2} - A| + ... + |s_N - A|}{N} [/tex].

Remembering that |s_{M+1} - A| < epsilon, |s_{M+2} - A| < epsilon, ... , |s_N - A| < epsilon, we now have:

[tex] \frac{|B| + |s_{M+1} - A| + |s_{M+2} - A| + ... + |s_N - A|}{N} < \frac{|B| + \epsilon + ... + \epsilon}{N} = \frac{|B|}{N} + \frac{(N-M) \epsilon}{N} [/tex].

Now, (N-M)/N = 1 - M/N < 1 => (N-M)(epsilon)/N < epsilon. Also, B is just the finite sum of the s_i up to i = M (remember that M is fixed right now), and so |B/N - 0| = |B/N| < epsilon*
for all positive epsilon*, provided N is large enough.

Now we have: [tex] \frac{|B| + \epsilon + ... + \epsilon}{N} = \frac{|B|}{N} + \frac{(N-M) \epsilon}{N} < \epsilon^* + \epsilon \le \max(2 \epsilon^*, 2 \epsilon) [/tex]

Implying that:

[tex] | \frac{B + s_{M+1} + ... + s_N - (N-M)A}{N} | = | \frac{B + s_{M+1} + ... + s_N}{N} - (1 - \frac{M}{N} ) A| < \max(2 \epsilon^*, 2 \epsilon) [/tex].

Since 1 - M/N goes to 1 as N goes to infinity, (1 - M/N)A goes to A, and so we can always bring my original average within the (fixed) 2*epsilon distance of a value that is approaching A. Moreover, we can do this for any sized 2*epsilon we please, provided we fix M appropriately and let N increase beyond all bounds. So the limit of the original average equals the limit of (N-M)A/N = (1 - M/N)A, which is A.
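To double-check the conclusion on a series with both positive and negative terms (the worry raised earlier in the thread), here is a small numeric sketch using the alternating harmonic series, whose sum is A = ln 2 (my own example):

```python
import math

# The alternating harmonic series 1 - 1/2 + 1/3 - ... sums to A = ln 2.
# The averaged partial sums (s_1 + ... + s_N)/N should converge to the
# same value even though the terms alternate in sign.
A = math.log(2)
N = 200000
s = 0.0
total = 0.0
for n in range(1, N + 1):
    s += (-1) ** (n + 1) / n   # s_n
    total += s
cesaro = total / N

print(abs(cesaro - A))  # small
```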
 
  • #11
You don't want the max of two epsilons there. The idea is to pick N so large that |B/N| < epsilon; since the other term is < epsilon, your whole sum is then less than 2*epsilon. Since epsilon is arbitrary, you are all done, right?
 
  • #12
Finally...that was pretty difficult...

Just a question: is it really valid that I argued my average is always within an epsilon distance of (1 - M/N)A, and since that is within an arbitrarily small distance of A (provided N is large enough), the limit of my average is A?

I've never seen that done before, so I was not too confident in my answer
 
  • #13
Well, yeah. The idea was to show the difference between (s_1 + s_2 + ... + s_N)/N and A can be made arbitrarily small for sufficiently large N, right? Isn't that what convergence means?
 
  • #14
Yup, that's exactly what it means. Thanks for the help :)
 

