If Ʃa_n converges, with a_n > 0, then Ʃ(a_n)^2 always converges

In summary: if Ʃa_n converges with a_n > 0, then Ʃ(a_n)^2 also converges, by the comparison test, since (a_n)^2 < a_n whenever 0 < a_n < 1. That bound applies eventually, because the convergence of Ʃa_n forces a_n → 0, so there is an N with a_n in (0,1) for all n ≥ N.
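For reference, here is a compact sketch of that comparison-test argument, in the thread's notation. Since ##\sum a_n## converges, ##a_n \to 0##, so there is an ##N## with ##0 < a_n < 1## for all ##n \ge N##. For those ##n## and any ##M \ge N##,
$$\sum_{n=N}^{M} a_n^2 \le \sum_{n=N}^{M} a_n \le \sum_{n=N}^{\infty} a_n < \infty,$$
so the partial sums of ##\sum a_n^2## are non-decreasing and bounded, and the series converges.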
  • #1
DotKite

Homework Statement


if Ʃa_n converges, with a_n > 0, then Ʃ(a_n)^2 always converges


Homework Equations



n/a

The Attempt at a Solution



I am at a complete loss. I have tried partial sums, the Cauchy criterion, and the ratio test; the ratio test seems to work, but I am not sure about it.

Since Ʃa_n converges, by the ratio test

##\lim_{n\to\infty} \frac{a_{n+1}}{a_n} < 1##

Now we apply the ratio test to Ʃ(a_n)^2:

##\lim_{n\to\infty} \frac{a_{n+1}^2}{a_n^2} = \left(\lim_{n\to\infty} \frac{a_{n+1}}{a_n}\right)^2 < 1^2 = 1##

Thus by the ratio test Ʃ(a_n)^2 converges.

This working did not utilize the condition a_n > 0, so it seems suspect to me.
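Worth noting: the convergence of Ʃa_n does not by itself guarantee that this ratio limit is less than 1, or even that it exists, so the first step is the shaky one. A quick illustration (not from the thread): for ##a_n = 1/n^2##,
$$\sum_{n=1}^{\infty} \frac{1}{n^2} \text{ converges, yet } \lim_{n\to\infty} \frac{a_{n+1}}{a_n} = \lim_{n\to\infty} \frac{n^2}{(n+1)^2} = 1,$$
so the ratio test is inconclusive here.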
 
  • #2
Hint: if ##0 < a_n < 1##, then ##a_n^2 < a_n##.
 
  • #3
jbunniii said:
Hint: if ##0 < a_n < 1##, then ##a_n^2 < a_n##.

OK, so you use the comparison test, right? Since (a_n)^2 < a_n and Ʃa_n converges, Ʃ(a_n)^2 converges as well. But is it correct to assume 0 < a_n < 1? Can you assume that because Ʃa_n converges, which implies a_n → 0, so for some n ≥ N, a_n is in (0,1)?
 
  • #4
DotKite said:
OK, so you use the comparison test, right? Since (a_n)^2 < a_n and Ʃa_n converges, Ʃ(a_n)^2 converges as well. But is it correct to assume 0 < a_n < 1? Can you assume that because Ʃa_n converges, which implies a_n → 0, so for some n ≥ N, a_n is in (0,1)?
Yes, that's right, or putting it more precisely, there exists an ##N## such that ##a_n \in (0,1)## for all ##n \geq N##. And then use the comparison test.
 
  • #5
Thanks a bunch, J.
 

FAQ: If Ʃa_n converges, with a_n > 0, then Ʃ(a_n)^2 always converges

Does the convergence of Ʃa_n guarantee the convergence of Ʃ(a_n)^2?

Yes, provided the terms are positive: if Ʃa_n converges with a_n > 0, then Ʃ(a_n)^2 converges as well. This is sometimes described as the convergence of the series of squares, and it follows from the comparison test.

Why does Ʃ(a_n)^2 always converge if a_n > 0?

When a_n > 0, the partial sums of Ʃ(a_n)^2 form a non-decreasing sequence, so the series converges exactly when those partial sums are bounded. Since Ʃa_n converges, a_n → 0, so from some index N onward 0 < a_n < 1 and hence (a_n)^2 < a_n. The tail partial sums of Ʃ(a_n)^2 are therefore bounded by the convergent sum Ʃa_n, so Ʃ(a_n)^2 converges.

Can you provide an example of a convergent series Ʃa_n but a divergent series Ʃ(a_n)^2?

Not when the terms are positive: with a_n > 0, the convergence of Ʃa_n implies the convergence of Ʃ(a_n)^2, so no such example exists. (If the terms are allowed to change sign, an example does exist; see the next question.)

What if a_n is a negative number?

The result only holds when a_n > 0 (or at least a_n ≥ 0). If the terms are allowed to change sign, Ʃa_n can converge while Ʃ(a_n)^2 diverges. For example, with a_n = (-1)^n/√n, the alternating series Ʃa_n converges, but (a_n)^2 = 1/n, so Ʃ(a_n)^2 is the divergent harmonic series.
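A worked check of that counterexample: ##1/\sqrt{n}## is positive, decreasing, and tends to 0, which is exactly what the alternating series test requires, so
$$\sum_{n=1}^{\infty} \frac{(-1)^n}{\sqrt{n}} \text{ converges, while } \sum_{n=1}^{\infty} \left(\frac{(-1)^n}{\sqrt{n}}\right)^2 = \sum_{n=1}^{\infty} \frac{1}{n} \text{ diverges.}$$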

How can we use the convergence of a series of squares theorem in mathematical proofs?

It is a convenient shortcut: whenever a proof produces a convergent series of positive terms Ʃa_n, you may immediately conclude that Ʃ(a_n)^2 converges as well, with no further estimates. One illustrative use is sketched below.
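For instance (a sketch not taken from the thread): if ##\sum a_n## and ##\sum b_n## are convergent series of positive terms, then ##\sum a_n b_n## converges, because
$$a_n b_n \le \frac{a_n^2 + b_n^2}{2},$$
and both ##\sum a_n^2## and ##\sum b_n^2## converge by the result above, so the comparison test applies.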
