Proving Convergence: Limit of a Sequence with Conditions

  • Thread starter MathematicalPhysicist
  • Tags: Convergence
In summary, the thread discusses how to prove that if a_n >= 0 for every n and lim (a_n)^(1/n) = a < 1 as n approaches infinity, then lim a_n = 0. The key step is to choose e > 0 small enough that a + e < 1; then eventually 0 <= a_n < (a + e)^n, and since lim c^n = 0 as n goes to infinity whenever |c| < 1, the squeeze theorem gives lim a_n = 0.
  • #1
MathematicalPhysicist
let a_n be a sequence which satisfies lim (a_n)^(1/n) < 1 as n approaches infinity, with a_n >= 0 for every n. prove that lim a_n = 0.
what i did is as follows: for every e > 0 there exists n0 such that for every n >= n0, |(a_n)^(1/n) - a| < e.
then 0 <= a_n < (a + e)^n.
since a < 1, this gives a_n < (1 + e)^n.
but how do i proceed from here?
 
  • #2
loop quantum gravity said:
let a_n be a sequence which satisfies lim (a_n)^(1/n) < 1 as n approaches infinity, with a_n >= 0 for every n. prove that lim a_n = 0.
what i did is as follows: for every e > 0 there exists n0 such that for every n >= n0, |(a_n)^(1/n) - a| < e.
You should mention what a is (the limit of {(a_n)^(1/n)}).
then 0=<a_n<(a+e)^n
Remember, you can choose e to be whatever you like, and you know that a < 1. So why not choose e to be small enough that a + e < 1? Then use the sandwich theorem.
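As a numerical sanity check (not a proof), here is a hedged sketch of the hint. The sequence a_n = n * 0.8^n is an illustrative choice, not from the thread: its nth root n^(1/n) * 0.8 tends to 0.8 < 1, and picking e = 0.1 gives a + e = 0.9 < 1, so a_n is eventually squeezed below 0.9^n, which tends to 0.

```python
# Illustration only: a_n = n * 0.8**n has lim a_n**(1/n) = 0.8 < 1,
# so with e = 0.1 the bound a_n < (0.8 + 0.1)**n eventually holds,
# and both sides are squeezed to 0.

def a(n):
    return n * 0.8**n

# nth roots shrink toward 0.8 from above
roots = [a(n) ** (1.0 / n) for n in (10, 100, 1000)]
print(roots)

# the squeeze bound holds for all n in this range
bound_holds = all(a(n) < 0.9**n for n in range(50, 200))
print(bound_holds)  # -> True

# and a_n itself is already tiny
print(a(200))
```

The point of choosing a_n = n * 0.8^n is that a_n is *larger* than 0.8^n itself, so the bound genuinely needs the slack term e.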
 
  • #3
i thought about this, but even then i get that 0 <= a_n < 1^n. i also thought to write e = 1/a - 1 > 0, but (1/a)^n doesn't converge.
perhaps i should write 0 <= a_n < 1/n <= 1, but nothing guarantees that a_n is smaller than 1/n; a_n could equal 2/n.
 
  • #4
ok i think i got it: (a+e)^n approaches 0 when e is small enough.
 
  • #5
loop quantum gravity said:
ok i think i got it: (a+e)^n approaches 0 when e is small enough.
Yes, in fact you want e small enough so that a + e < 1, as Ortho mentioned. This is because one can prove that if c < 1 then lim(c)^n = 0 as n goes to infinity.
 
  • #6
It is too late to edit the above post, so I would just like to add a correction. There should be an absolute value sign around c, so it should say if |c| < 1, then lim(c)^n = 0.
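A quick numerical check of the corrected fact (|c| < 1 implies c^n → 0), including a negative c where the sign alternates but the magnitude still shrinks; the particular values of c are arbitrary examples:

```python
# For several c with |c| < 1, c**n becomes tiny for large n.
# The negative case alternates sign, which is why the correction
# needs |c| < 1 rather than c < 1.
for c in (0.5, -0.9, 0.99):
    term = c**1000
    print(c, term)  # magnitude is small in every case
```

Note that c < 1 alone is not enough: c = -2 satisfies c < 1, yet (-2)^n blows up in magnitude, which is exactly why the absolute value sign matters.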
 

FAQ: Proving Convergence: Limit of a Sequence with Conditions

What is the definition of convergence in the context of a sequence?

Convergence in the context of a sequence means that the terms of the sequence approach a specific value as the number of terms in the sequence increases. This specific value is known as the limit of the sequence.
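In symbols, the standard epsilon-N formulation of this definition reads:

```latex
\lim_{n\to\infty} a_n = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\; \exists N \in \mathbb{N} \;\; \forall n \ge N : \; |a_n - L| < \varepsilon
```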

What are the conditions for proving the convergence of a sequence?

A classic sufficient condition for proving the convergence of a sequence is that it be bounded and monotonic: by the monotone convergence theorem, every bounded monotone sequence converges. This condition is not necessary, however; a sequence can converge without being monotonic. In general, convergence is established directly from the epsilon-N definition or with tools such as the limit laws and the squeeze theorem.
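A small check of the "sufficient but not necessary" point: the sequence a_n = (-1)^n / n (a standard counterexample, not from the thread) converges to 0 yet is not monotonic, since consecutive terms alternate in sign.

```python
# (-1)**n / n converges to 0 but is neither increasing nor decreasing.
terms = [(-1) ** n / n for n in range(1, 1001)]

increasing = all(x <= y for x, y in zip(terms, terms[1:]))
decreasing = all(x >= y for x, y in zip(terms, terms[1:]))

print(increasing or decreasing)  # -> False: not monotonic
print(abs(terms[-1]))            # -> 0.001: heading toward 0
```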

How do you determine the limit of a convergent sequence?

To determine the limit of a convergent sequence, you find the value L such that lim(n→∞) an = L, meaning the terms an get arbitrarily close to L as n grows. In practice, this is done using the limit laws, known standard limits, or theorems such as the squeeze theorem, rather than a single formula.
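Numerics can suggest (though never prove) a limit. As an illustrative example not taken from the thread, the terms of a_n = (2n + 1)/(n + 1) appear to creep up toward 2:

```python
# Numerically *suggest* (not prove) the limit of a_n = (2n + 1)/(n + 1).
def term(n):
    return (2 * n + 1) / (n + 1)

for n in (10, 1000, 100000):
    print(n, term(n))  # values approach 2 from below

# distance to the conjectured limit shrinks like 1/n
print(abs(term(10**6) - 2))
```

A proof would still need the algebra term(n) = 2 - 1/(n + 1) together with the fact that 1/(n + 1) → 0.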

What is the significance of the limit of a convergent sequence?

The limit of a convergent sequence is significant because it represents the ultimate behavior of the sequence. It tells us what value the terms of the sequence will approach as the number of terms increases, and it helps us understand the overall pattern and behavior of the sequence.

Can a sequence converge without having a limit?

No, a sequence cannot converge without having a limit. The definition of convergence requires that the terms of the sequence approach a specific value as the number of terms increases, and this specific value is known as the limit. If a sequence does not have a limit, it cannot be considered convergent.
