- #1
japplepie
Given a sequence, how can I check whether it converges?
Assume the sequence is monotonic but the formula that created the sequence is unknown.
My first thought was: if
seq(n+2) - seq(n+1) < seq(n+1) - seq(n) always holds as n -> infinity, then the sequence is convergent.
Or in other words, if the difference between two consecutive terms gets smaller and smaller, then it converges.
But that's not always true: a counterexample is seq(n) = log(n), where the difference between consecutive terms gets smaller and smaller, yet the sequence still diverges.
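To see this concretely, here is a quick Python sketch (the cutoff `N` is an arbitrary choice for the demo): the gaps of log(n) are strictly shrinking, but the sequence itself keeps climbing.

```python
import math

# seq(n) = log(n): the gap log(n+1) - log(n) is roughly 1/n, so it
# shrinks toward 0 -- yet the running value grows without bound.
N = 10**4  # arbitrary cutoff for the demonstration
diffs = [math.log(n + 1) - math.log(n) for n in range(1, N)]

# The differences are strictly decreasing...
assert all(d2 < d1 for d1, d2 in zip(diffs, diffs[1:]))

# ...and tiny by the end, but seq(N) is still climbing.
print(diffs[-1])     # about 1e-4
print(math.log(N))   # about 9.2, and unbounded as N grows
```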
So my question is: is there a limit on how small the difference between consecutive terms has to be for the sequence to converge?
For example, each difference must be at most 1/2 of the previous difference, etc.