- #1
JG89
As far as I understand, a sequence converges if and only if it is Cauchy. So suppose that for some sequence a_n and for every epsilon greater than zero we have [tex] |a_n - a_{n+1}| < \epsilon [/tex] for large enough n.
We could then say a_n converges if and only if [tex] \lim_{n \rightarrow \infty} (a_n - a_{n+1}) = 0 [/tex].
But what about if a_n = ln(n)?
[tex] \ln(n) - \ln(n+1) = \ln\left(\frac{n}{n+1}\right) [/tex], so as n tends to infinity, ln(n) - ln(n+1) goes to 0. So I should be able to say that the sequence converges, but ln(n) obviously goes to infinity for increasing n.
What's the mistake in my reasoning?
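A quick numerical check (just an illustration, not part of any proof) shows both halves of the puzzle at once: the consecutive differences shrink toward zero while ln(n) itself keeps growing.

```python
import math

# For increasing n, print ln(n) alongside the consecutive
# difference ln(n+1) - ln(n): the difference shrinks toward 0,
# yet ln(n) itself grows without bound.
for n in (10, 1_000, 1_000_000):
    diff = math.log(n + 1) - math.log(n)
    print(f"n={n}: ln(n)={math.log(n):.4f}, ln(n+1)-ln(n)={diff:.2e}")
```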