How does this sequence converge? This makes no sense

In summary: The limit of the sequence $\displaystyle \begin{align*} a_n \end{align*}$ is defined as the value that the terms of the sequence approach as $\displaystyle \begin{align*} n \end{align*}$ approaches infinity. In this case, the two sequences $\displaystyle a_n$ and $\displaystyle b_n$ approach the same value as n approaches infinity, which is the value of $\displaystyle \begin{align*} e^x \end{align*}$. Therefore, the two definitions are equivalent and both valid.
  • #1
shamieh
Determine whether the sequence converges or diverges; if it converges, find its limit.

\(\displaystyle a_n = \left(1 + \frac{2}{n}\right)^n\)

my teacher is trying to tell me this converges to e^2 ... HOW?
 
  • #2
Let's look at the more general limit:

\(\displaystyle L=\lim_{x\to\infty}\left[\left(1+\frac{r}{x}\right)^x\right]\)

Now, if we take the natural log of both sides, we obtain:

\(\displaystyle \ln(L)=\lim_{x\to\infty}\left[\frac{\ln\left(1+\frac{r}{x}\right)}{\frac{1}{x}}\right]\)

On the right, we have the indeterminate form 0/0, so application of L'Hôpital's rule yields:

\(\displaystyle \ln(L)=\lim_{x\to\infty}\left[\frac{\frac{1}{1+\frac{r}{x}}\left(-\frac{r}{x^2}\right)}{-\frac{1}{x^2}}\right]=\lim_{x\to\infty}\left[\frac{rx}{x+r}\right]=r\)

Hence:

\(\displaystyle L=e^r\)
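As a quick sanity check (not part of the proof), this limit can be probed numerically; a minimal Python sketch, with the function name chosen just for illustration:

```python
import math

def general_limit(r, x):
    """Evaluate (1 + r/x)^x, which should approach e^r as x grows."""
    return (1 + r / x) ** x

# For large x the expression should be close to e^r for any fixed r.
for r in (1.0, 2.0, -0.5):
    print(r, general_limit(r, 10**7), math.exp(r))
```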
 
  • #3
shamieh said:
Determine whether the sequence converges or diverges; if it converges, find its limit.

\(\displaystyle a_n = \left(1 + \frac{2}{n}\right)^n\)

my teacher is trying to tell me this converges to e^2 ... HOW?

Generally speaking, Euler's number $\displaystyle \begin{align*} e \end{align*}$ is DEFINED either as $\displaystyle \begin{align*} \lim_{h \to 0} { \left[ \left( 1 + h \right) ^{\frac{1}{h}} \right] } \end{align*}$ or $\displaystyle \begin{align*} \lim_{k \to \infty}{ \left[ \left( 1 + \frac{1}{k} \right) ^k \right] } \end{align*}$

(which makes sense, they're really the same limit just with $\displaystyle \begin{align*} h = \frac{1}{k} \end{align*}$, as when $\displaystyle \begin{align*} h \to 0, k \to \infty \end{align*}$ )

So in your case, where you are trying to evaluate $\displaystyle \begin{align*} \lim_{n \to \infty} { \left[ \left( 1 + \frac{2}{n} \right) ^n \right] } \end{align*}$, notice that

$\displaystyle \begin{align*} \lim_{n \to \infty} { \left[ \left( 1 + \frac{2}{n} \right) ^n \right] } &= \lim_{n \to \infty} { \left\{ \left[ \left( 1 + \frac{2}{n} \right) ^{\frac{n}{2}} \right] ^2 \right\} } \\ &= \left\{ \lim_{n \to \infty} { \left[ \left( 1 + \frac{2}{n} \right) ^{\frac{n}{2}} \right] } \right\} ^2 \end{align*}$

So now if we let $\displaystyle \begin{align*} k = \frac{n}{2} \end{align*}$, notice that as $\displaystyle \begin{align*} n \to \infty, k \to \infty \end{align*}$, so we have

$\displaystyle \begin{align*} \left\{ \lim_{n \to \infty}{ \left[ \left( 1 + \frac{2}{n} \right) ^{\frac{n}{2}} \right] } \right\} ^2 &= \left\{ \lim_{k \to \infty}{ \left[ \left( 1 + \frac{1}{k} \right) ^k \right] } \right\} ^2 \\ &= e^2 \end{align*}$
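The squaring step above can also be checked numerically; a small Python sketch, choosing an even $n$ so that $k = n/2$ is an integer:

```python
import math

n = 10**6          # a large even n, so k = n/2 is an integer
k = n // 2

a_n = (1 + 2 / n) ** n              # the original sequence
squared = ((1 + 1 / k) ** k) ** 2   # after the substitution k = n/2

# Both agree with each other and are close to e^2.
print(a_n, squared, math.exp(2))
```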
 
  • #4
In the year 1730 Leonhard Euler defined the exponential and the logarithm function as...

$\displaystyle e^{x} = \lim_{n \rightarrow \infty} (1 + \frac{x}{n})^{n}\ (1)$

$\displaystyle \ln x = \lim_{n \rightarrow \infty} n\ (x^{\frac{1}{n}} - 1)\ (2)$

... and also demonstrated that the two functions are inverses of one another. In my opinion these definitions are 'the only right definitions', mainly because they remain valid when x is a complex variable...

Kind regards

$\chi$ $\sigma$
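The logarithm limit (2) is easy to probe numerically for real x; a minimal Python sketch, with the function name chosen just for illustration:

```python
import math

def log_limit(x, n):
    """Approximate ln(x) via Euler's limit n*(x^(1/n) - 1)."""
    return n * (x ** (1 / n) - 1)

# For large n the value should be close to the natural logarithm.
for x in (2.0, 10.0, 0.5):
    print(x, log_limit(x, 10**8), math.log(x))
```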
 
  • #5
chisigma said:
In the year 1730 Leonhard Euler defined the exponential and the logarithm function as...

$\displaystyle e^{x} = \lim_{n \rightarrow \infty} (1 + \frac{x}{n})^{n}\ (1)$

$\displaystyle \ln x = \lim_{n \rightarrow \infty} n\ (x^{\frac{1}{n}} - 1)\ (2)$

... and also demonstrated that the two functions are inverses of one another. In my opinion these definitions are 'the only right definitions', mainly because they remain valid when x is a complex variable... Kind regards, $\chi$ $\sigma$

Well (1) is certainly unambiguous if $x \in \Bbb C$, but (2) is not...because complex exponentiation is a bit dicey:

In general, $x^{\frac{1}{n}}$ is not a (single-valued) function on the complex plane: we cannot speak of THE $n$-th root, but have to pick a PARTICULAR $n$-th root, and there is no way to do this consistently for all $n$.

On the real numbers the two functions ARE inverses, but on the complex plane, the complex exponential HAS NO ENTIRE INVERSE.

This is important, and not realizing this can lead to errors.
 
  • #6
Deveno said:
In general, $x^{\frac{1}{n}}$ is not a (single-valued) function on the complex plane: we cannot speak of THE $n$-th root, but have to pick a PARTICULAR $n$-th root, and there is no way to do this consistently for all $n$...

According to De Moivre's theorem, a complex variable $z$ can be written as $\displaystyle z = r\ (\cos \theta + i\ \sin \theta)$, and its $n$-th roots can be written as... $\displaystyle z^{\frac{1}{n}} = r^{\frac{1}{n}}\ \left(\cos \frac{\theta + 2 k \pi}{n} + i\ \sin \frac{\theta + 2 k \pi}{n}\right),\ k=0,1,\dots,n-1\ (1)$

Setting $k=0$ in (1), you obtain...

$\displaystyle z^{\frac{1}{n}} = r^{\frac{1}{n}}\ (\cos \frac{\theta}{n} + i\ \sin \frac{\theta}{n})\ (2)$

... and this root is called the principal $n$-th root of $z$. If you use the principal $n$-th root of $z$ in Euler's definition of the logarithm, any ambiguity disappears and the exponential and logarithm functions are inverses of one another. For details see...

http://www.cliffsnotes.com/math/trigonometry/polar-coordinates-and-complex-numbers/de-moivres-theorem

Kind regards

$\chi$ $\sigma$

P.S. I am well aware that most of the 'holy books' don't agree with what I have written, so there is no need to discuss this point further...
 
  • #7
chisigma said:
In the year 1730 Leonhard Euler defined the exponential and the logarithm function as...

$\displaystyle e^{x} = \lim_{n \rightarrow \infty} (1 + \frac{x}{n})^{n}\ (1)$

$\displaystyle \ln x = \lim_{n \rightarrow \infty} n\ (x^{\frac{1}{n}} - 1)\ (2)$

... and also demonstrated that the two functions are inverses of one another. In my opinion these definitions are 'the only right definitions', mainly because they remain valid when x is a complex variable... Kind regards

$\chi$ $\sigma$

Euler also defined $\displaystyle \begin{align*} e^x = \sum_{n = 0}^{\infty} \frac{x^n}{n!} \end{align*}$, valid for all complex x. In fact, that is how he obtained a numerical value for e.

How is this any less "right" or "valid" than the definition you gave?
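The series definition can be verified numerically as well; a minimal Python sketch comparing partial sums against math.exp (the cut-off of 30 terms is arbitrary):

```python
import math

def exp_series(x, terms):
    """Partial sum of Euler's series: sum of x^k / k! for k < terms."""
    total, term = 0.0, 1.0
    for k in range(terms):
        total += term
        term *= x / (k + 1)   # next term: x^(k+1) / (k+1)!
    return total

# The series converges very quickly for moderate x.
print(exp_series(2.0, 30), math.exp(2.0))
```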
 
  • #8
Prove It said:
Euler also defined $\displaystyle \begin{align*} e^x = \sum_{n = 0}^{\infty} \frac{x^n}{n!} \end{align*}$, valid for all complex x. In fact, that is how he obtained a numerical value for e.

How is this any less "right" or "valid" than the definition you gave?

It is straightforward to demonstrate that the two sequences...

$\displaystyle a_{n} = (1 + \frac{x}{n})^{n}\ (1)$

$\displaystyle b_{n} = \sum_{k=0}^{n} \frac{x^{k}}{k!}\ (2)$

... have the same limit as n tends to infinity, so that the two definitions seem to be equivalent. From my point of view, however, Euler's 1730 definition is the only 'rigorous' one... Kind regards, $\chi$ $\sigma$
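The agreement of the two sequences is easy to observe numerically; a rough Python sketch, with the cut-offs chosen just for illustration:

```python
import math

x, n = 2.0, 10**6

a_n = (1 + x / n) ** n       # Euler's 1730 limit definition, at large n

b_n, term = 0.0, 1.0
for k in range(60):          # partial sum of x^k / k!
    b_n += term
    term *= x / (k + 1)

# Both sequences approach the same value, e^x.
print(a_n, b_n, math.exp(x))
```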
 
  • #9
chisigma said:
It is straightforward to demonstrate that the two sequences...

$\displaystyle a_{n} = (1 + \frac{x}{n})^{n}\ (1)$

$\displaystyle b_{n} = \sum_{k=0}^{n} \frac{x^{k}}{k!}\ (2)$

... have the same limit as n tends to infinity, so that the two definitions seem to be equivalent. From my point of view, however, Euler's 1730 definition is the only 'rigorous' one... Kind regards, $\chi$ $\sigma$

One can, no less rigorously, define $e$ to be the real number $a$ such that:

$\displaystyle \int_1^a \dfrac{1}{t}\, dt = 1$.
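This definition lends itself to a numerical check too: approximate the integral of $1/t$ and bisect for the upper limit $a$ that makes it equal to 1 (a rough sketch; the step and iteration counts are arbitrary):

```python
def integral_reciprocal(a, steps=100_000):
    """Trapezoidal approximation of the integral of 1/t from 1 to a."""
    h = (a - 1.0) / steps
    total = 0.5 * (1.0 + 1.0 / a)   # endpoint contributions
    for i in range(1, steps):
        total += 1.0 / (1.0 + i * h)
    return total * h

# Bisect for the a with integral equal to 1; the answer approaches e.
lo, hi = 2.0, 3.0
for _ in range(40):
    mid = (lo + hi) / 2
    if integral_reciprocal(mid) < 1.0:
        lo = mid
    else:
        hi = mid
print(lo)
```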

Also, I believe that the first definition is due to Jacob Bernoulli, who discovered it while investigating questions of continuously compounded interest, circa 1690.

It is likely that John Napier was already aware of the constant $e$, which appears implicitly in a table of natural logarithms published in 1618. It is, in my opinion, quite natural to seek a continuous function such that:

$f(xy) = f(x) + f(y)$

and for any such function the pre-image of 1 is of some interest (it determines the "scaling", what we call the "base").

Your reference to DeMoivre's Theorem has the same problem (with respect to functional inversion) that the trigonometric functions do: one must restrict $\theta$ (in this case, to a subinterval of the reals of length $2\pi$).

So, for example, $\log(z)$ cannot be defined as a continuous function on the entire unit circle that extends the real function; we have to make a "branch cut" somewhere. USUALLY this is done along the negative real axis, which corresponds to picking the interval $(-\pi,\pi]$ for $\theta$. I note dryly that your original "limit" for $\log(x)$ does not exist for $x = -1$ (which is certainly a complex number), although "naively" one would think:

$\log(-1) = i\pi$ since $e^{i\pi} = -1$.

The point being, it seems a bit paradoxical to insist a particular definition is the only "rigorous" one, and then display a result which lacks a certain rigor (one facet of rigor is properly qualifying the necessary pre-conditions, and possible exceptions).

I am aware that choosing a "principal branch" settles most of these questions, and serves for many applications. Perhaps that is what you meant.
 
  • #10
Deveno said:
... for example $\log(z)$ cannot be defined to be a continuous function on the entire unit circle that extends the real function, we have to make a 'branch cut' somewhere...

In...

http://mathhelpboards.com/calculus-10/improper-integral-involving-ln-6103-post28032.html#post28032

... it has been demonstrated that that statement is incorrect... in the example I reported there, the continuity of $\log z$ along the entire unit circle is necessary to avoid the 'impossible identity' $- i \pi^{2} = 0$... Kind regards, $\chi$ $\sigma$
 
  • #11
Deveno said:
I note dryly that your original "limit" for $\log(x)$ does not exist for $x = -1$ (which is certainly a complex number), although 'naively' one would think:

$\log(-1) = i\pi$ since $e^{i\pi} = -1$

I'm afraid the 'original limit' exists for all $x$ different from 0. In the case $x=-1$, the De Moivre notation is $-1 = \cos \pi + i\ \sin \pi$ and the limit becomes... $\displaystyle \ln (-1) = \lim_{n \rightarrow \infty} n\ \left(\cos \frac{\pi}{n} + i\ \sin \frac{\pi}{n} - 1\right) = \lim_{x \rightarrow 0} \frac{e^{i\ \pi\ x} -1}{x} = i\ \pi$

Kind regards, $\chi$ $\sigma$
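That complex limit is easy to check numerically; a minimal Python sketch using cmath (note that Python's ** for complex numbers uses the principal branch, matching the principal $n$-th root above):

```python
import cmath

def log_limit(z, n):
    """Euler's limit n*(z^(1/n) - 1), with the principal n-th root."""
    return n * (z ** (1 / n) - 1)

# For z = -1 the limit approaches i*pi, the principal logarithm.
print(log_limit(-1 + 0j, 10**7), cmath.log(-1))
```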
 
  • #12
chisigma said:
I'm afraid the 'original limit' exists for all $x$ different from 0. In the case $x=-1$, the De Moivre notation is $-1 = \cos \pi + i\ \sin \pi$ and the limit becomes... $\displaystyle \ln (-1) = \lim_{n \rightarrow \infty} n\ \left(\cos \frac{\pi}{n} + i\ \sin \frac{\pi}{n} - 1\right) = \lim_{x \rightarrow 0} \frac{e^{i\ \pi\ x} -1}{x} = i\ \pi$

Kind regards$\chi$ $\sigma$

I agree that the limit exists, but what you originally posted was:

chisigma said:
In the year 1730 Leonhard Euler defined the exponential and the logarithm function as...

$\displaystyle e^{x} = \lim_{n \rightarrow \infty} (1 + \frac{x}{n})^{n}\ (1)$

$\displaystyle \ln x = \lim_{n \rightarrow \infty} n\ (x^{\frac{1}{n}} - 1)\ (2)$

... and also demonstrated that the two functions are inverses of one another. In my opinion these definitions are 'the only right definitions', mainly because they remain valid when x is a complex variable... Kind regards, $\chi$ $\sigma$

My deepest apologies, I do not mean to be rude, nor did I intend to derail the thread (which I apparently have); I merely wanted to point out that there are some pitfalls associated with taking those statements at face value. In the link you gave you are evidently aware that we need to avoid "some" part of the unit circle when doing a contour integral involving $\log$. Which part we avoid is a choice we make; sometimes one "cut" is preferable to another.

The complex exponential is "imaginary periodic", so we have to limit inversion to a horizontal strip of height less than $2\pi$. I see this a lot in questions involving trigonometry: often people find "some solutions" but not "all solutions" (they miss the ones due to periodicity, because they only think of angles "going once around the circle").

Put another way, if we start with a complex number $z$ whose argument puts it close to, but under, the real axis, and take its $n$-th power, the complex number we find when we take $(z^n)^{1/n}$ will most assuredly not be the original complex number we started with, if we take our angles to be in $[0,2\pi)$.
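That failure is easy to exhibit numerically. A small Python sketch (note that Python's cmath uses the principal argument in $(-\pi, \pi]$ rather than $[0, 2\pi)$, but the wrap-around effect is the same whenever $n\theta$ leaves the principal range):

```python
import cmath

z = cmath.exp(2.5j)       # a unit complex number with argument 2.5 rad
n = 2

w = (z ** n) ** (1 / n)   # principal n-th root of z^n

# The argument of z^2 wraps past pi, so the principal square root
# returns -z rather than the z we started with.
print(cmath.phase(z), cmath.phase(w))
```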

I feel it is important to note this caveat: one of the consequences of extending to the complex plane is that we lose "orderedness", which means we have no real way to tell "which $n$-th root is the 'principal' one" (algebraically, any one will do). One can remedy this "somewhat" by using so-called "polar form", but this ordering doesn't behave well with respect to addition (and it has some peculiarities to be aware of in multiplication).

Writing $z = re^{i\theta}$ has its uses, don't get me wrong. But it has its drawbacks, too, one of them being that the form for the complex number 0 is "ambiguous". The identification is:

$\Bbb C - \{0\} \iff \Bbb R^+ \times \Bbb R/(2\pi\Bbb Z)$

and the "mod $2\pi$" part is important to remember.
 

FAQ: How does this sequence converge? This makes no sense

What is the definition of convergence in a sequence?

Convergence in a sequence means that as the sequence continues, the terms are getting closer and closer to a single value, known as the limit.

How do I know if a sequence is convergent or divergent?

A sequence is convergent if the terms approach a finite limit as n approaches infinity. If the terms do not approach a limit, the sequence is divergent.

What is the difference between a convergent and a divergent sequence?

A convergent sequence has a finite limit, while a divergent sequence does not have a limit. In other words, a convergent sequence approaches a specific value, while a divergent sequence does not have a specific value that it approaches.

How can I tell if a sequence is converging or diverging?

You can tell if a sequence is converging or diverging by looking at the behavior of the terms as n approaches infinity. If the terms are getting closer and closer to a single value, the sequence is converging. If the terms do not settle down to a single value (for example, they grow without bound or keep oscillating), the sequence is diverging.

What are some common examples of convergent and divergent sequences?

Some common examples of convergent sequences include geometric sequences with ratio strictly between -1 and 1, whose terms approach 0, and the partial sums of telescoping series, where the terms cancel and approach a finite limit. Divergent sequences include the partial sums of the harmonic series, which grow without bound, and alternating sequences such as (-1)^n, whose terms oscillate between values and never approach a single value.
