Sequence of functions: pointwise & uniform convergence

In summary, the conversation discusses a sequence of functions defined on the interval $[0,+\infty)$ and shows that the sequence converges pointwise to 0. It also calculates the maximum of the functions and concludes that the sequence converges uniformly on $[a,+\infty)$ for every $a>0$, and uniformly on $[0,+\infty)$ if and only if $\alpha<1$, where $\alpha$ is the exponent in the functions. The conversation also addresses confusion about the use of the variables $a$ and $\alpha$ and discusses the behavior of the maximum of the functions as $n$ increases.
  • #1
mathmari
Hey! 😊

Let $0<\alpha \in \mathbb{R}$ and $(f_n)_n$ be a sequence of functions defined on $[0, +\infty)$ by: \begin{equation*}f_n(x)=n^{\alpha}xe^{-nx}\end{equation*} - Show that $(f_n)$ converges pointwise on $[0,+\infty)$.

For an integer $m>\alpha$ we have that \begin{equation*}0 \leq n^{\alpha}xe^{-nx} \leq n^{m}xe^{-nx}\end{equation*}
For $x> 0$ we have that \begin{equation*}\lim_{n\rightarrow +\infty}n^{m}xe^{-nx}=\lim_{n\rightarrow +\infty}\frac{n^{m}x}{e^{nx}}\ \overset{m\text{ applications of L'Hopital}}{=}\ \lim_{n\rightarrow +\infty}\frac{m!}{x^{m-1}e^{nx}}=0\end{equation*} For $x= 0$ we have that \begin{equation*}\lim_{n\rightarrow +\infty}f_n(0)=\lim_{n\rightarrow +\infty}0=0\end{equation*} So for $x>0$, by the squeeze theorem, \begin{equation*}0 \leq \lim_{n\rightarrow +\infty}n^{\alpha}xe^{-nx} \leq \lim_{n\rightarrow +\infty}n^{m}xe^{-nx}=0\end{equation*} It follows that $\displaystyle{\lim_{n\rightarrow +\infty}n^{\alpha}xe^{-nx}=0}$ for every $x>0$ as well.

Therefore $f_n(x)$ converges pointwise to $0$.

Is everything correct? :unsure:
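As a quick numerical sanity check (not a proof; just a minimal sketch assuming Python with numpy is available, with arbitrary sample values for $\alpha$ and $x$), the values $f_n(x)$ for a fixed $x>0$ should shrink towards $0$ as $n$ grows:

```python
import numpy as np

alpha = 2.5   # arbitrary sample exponent alpha > 0
x = 0.3       # a fixed point x > 0

# f_n(x) = n^alpha * x * exp(-n*x) for increasing n
for n in [1, 10, 100, 1000]:
    print(n, n**alpha * x * np.exp(-n * x))  # the printed values should tend to 0
```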
- Calculate $\max_{x\in [0, +\infty)}f_n(x)$ and conclude that $f_n$ converges uniformly on $[a, +\infty)$ for $a>0$.

We have that
\begin{align*}&f_n(x)=n^{\alpha}xe^{-nx}\\ &\rightarrow f_n'(x)=n^{\alpha}e^{-nx}-n^{\alpha+1}xe^{-nx}=\left (n^{\alpha}-n^{\alpha+1}x\right )e^{-nx} \\ &\rightarrow f_n'(x)=0 \Rightarrow \left (n^{\alpha}-n^{\alpha+1}x\right )e^{-nx}=0 \Rightarrow n^{\alpha}-n^{\alpha+1}x=0 \Rightarrow x=\frac{1}{n} \\ &f_n\left (\frac{1}{n}\right )=\frac{n^{\alpha-1}}{e}\end{align*}
Since we have the interval $[\alpha, +\infty)$ we have to check also the value of the function at $x=\alpha$, right?
We get $f_n(\alpha)=n^{\alpha}\alpha e^{-n\alpha}$. But how can we compare it with $\frac{n^{\alpha-1}}{e}$ ? Or do we have to check if $f_n(x)$ is increasing or decreasing? :unsure:
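A small symbolic cross-check of the critical point (only a sketch, assuming Python with sympy is available; it just repeats the computation above):

```python
import sympy as sp

n, x, alpha = sp.symbols('n x alpha', positive=True)
f = n**alpha * x * sp.exp(-n * x)

# critical point of f_n on (0, +oo): solve f_n'(x) = 0
print(sp.solve(sp.diff(f, x), x))   # expected: [1/n]

# value at the critical point; expected to simplify to n**(alpha - 1)*exp(-1)
print(sp.simplify(f.subs(x, 1/n)))
```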
- Show that $f_n$ converges uniformly on $[0, +\infty)$ iff $\alpha<1$.

The maximum is $\frac{n^{\alpha-1}}{e}$ (since at $x=0$ we have $f_n(0)=0$, which is smaller) and for $\alpha<1$ this maximum tends to $0$, and this means that $f_n$ converges uniformly, right?
This shows that if $\alpha<1$ then $f_n$ converges uniformly, or not? It is left to show that if $f_n$ converges uniformly then $\alpha<1$, or not? :unsure:
 
  • #2
mathmari said:
- Show that $(f_n)$ converges pointwise on $[0,+\infty)$.

Is everything correct?

Hey mathmari!

It looks correct to me. (Nod)

mathmari said:
- Calculate $\max_{x\in [0, +\infty)}f_n(x)$ and conclude that $f_n$ converges uniformly on $[a, +\infty)$ for $a>0$.

We have that
\begin{align*}&f_n(x)=n^{\alpha}xe^{-nx}\\ &\rightarrow f_n'(x)=n^{\alpha}e^{-nx}-n^{\alpha+1}xe^{-nx}=\left (n^{\alpha}-n^{\alpha+1}x\right )e^{-nx} \\ &\rightarrow f_n'(x)=0 \Rightarrow \left (n^{\alpha}-n^{\alpha+1}x\right )e^{-nx}=0 \Rightarrow n^{\alpha}-n^{\alpha+1}x=0 \Rightarrow x=\frac{1}{n} \\ &f_n\left (\frac{1}{n}\right )=\frac{n^{\alpha-1}}{e}\end{align*}
Since we have the interval $[\alpha, +\infty)$ we have to check also the value of the function at $x=\alpha$, right?
We get $f_n(\alpha)=n^{\alpha}\alpha e^{-n\alpha}$. But how can we compare it with $\frac{n^{\alpha-1}}{e}$ ? Or do we have to check if $f_n(x)$ is increasing or decreasing? :unsure:

I'm a bit confused about the latin $a$ versus the greek $\alpha$.
Aren't they distinct? 🤔
Note that we can prove the statement for any latin $a>0$ independent of the value of the greek $\alpha$ that is in the exponent.

You have found that $f_n'$ has a zero at $x=\frac 1n$. We also have that $f_n'$ is positive for $x<\frac 1n$ and negative for $x>\frac 1n$.
Therefore for a given $n$ we have that $f_n$ has a maximum at $x=\frac 1n$.

Let $N=\left\lceil \frac 1 {a}\right\rceil$.
Then for any $n> N$ we have that $f_n$ is decreasing on $[a,\infty)$.
So for such $n$ the function $f_n$ has its maximum at $x=a$.
If $n$ becomes bigger, does this maximum increase or not?
If it doesn't, we have uniform convergence don't we? 🤔
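(To see this numerically, here is a minimal sketch, assuming Python with numpy and arbitrary sample values for $\alpha$ and the latin $a$; it approximates $\sup_{x\ge a} f_n(x)$ on a grid.)

```python
import numpy as np

alpha = 2.0                        # arbitrary sample exponent alpha > 0
a = 0.5                            # arbitrary sample left endpoint a > 0
xs = np.linspace(a, 20.0, 20000)   # grid approximating [a, +oo); the tail is negligible

for n in [1, 2, 5, 10, 50, 100]:
    sup_fn = np.max(n**alpha * xs * np.exp(-n * xs))
    print(n, sup_fn)   # for n > 1/a the supremum sits at x = a, and it tends to 0 for large n
```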

mathmari said:
- Show that $f_n$ converges uniformly on $[0, +\infty)$ iff $\alpha<1$.

The maximum is $\frac{n^{\alpha-1}}{e}$ (since at $x=0$ we have $f_n(0)=0$, which is smaller) and for $\alpha<1$ this maximum tends to $0$, and this means that $f_n$ converges uniformly, right?
This shows that if $\alpha<1$ then $f_n$ converges uniformly, or not? It is left to show that if $f_n$ converges uniformly then $\alpha<1$, or not?

Indeed. Or rather, that $f_n$ does not converge uniformly when $\alpha\ge 1$.
What happens to the maximum of $f_n$ when $n$ increases with $\alpha\ge 1$? 🤔
 
  • #3
Klaas van Aarsen said:
I'm a bit confused about the latin $a$ versus the greek $\alpha$.
Aren't they distinct? 🤔
Note that we can prove the statement for any latin $a>0$ independent of the value of the greek $\alpha$ that is in the exponent.

In the interval $[a, +\infty)$ we have the latin $a$, and in the function we have the greek letter.
Klaas van Aarsen said:
You have found that $f_n'$ has a zero at $x=\frac 1n$. We also have that $f_n'$ is positive for $x<\frac 1n$ and negative for $x>\frac 1n$.
Therefore for a given $n$ we have that $f_n$ has a maximum at $x=\frac 1n$.

Let $N=\left\lceil \frac 1 {a}\right\rceil$.
Then for any $n> N$ we have that $f_n$ is decreasing on $[a,\infty)$.
So for such $n$ the function $f_n$ has its maximum at $x=a$.
If $n$ becomes bigger, does this maximum increase or not?
If it doesn't, we have uniform convergence don't we? 🤔

So that means, since $f_n$ is decreasing on $\left[\frac{1}{n}, +\infty\right)$, that for $\frac{1}{n}<a$ we have $f_n(x)\leq f_n(a)$ for all $x\geq a$, which means that the maximum of $f_n$ on $[a, +\infty)$ is $f_n(a)$, right? :unsure:

To check the uniform convergence we have to check if $\displaystyle{\lim_{n\rightarrow +\infty}f_n(a)=0}$, right? :unsure:
Klaas van Aarsen said:
Indeed. Or rather, that $f_n$ does not converge uniformly when $\alpha\ge 1$.
What happens to the maximum of $f_n$ when $n$ increases with $\alpha\ge 1$? 🤔

In this case the maximum is $f_n\left (\frac{1}{n}\right )=\frac{n^{\alpha-1}}{e}$ and we have that $\displaystyle{\lim_{n\rightarrow +\infty}\frac{n^{\alpha-1}}{e}=\begin{cases} +\infty& \text{ if } \alpha-1>0 \\ \frac{1}{e} & \text{ if } \alpha-1=0 \\ 0 & \text{ if } \alpha-1<0 \end{cases}}$.

Can we say then that $f_n$ converges uniformly iff $\alpha-1<0$, i.e. $\alpha<1$? Or doesn't the "iff" follow in that way? :unsure:
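(A quick numerical illustration of the three cases, assuming Python with numpy: the maxima $\frac{n^{\alpha-1}}{e}$ for growing $n$ behave exactly as in the case distinction above.)

```python
import numpy as np

# the maximum of f_n on [0, +oo) is n^(alpha - 1)/e; sample the three regimes of alpha
for alpha in [0.5, 1.0, 1.5]:                        # alpha < 1, alpha = 1, alpha > 1
    maxima = [n**(alpha - 1) / np.e for n in [1, 10, 100, 1000]]
    print(alpha, maxima)                             # tends to 0 / stays at 1/e / blows up
```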
 
  • #4
mathmari said:
So that means, since $f_n$ is decreasing on $\left[\frac{1}{n}, +\infty\right)$, that for $\frac{1}{n}<a$ we have $f_n(x)\leq f_n(a)$ for all $x\geq a$, which means that the maximum of $f_n$ on $[a, +\infty)$ is $f_n(a)$, right?

To check the uniform convergence we have to check if $\displaystyle{\lim_{n\rightarrow +\infty}f_n(a)=0}$, right?

Yes.
That is, it will suffice if that is the case.

Now how can we do that? 🤔

mathmari said:
In this case the maximum is $f_n\left (\frac{1}{n}\right )=\frac{n^{\alpha-1}}{e}$ and we have that $\displaystyle{\lim_{n\rightarrow +\infty}\frac{n^{\alpha-1}}{e}=\begin{cases} +\infty& \text{ if } \alpha-1>0 \\ \frac{1}{e} & \text{ if } \alpha-1=0 \\ 0 & \text{ if } \alpha-1<0 \end{cases}}$.

Can we say then that $f_n$ converges uniformly iff $\alpha-1<0$, i.e. $\alpha<1$? Or doesn't the "iff" follow in that way?

We have proven that if $\alpha<1$, then $f_n$ converges uniformly (to the zero function).
However, the proof that if $f_n$ converges uniformly, then $\alpha<1$, is still incomplete.
That is, we have the edge case $\alpha=1$ where $f_n$ might still be uniformly convergent. 🤔

So suppose $\alpha=1$ and $f_n$ is uniformly convergent.
Then we must have that there is a function $f$ such that:
$$\lim_{n\rightarrow\infty}\,\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\}=0$$
We have found that in this case $f_n(x)$ is at most $\frac 1 e$.
Could there be such a function $f$ that is also at most $\frac 1 e$? 🤔
 
  • #5
Klaas van Aarsen said:
Yes.
That is, it will suffice if that is the case.

Now how can we do that? 🤔

Just to clarify something... We have shown that $x=\frac{1}{n}$ is a critical point. We have the following:
$$f_n''(x)=-n^{\alpha+1}e^{-nx}-n\left (n^{\alpha}-n^{\alpha+1}x\right )e^{-nx}=\left (-n^{\alpha+1}-n^{\alpha+1}+n^{\alpha+2}x\right )e^{-nx}=\left (-2n^{\alpha+1}+n^{\alpha+2}x\right )e^{-nx} \\ f_n''\left(\frac{1}{n}\right )=\left (-2n^{\alpha+1}+n^{\alpha+1}\right )e^{-1}=-n^{\alpha+1}e^{-1}<0$$ That means that at $x=\frac{1}{n}$ the function $f_n(x)$ has a maximum.

So for $x<\frac{1}{n}$ the function is increasing and for $x>\frac{1}{n}$ the function is decreasing.

So if $a<\frac{1}{n}$ then $\frac{1}{n}$ lies in $[a, +\infty)$ and the maximum of $f_n$ on $[a, +\infty)$ is $f_n\left (\frac{1}{n}\right )$, while if $a>\frac{1}{n}$ then $\frac{1}{n}$ is not in the interval $[a, +\infty)$ and $f_n$ is decreasing there, so the maximum is at the boundary $x=a$.

Is that correct so far? :unsure:
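(A symbolic double-check of the sign of the second derivative at $x=\frac1n$, assuming Python with sympy is available; only a sketch.)

```python
import sympy as sp

n, x, alpha = sp.symbols('n x alpha', positive=True)
f = n**alpha * x * sp.exp(-n * x)

# second derivative evaluated at the critical point x = 1/n
print(sp.simplify(sp.diff(f, x, 2).subs(x, 1/n)))
# expected: -n**(alpha + 1)*exp(-1), which is negative, so x = 1/n is a maximum
```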
 
  • #6
Yep. That looks correct. (Nod)
 
  • #7
Klaas van Aarsen said:
Yep. That looks correct. (Nod)

If $a<\frac{1}{n}$ the maximum is $f_n\left(\frac{1}{n}\right )=n^{\alpha-1}e^{-1}$.
We have that \begin{equation*}\lim_{n\rightarrow +\infty}f_n\left(\frac{1}{n}\right )=\lim_{n\rightarrow +\infty}n^{\alpha-1}e^{-1}=\lim_{n\rightarrow +\infty}\frac{n^{\alpha-1}}{e}\end{equation*}

If $a>\frac{1}{n}$ the maximum is $f_n(a)=n^{\alpha}a e^{-na}$.
We have that \begin{equation*}\lim_{n\rightarrow +\infty}f_n\left(a\right )=\lim_{n\rightarrow +\infty}n^{\alpha}a e^{-na}=\lim_{n\rightarrow +\infty}\frac{n^{\alpha}a}{e^{na}}\end{equation*}

To calculate the limit in each case, do we have to distinguish cases for $\alpha$?
 
  • #8
We can ignore the case $a<\frac 1n$ (latin $a$), since it suffices to check $n$ that are 'large enough'. 🧐

So we want to know what $f_n(a)$ does for increasing n that are large enough.
I don't think it is necessary to check cases for greek $\alpha$.
How about checking the behavior of $g(n)=f_n(a)$? 🤔
 
  • #9
Klaas van Aarsen said:
We can ignore the case $a<\frac 1n$ (latin $a$), since it suffices to check $n$ that are 'large enough'. 🧐

So we want to know what $f_n(a)$ does for increasing n that are large enough.
I don't think it is necessary to check cases for greek $\alpha$.
How about checking the behavior of $g(n)=f_n(a)$? 🤔

\begin{align*}g(n)&=f_n(a)=n^{\alpha}a e^{-na}=\frac{n^{\alpha}a}{ e^{na}} \\ g'(n)&=\frac{a \alpha n^{\alpha-1}e^{na}-n^{\alpha}a^2e^{na}}{e^{2na}}=\frac{a \alpha n^{\alpha-1}-n^{\alpha}a^2}{e^{na}} \\ g''(n)&=\frac{\left (a \alpha (\alpha-1) n^{\alpha-2}-\alpha n^{\alpha-1}a^2\right )e^{na}-a\left (a \alpha n^{\alpha-1}-n^{\alpha}a^2\right )e^{na}}{e^{2na}}\\ & =\frac{a \alpha (\alpha-1) n^{\alpha-2}-\alpha n^{\alpha-1}a^2-a^2 \alpha n^{\alpha-1}+n^{\alpha}a^3}{e^{na}}\\ & =\frac{a \alpha (\alpha-1) n^{\alpha-2}-2\alpha n^{\alpha-1}a^2+n^{\alpha}a^3}{e^{na}} \\ g'(n)&=0 \Rightarrow an^{\alpha-1} \left ( \alpha -na\right )=0 \Rightarrow \alpha -na=0 \Rightarrow n=\frac{\alpha}{a} \\ g'' \left (\frac{\alpha}{a}\right )&=\frac{a \alpha (\alpha-1) \left (\frac{\alpha}{a}\right )^{\alpha-2}-2\alpha \left (\frac{\alpha}{a}\right )^{\alpha-1}a^2+\left (\frac{\alpha}{a}\right )^{\alpha}a^3}{e^{\alpha}}\\ & =\frac{a \alpha (\alpha-1) \alpha^{\alpha-2}a^{-\alpha+2}-2\alpha \alpha^{\alpha-1}a^{-\alpha+1} a^2+\alpha^{\alpha}a^{-\alpha}a^3}{e^{\alpha}}=\frac{ (\alpha-1) \alpha^{\alpha-1}a^{-\alpha+3}-2 \alpha^{\alpha}a^{-\alpha+3} +\alpha^{\alpha}a^{-\alpha+3}}{e^{\alpha}}\\ & =\frac{ (\alpha-1) \alpha^{\alpha-1}a^{-\alpha+3}- \alpha^{\alpha}a^{-\alpha+3} }{e^{\alpha}}=\frac{ \alpha \alpha^{\alpha-1}a^{-\alpha+3}-\alpha^{\alpha-1}a^{-\alpha+3}- \alpha^{\alpha}a^{-\alpha+3} }{e^{\alpha}}\\ & =\frac{ \alpha^{\alpha}a^{-\alpha+3}-\alpha^{\alpha-1}a^{-\alpha+3}- \alpha^{\alpha}a^{-\alpha+3} }{e^{\alpha}}=\frac{ -\alpha^{\alpha-1}a^{-\alpha+3}}{e^{\alpha}}<0\end{align*} So $g_n$ has a maximum at $n=\frac{\alpha}{a}$.

Is that correct? How does this help us? I'm stuck right now. :unsure:
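(To cross-check the critical point of $g$ without redoing the algebra by hand: a minimal sympy sketch, assuming it is available, with $a$ and $\alpha$ kept symbolic.)

```python
import sympy as sp

n, a, alpha = sp.symbols('n a alpha', positive=True)
g = n**alpha * a * sp.exp(-n * a)          # g(n) = f_n(a) for a fixed a > 0

# g'(alpha/a) should vanish, i.e. n = alpha/a is the critical point
print(sp.simplify(sp.diff(g, n).subs(n, alpha / a)))     # expected: 0

# sign of g'' at the critical point; expected to be negative (a maximum)
print(sp.simplify(sp.diff(g, n, 2).subs(n, alpha / a)))
# expected: -alpha**(alpha - 1)*a**(3 - alpha)*exp(-alpha)  (up to equivalent forms)
```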
 
  • #10
mathmari said:
So $g_n$ has a maximum at $n=\frac{\alpha}{a}$.

Is that correct? How does this help us? I'm stuck right now.

Looks correct.
Doesn't that mean that $g(n)$ is decreasing for sufficiently large $n$? 🤔
Its limit is $0$, isn't it?
So $\lim\limits_{n\to\infty} \sup \{|f_n(x)| : x\in [a,\infty)\} = 0$, isn't it?
 
  • #11
Ah, I got it! As for the last question:
Klaas van Aarsen said:
We have proven that if $\alpha<1$, then $f_n$ converges uniformly (to the zero function).
However, the proof that if $f_n$ converges uniformly, then $\alpha<1$, is still incomplete.
That is, we have the edge case $\alpha=1$ where $f_n$ might still be uniformly convergent. 🤔

So suppose $\alpha=1$ and $f_n$ is uniformly convergent.
Then we must have that there is a function $f$ such that:
$$\lim_{n\rightarrow\infty}\,\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\}=0$$
We have found that in this case $f_n(x)$ is at most $\frac 1 e$.
Could there be such a function $f$ that is also at most $\frac 1 e$? 🤔

In this case the maximum is $f_n\left (\frac{1}{n}\right )=\frac{n^{\alpha-1}}{e}$ and we have that $\displaystyle{\lim_{n\rightarrow +\infty}\frac{n^{\alpha-1}}{e}=\begin{cases} +\infty& \text{ if } \alpha-1>0 \\ \frac{1}{e} & \text{ if } \alpha-1=0 \\ 0 & \text{ if } \alpha-1<0 \end{cases}}$.

If $\alpha<1$ then $f_n$ converges uniformly (to the zero function).

If $f_n$ converges uniformly, then $\displaystyle{\lim_{n\rightarrow\infty}\,\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\}=0}$.
How do we continue from here to distinguish the cases for $\alpha$? I'm stuck right now. :unsure:
 
  • #12
mathmari said:
If $f_n$ converges uniformly, then $\displaystyle{\lim_{n\rightarrow\infty}\,\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\}=0}$.
How do we continue from here to distinguish the cases for $\alpha$? I'm stuck right now.

We want to prove that uniform convergence of $f_n$ implies that $\alpha<1$.

Let's try a proof by contradiction. 🤔
Suppose that it doesn't. Then there must be an $\alpha \ge 1$ such that $f_n$ converges uniformly.

If $\alpha>1$, then $\sup f_n(x)\to\infty$, so whatever we pick for $f$, we won't have that $\displaystyle{\lim_{n\rightarrow\infty}\,\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\}=0}$.
Therefore $\alpha=1$.
For $\alpha=1$ we have that $f_n(x)$ has a maximum of $\frac 1e$, attained at $x=\frac 1n$.
Since uniform convergence implies pointwise convergence, $f$ must be the pointwise limit, which we have already seen is the zero function.
But then for every $n$ we have $\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\} \ge \left|f_n\left(\tfrac 1n\right)\right| = \frac 1e \ne 0$, which contradicts that $f_n$ converges uniformly.

Therefore $\alpha<1$, which completes the proof. :geek:
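(A numerical illustration of the failure at $\alpha=1$, assuming Python with numpy: on a grid approximating $[0,+\infty)$ the supremum of $|f_n(x)-0|$ stays near $\frac1e\approx 0.3679$ no matter how large $n$ gets.)

```python
import numpy as np

alpha = 1.0
xs = np.linspace(0.0, 10.0, 200001)   # dense grid approximating [0, +oo)

for n in [1, 10, 100, 1000]:
    sup_dist = np.max(n**alpha * xs * np.exp(-n * xs))   # approximates sup |f_n(x) - 0|
    print(n, sup_dist)   # stays close to 1/e, so f_n does not converge uniformly to 0
```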
 

FAQ: Sequence of functions: pointwise & uniform convergence

What is the difference between pointwise and uniform convergence?

Pointwise convergence means that for each fixed point x in the domain, the sequence of function values fn(x) converges to the value f(x) of the limit function. Uniform convergence means that the maximum difference between the nth function and the limit function over the whole domain tends to 0, so the rate of convergence does not depend on the point.

How do you determine if a sequence of functions converges pointwise or uniformly?

A sequence of functions converges pointwise if for every value of x in the domain, the sequence of function values approaches a limit value f(x). A sequence of functions converges uniformly if for every ε > 0, there exists an N such that for all n ≥ N, the distance between the nth function and the limit function is less than ε for all x in the domain simultaneously.
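In practice one often checks the equivalent condition that the supremum of |fn(x) − f(x)| over the domain tends to 0. A minimal helper illustrating this (a sketch, assuming Python with numpy; the grid only approximates the domain, and the example sequence x/n is an arbitrary illustration, not the one from the thread above):

```python
import numpy as np

def sup_distance(fn, f, xs):
    """Approximate sup over the sample points xs of |fn(x) - f(x)|."""
    return float(np.max(np.abs(fn(xs) - f(xs))))

# example: f_n(x) = x/n on [0, 1] converges uniformly to the zero function
xs = np.linspace(0.0, 1.0, 1001)
for n in [1, 10, 100]:
    print(n, sup_distance(lambda x: x / n, lambda x: 0.0 * x, xs))  # tends to 0
```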

Can a sequence of functions converge pointwise but not uniformly?

Yes, a sequence of functions can converge pointwise but not uniformly. This can occur when the rate of convergence varies at different points in the domain, resulting in a non-uniform convergence.
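For a concrete instance, the sequence from the thread above with α = 1, i.e. fn(x) = n·x·e^(−nx) on [0, +∞), converges pointwise to 0 but not uniformly, since the supremum of |fn(x) − 0| equals 1/e for every n. A minimal numerical sketch (assuming Python with numpy is available):

```python
import numpy as np

x0 = 0.5                              # a fixed point in the domain
xs = np.linspace(0.0, 10.0, 100001)   # grid approximating [0, +oo)

for n in [1, 10, 100]:
    pointwise = n * x0 * np.exp(-n * x0)            # f_n(x0) with alpha = 1
    sup_norm = np.max(n * xs * np.exp(-n * xs))     # approximate sup over the domain
    print(n, pointwise, sup_norm)
    # f_n(x0) tends to 0 (pointwise), but the supremum stays near 1/e (not uniform)
```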

What are some real-world applications of pointwise and uniform convergence?

Pointwise and uniform convergence are important concepts in the field of numerical analysis, where they are used to analyze the convergence of numerical methods for solving differential equations and other problems. They are also used in the study of Fourier series and in the analysis of algorithms in computer science.

Is uniform convergence stronger than pointwise convergence?

Yes, uniform convergence is a stronger form of convergence than pointwise convergence: it guarantees that the functions converge at a rate that does not depend on the point in the domain, while pointwise convergence only guarantees convergence at each individual point. In particular, uniform convergence implies pointwise convergence, but not conversely.
