Quick question about convergence

In summary, we have shown that for the given function ##s_n(x) = \frac{1}{n} e^{-(nx)^2}##, there exists a function ##s(x)## such that ##s_n(x) → s(x)## uniformly on ##ℝ## and ##s_n'(x) → s'(x)## for every x, but the convergence of the derivatives is not uniform on any interval containing the origin. This is demonstrated by calculating the supremum of ##|s_n'(x) - s'(x)|## over such an interval, which does not go to zero as n approaches infinity, so the convergence of the derivatives cannot be uniform.
  • #1
STEMucator
Homework Helper

Homework Statement



Let ##s_n(x) = \frac{1}{n} e^{-(nx)^2}##. Show there is a function ##s(x)## such that ##s_n(x) → s(x)## uniformly on ##ℝ## and that ##s_n'(x) → s'(x)## for every x, but that the convergence of the derivatives is not uniform in any interval which contains the origin.

Homework Equations



##s_n(x) → s(x)## as ##n→∞##

The Attempt at a Solution



For any real x, ##s_n(x) → 0 = s(x)## as ##n→∞## so we have pointwise convergence.

##\forall ε>0, \exists N(ε) \space | \space n>N \Rightarrow |s_n(x)-s(x)| < ε, \forall x \in ℝ##

##|s_n(x) - s(x)| ≤ 1/n##

So choosing ##N ≥ 1/ε## (so that ##1/n < ε## whenever ##n > N##) gives uniform convergence.
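For completeness, the bound itself comes from ##0 < e^{-(nx)^2} ≤ 1## (with equality at ##x = 0##), so the supremum can be computed exactly:

[tex]\sup_{x\in\mathbb{R}} |s_n(x) - s(x)| = \sup_{x\in\mathbb{R}} \frac{1}{n}e^{-(nx)^2} = \frac{1}{n} \rightarrow 0 \quad (n\to\infty)[/tex]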

##s_n'(x) = -2xne^{-(nx)^2}##

Now, for all real x ##s_n'(x) → 0 = s'(x)## as ##n→∞##.

##|s_n'(x) - s'(x)| = 2|x|ne^{-(nx)^2}##
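For a fixed ##x \neq 0##, the pointwise limit claimed above follows from the elementary bound ##e^t ≥ t## for ##t > 0##:

[tex]2|x|ne^{-(nx)^2} = \frac{2n|x|}{e^{(nx)^2}} \leq \frac{2n|x|}{(nx)^2} = \frac{2}{n|x|} \rightarrow 0 \quad (n\to\infty)[/tex]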

I'm stuck on showing the convergence of the derivatives is not uniform in any interval containing the origin. Don't really know how to argue this one.
 
  • #2
Let ##(-\varepsilon,\varepsilon)## be an interval around the origin.

1) Can you calculate

[tex]A_n=\sup_{x\in (-\varepsilon,\varepsilon)} |s_n^\prime(x) - s^\prime(x)|[/tex]

2) Does ##A_n\rightarrow 0##?

3) What does (2) say about uniform convergence?
 
  • #3
micromass said:
Let ##(-\varepsilon,\varepsilon)## be an interval around the origin.

1) Can you calculate

[tex]A_n=\sup_{x\in (-\varepsilon,\varepsilon)} |s_n^\prime(x) - s^\prime(x)|[/tex]

2) Does ##A_n\rightarrow 0##?

3) What does (2) say about uniform convergence?

[itex]A_n=\sup_{x \in (-ε, ε)} |s_n'(x) - s'(x)| = \sup_{x \in (-ε, ε)} 2|x|ne^{-(nx)^2}[/itex]

That notation is new to me; I've never seen sup used in such a way.

(2) I would assume not.

(3) Not uniform, but pointwise.
 
  • #4
Zondrina said:
[itex]A_n=\sup_{x \in (-ε, ε)} |s_n'(x) - s'(x)| = \sup_{x \in (-ε, ε)} 2|x|ne^{-(nx)^2}[/itex]

That notation is new to me; I've never seen sup used in such a way.

Let's take a compact interval ##[-\varepsilon,\varepsilon]## instead. Then the ##\sup## is actually a ##\max##, since a continuous function attains its maximum on a compact interval.

So, when I'm asking for ##\sup_{x \in [-\varepsilon,\varepsilon]} 2|x|ne^{-(nx)^2}##, I'm looking for the maximum value of this function.

So use calculus to find where the function ##g_n(x)= 2|x|ne^{-(nx)^2}## takes on its maximum and calculate the maximum.
 
  • #5
micromass said:
Let's take a compact interval ##[-\varepsilon,\varepsilon]## instead. Then the ##\sup## is actually a ##\max##, since a continuous function attains its maximum on a compact interval.

So, when I'm asking for ##\sup_{x \in [-\varepsilon,\varepsilon]} 2|x|ne^{-(nx)^2}##, I'm looking for the maximum value of this function.

So use calculus to find where the function ##g_n(x)= 2|x|ne^{-(nx)^2}## takes on its maximum and calculate the maximum.

Alright, that makes sense. I'll even keep the open interval (-ε,ε) containing 0, since I see it doesn't matter now.

With how the interval and ##|s_n'(x) - s'(x)|## are defined, I have 2 possible functions to examine:

##|s_n'(x) - s'(x)| = 2xne^{-(nx)^2}## for ##0 ≤ x < ε##
##|s_n'(x) - s'(x)| = -2xne^{-(nx)^2}## for ##-ε < x ≤ 0##

Finding the maximums of these functions with regular calculus seems to be far beyond my understanding. How do I find the maximum? It's not like I can take the derivative and find the critical points of this monster.
 
  • #6
Zondrina said:
It's not like I can take the derivative and find the critical points of this monster.

Why not?
 
  • #7
micromass said:
Why not?

It looks terrible, but for the case where x is positive I took the derivative and found ##x = ± \frac{1}{n\sqrt{2}}## to be the roots of the derivative; only the positive root lies in that branch.

For the case where x is negative, I took the derivative and found the same roots.
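For the record, the computation on the positive branch looks like this (the negative branch is symmetric):

[tex]\frac{d}{dx}\left[2nxe^{-(nx)^2}\right] = 2ne^{-(nx)^2} - 2nx \cdot 2n^2x\, e^{-(nx)^2} = 2ne^{-(nx)^2}\left(1 - 2n^2x^2\right)[/tex]

which vanishes exactly when ##x = \pm\frac{1}{n\sqrt{2}}##.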
 
  • #8
Ok, so the maximum is reached at ##1/(n\sqrt{2})##; for n large enough, this point lies inside ##(-\varepsilon,\varepsilon)##. The value at that point is

[tex]f(1/(n\sqrt{2})) = \frac{2}{\sqrt{2}}e^{-1/2}[/tex]

So, we see that for all sufficiently large n,

[tex]\sup_{x\in (-\varepsilon,+\varepsilon)} |s_n^\prime(x)-s^\prime(x)| = \frac{2}{\sqrt{2}} e^{-1/2}[/tex]

Does this converge to ##0## as n goes to infinity? (Obviously not.)

Can you deduce from this that the convergence is not uniform?
 
  • #9
micromass said:
Ok, so the maximum is reached at ##1/(n\sqrt{2})##; for n large enough, this point lies inside ##(-\varepsilon,\varepsilon)##. The value at that point is

[tex]f(1/(n\sqrt{2})) = \frac{2}{\sqrt{2}}e^{-1/2}[/tex]

So, we see that for all sufficiently large n,

[tex]\sup_{x\in (-\varepsilon,+\varepsilon)} |s_n^\prime(x)-s^\prime(x)| = \frac{2}{\sqrt{2}} e^{-1/2}[/tex]

Does this converge to ##0## as n goes to infinity? (Obviously not.)

Can you deduce from this that the convergence is not uniform?

Ohhhhhh, I see where you were going with this now. Since the supremum over (-ε,ε) of ##|s_n'(x) - s'(x)|## doesn't go to zero, the convergence of the derivatives can't possibly be uniform.

If the convergence were uniform, that supremum would have to go to zero as ##n → ∞##: uniform convergence means exactly that ##\sup_x |s_n'(x) - s'(x)| → 0##.
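A quick numerical sanity check (a sketch in Python, not part of the original thread; the interval and grid size are arbitrary choices) of the two suprema: ##\sup|s_n - s|## shrinks like 1/n, while ##\sup|s_n' - s'|## stays pinned near ##\sqrt{2}\,e^{-1/2} ≈ 0.858##.

[code]
import numpy as np

# Dense grid on an interval containing the origin; here s(x) = 0 and s'(x) = 0.
x = np.linspace(-0.5, 0.5, 200001)

for n in (1, 10, 100, 1000):
    s_n = np.exp(-(n * x) ** 2) / n            # s_n(x)  = e^{-(nx)^2} / n
    ds_n = -2 * n * x * np.exp(-(n * x) ** 2)  # s_n'(x) = -2nx e^{-(nx)^2}
    print(n, s_n.max(), np.abs(ds_n).max())    # sup|s_n - s|, sup|s_n' - s'|

# sup|s_n - s| = 1/n -> 0, but sup|s_n' - s'| stays near sqrt(2)*exp(-1/2) ~ 0.858.
# (For n = 1 the critical point 1/sqrt(2) falls outside the interval, so the
# maximum there is attained at the endpoint instead.)
[/code]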
 

FAQ: Quick question about convergence

What is convergence?

Convergence refers to the process by which a sequence of numbers or functions approaches a certain value or limit as the number of terms or iterations increases.

How can I determine if a sequence is convergent?

A sequence converges if and only if its terms approach a single finite limit. This can often be verified by computing the limit of the sequence as the number of terms approaches infinity, or indirectly via the Cauchy criterion (the terms eventually become arbitrarily close to one another).
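For instance, a direct computation (a standard textbook example, not from the thread above):

[tex]\lim_{n\to\infty} \frac{n}{n+1} = \lim_{n\to\infty} \frac{1}{1+1/n} = 1[/tex]

so the sequence ##\left(\tfrac{n}{n+1}\right)## converges to 1.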

What is the difference between convergence and divergence?

Convergence refers to a sequence approaching a certain value or limit, while a divergent sequence has no limit: it may grow without bound (like ##n^2##) or oscillate forever without settling down (like ##(-1)^n##).

What is the importance of convergence in mathematics and science?

Convergence is important in mathematics and science because it allows us to approximate values and make predictions based on limited data. It also helps us understand the behavior of functions and sequences, and is used in many areas of mathematics and science such as calculus, statistics, and physics.

What are some common methods for testing convergence?

Some common methods for testing the convergence of an infinite series include the ratio test, the root test, and the comparison test. These tests examine the behavior of the terms of the series to determine whether its partial sums approach a limit or grow without bound.
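As a standard illustration of the ratio test (again, not from the thread above), for the series ##\sum_{n=0}^{\infty} \frac{1}{n!}##:

[tex]\lim_{n\to\infty} \left|\frac{a_{n+1}}{a_n}\right| = \lim_{n\to\infty} \frac{n!}{(n+1)!} = \lim_{n\to\infty} \frac{1}{n+1} = 0 < 1[/tex]

so the series converges (its sum is e).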
