Neighbourhood of Convergence of Sequence

In summary: using Taylor's theorem one obtains the error recursion \(\epsilon_{n+1}=-\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_{n}^2\); since \(f'\) is continuous and \(f'(x_0)\neq 0\), the iterates stay in a suitable interval around \(x_0\) whenever \(x_1\) is chosen close enough to \(x_0\), the errors shrink to zero, and the sequence converges.
  • #1
Sudharaka
Hi everyone, :)

Can somebody give me a hint to solve this problem. :)

Problem:

Let \(f\) be a function defined on \([a,\,b]\) with a continuous second order derivative. Let \(x_0\in (a,\,b)\) satisfy \(f(x_0)=0\) but \(f'(x_0)\neq 0\). Prove that there is a neighbourhood of \(x_0\), say \(U(x_0)\), such that for all \(x_1\in U(x_0)\), the following sequence,

\[x_{n+1}=x_{n}-\frac{f(x_n)}{f'(x_n)}\]

where \(n=1,\,2,\,\cdots\) is convergent.
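
A minimal numerical sketch of the iteration (assuming the hypothetical example \(f(x)=x^2-2\) on \([1,\,2]\), whose root \(x_0=\sqrt{2}\) satisfies \(f(x_0)=0\) and \(f'(x_0)=2\sqrt{2}\neq 0\); the example is not part of the problem):

```python
import math

# Hypothetical example: f(x) = x^2 - 2 on [1, 2]; its root sqrt(2) satisfies
# f(x_0) = 0 and f'(x_0) = 2*sqrt(2) != 0, as the problem requires.
f = lambda x: x**2 - 2
fprime = lambda x: 2 * x

x = 1.4                                  # an x_1 "close enough" to sqrt(2)
for n in range(1, 6):
    x = x - f(x) / fprime(x)             # x_{n+1} = x_n - f(x_n)/f'(x_n)
    print(n, x, abs(x - math.sqrt(2)))   # the error roughly squares each step
```

Starting from \(x_1=1.4\), the printed error shrinks extremely fast, which is the behaviour the problem asks to justify.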
 
  • #2
Sudharaka said:
Hi everyone, :)

Can somebody give me a hint to solve this problem. :)

Problem:

Let \(f\) be a function defined on \([a,\,b]\) with a continuous second order derivative. Let \(x_0\in (a,\,b)\) satisfy \(f(x_0)=0\) but \(f'(x_0)\neq 0\). Prove that there is a neighbourhood of \(x_0\), say \(U(x_0)\), such that for all \(x_1\in U(x_0)\), the following sequence,

\[x_{n+1}=x_{n}-\frac{f(x_n)}{f'(x_n)}\]

where \(n=1,\,2,\,\cdots\) is convergent.

Use Taylor's theorem:
$$f(x_n) = f(x_0) + f'(x_0)(x_n - x_0) + \frac 1 {2!} f''(\xi_n)(x_n - x_0)^2$$
where $\xi_n$ is between $x_0$ and $x_n$.

Let $\varepsilon_n = x_0 - x_n$.
This is the error with respect to the root of f in iteration n.
Consider the error $\varepsilon_{n+1}$ in terms of $\varepsilon_n$.
If it approaches zero, you're done.
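
Spelling the hint out slightly: since \(f(x_0)=0\) and \(x_n-x_0=-\varepsilon_n\), the expansion reads
$$f(x_n) = -f'(x_0)\,\varepsilon_n + \frac 1 2 f''(\xi_n)\,\varepsilon_n^2,$$
so \(f(x_n)\) is of order \(\varepsilon_n\) when \(x_n\) is close to the root.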
 
  • #3
I like Serena said:
Use Taylor's theorem:
$$f(x_n) = f(x_0) + f'(x_0)(x_n - x_0) + \frac 1 {2!} f''(\xi_n)(x_n - x_0)^2$$
where $\xi_n$ is between $x_0$ and $x_n$.

Let $\varepsilon_n = x_0 - x_n$.
This is the error with respect to the root of f in iteration n.
Consider the error $\varepsilon_{n+1}$ in terms of $\varepsilon_n$.
If it approaches zero, you're done.

Thanks very much for the reply but I am not sure whether I get you. If the error approaches zero then the sequence converges. But how does that guarantee the existence of a neighbourhood \(U(x_0)\) where each element \(x_1\) makes the sequence convergent?
 
  • #4
Sudharaka said:
Thanks very much for the reply but I am not sure whether I get you. If the error approaches zero then the sequence converges. But how does that guarantee the existence of a neighbourhood \(U(x_0)\) where each element \(x_1\) makes the sequence convergent?

If you work it out, you'll find there are some boundary conditions.
To satisfy those boundary conditions you need to take $x_1$ "close enough" to $x_0$, or equivalently $\varepsilon_1$ "close enough" to 0.
This is represented by a neighbourhood $U(x_0)$ that is "small enough".
 
  • #5
Sudharaka said:
Hi everyone, :)

Can somebody give me a hint to solve this problem. :)

Problem:

Let \(f\) be a function defined on \([a,\,b]\) with a continuous second order derivative. Let \(x_0\in (a,\,b)\) satisfy \(f(x_0)=0\) but \(f'(x_0)\neq 0\). Prove that there is a neighbourhood of \(x_0\), say \(U(x_0)\), such that for all \(x_1\in U(x_0)\), the following sequence,

\[x_{n+1}=x_{n}-\frac{f(x_n)}{f'(x_n)}\]

where \(n=1,\,2,\,\cdots\) is convergent.

A counterexample should be $\displaystyle f(x)= x^{\frac{1}{3}}$, because the difference equation becomes...

$\displaystyle \Delta_{n} = x_{n+1} - x_{n} = - 3\ x_{n}$

... which diverges for any $x_{1} \ne 0$...

Kind regards

$\chi$ $\sigma$
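
A quick numerical check of this claim (a sketch, not from the original post; for the real cube root the Newton step works out to \(x_{n+1}=x_n-3x_n=-2x_n\)):

```python
import math

# Newton iteration for the real (sign-aware) cube root f(x) = x^(1/3).
# Note that f'(0) is undefined, which is why this example does not
# contradict the statement being proved (see the next reply).
def f(x):
    return math.copysign(abs(x) ** (1 / 3), x)

def fprime(x):
    return (1 / 3) * abs(x) ** (-2 / 3)

x = 0.1                           # any nonzero starting point x_1
for n in range(1, 7):
    x = x - f(x) / fprime(x)      # works out to x_{n+1} = -2 * x_n
    print(n, x)                   # roughly -0.2, 0.4, -0.8, 1.6, -3.2, 6.4
```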
 
  • #6
chisigma said:
A counterexample should be $\displaystyle f(x)= x^{\frac{1}{3}}$, because the difference equation becomes...

It doesn't satisfy the criteria.
f'(0) is not defined, so it does not have a continuous second order derivative around x=0.
 
  • #7
I like Serena said:
If you work it out, you'll find there are some boundary conditions.
To satisfy those boundary conditions you need to take $x_1$ "close enough" to $x_0$, or equivalently $\varepsilon_1$ "close enough" to 0.
This is represented by a neighbourhood $U(x_0)$ that is "small enough".

Thanks, now I understand it more clearly. So we can get \(\epsilon_{n+1}\) in terms of \(\epsilon_n\) as mentioned >>here<<.

\[\epsilon_{n+1}=-\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_{n}\]

I hope I am correct up to this point. Am I?
 
  • #8
Sudharaka said:
Thanks, now I understand it more clearly. So we can get \(\epsilon_{n+1}\) in terms of \(\epsilon_n\) as mentioned >>here<<.

\[\epsilon_{n+1}=-\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_{n}\]

I hope I am correct up to this point. Am I?

Almost.
You dropped a square.

Note that you've already used that both f' and f'' exist, and that $f'(x_n)\ne 0$.
Are you aware of the conditions involved?

And I see you picked $\gamma_n$. Didn't you like $\xi_n$?

 
  • #9
I like Serena said:
Almost.
You dropped a square.

Note that you've already used that both f' and f'' exist, and that $f'(x_n)\ne 0$.
Are you aware of the conditions involved?

Sorry, that's a typo. :) It should be,

\[\epsilon_{n+1}=-\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_{n}^2\]

It's given that \(f\) is twice differentiable. Hence \(f'\) and \(f''\) exist. But I thought that \(f'(x_n)\neq 0\) is implied through the equation generating the terms of the sequence. What puzzled me from the beginning when solving this problem is how to incorporate the fact that \(f'(x_0)\neq 0\).

I like Serena said:
And I see you picked $\gamma_n$. Didn't you like $\xi_n$?

;) No, I hate that symbol. First, I can never write it properly, and in this context I don't like to use it because it looks like \(\epsilon\) and there's a chance I will confuse the two.
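
For the record, here is one way that recursion falls out; it expands about \(x_n\) rather than about \(x_0\) as in the earlier hint. By Taylor's theorem there is a \(\gamma_n\) between \(x_n\) and \(x_0\) with
\[0=f(x_0)=f(x_n)+f'(x_n)(x_0-x_n)+\frac{1}{2}f''(\gamma_n)(x_0-x_n)^2.\]
Dividing by \(f'(x_n)\) and using \(\epsilon_{n+1}=x_0-x_{n+1}=\epsilon_n+\frac{f(x_n)}{f'(x_n)}\) gives
\[\epsilon_{n+1}=-\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_{n}^2.\]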
 
  • #10
Sudharaka said:
Sorry, that's a typo. :) It should be,

\[\epsilon_{n+1}=-\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_{n}^2\]

It's given that \(f\) is twice differentiable. Hence \(f'\) and \(f''\) exist. But I thought that \(f'(x_n)\neq 0\) is implied through the equation generating the terms of the sequence. What puzzled me from the beginning when solving this problem is how to incorporate the fact that \(f'(x_0)\neq 0\).

It's the other way around.
You can only use $f'(x_n)\neq 0$ because you have $f'(x_0)\neq 0$.

Due to the fact that f' is continuous and that $f'(x_0)\neq 0$, you can infer that there will be some open interval around $x_0$ that will have $f'(x) \ne 0$ for x in that interval.

Only within that interval can you use the Taylor expansion as given.
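
Concretely: since \(f'\) is continuous and \(f'(x_0)\neq 0\), there is a \(\delta>0\) such that \([x_0-\delta,\,x_0+\delta]\subset(a,\,b)\) and
\[|f'(x)|\geq \frac{1}{2}|f'(x_0)|>0 \quad \text{for all } |x-x_0|\leq\delta,\]
so on that interval the Newton step and the Taylor expansion above are both well defined.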
 
  • #11
I like Serena said:
It's the other way around.
You can only use $f'(x_n)\neq 0$ because you have $f'(x_0)\neq 0$.

Due to the fact that f' is continuous and that $f'(x_0)\neq 0$, you can infer that there will be some open interval around $x_0$ that will have $f'(x) \ne 0$ for x in that interval.

Only within that interval can you use the Taylor expansion as given.

This occurred to me previously, but I was confused: if we take that interval, how do we know for sure that all the values \(x_n\) lie in that interval? That is, we choose \(x_1\) from that interval, then calculate the value \(x_2\). Now how can we guarantee that \(x_2\) also lies in that same interval?
 
  • #12
Sudharaka said:
This occurred to me previously, but I was confused: if we take that interval, how do we know for sure that all the values \(x_n\) lie in that interval? That is, we choose \(x_1\) from that interval, then calculate the value \(x_2\). Now how can we guarantee that \(x_2\) also lies in that same interval?

That happens if we can make sure that $|\varepsilon_{n+1}| < |\varepsilon_n|$.
 
  • #13
I like Serena said:
That happens if we can make sure that $|\varepsilon_{n+1}| < |\varepsilon_n|$.

Yes, I think I am getting a hold of this. So, taking absolute values, we have,

\[|\epsilon_{n+1}|=\left|\frac{f''(\gamma_n)}{2f'(x_n)}\right||\epsilon_{n}|^2\]

To get $|\varepsilon_{n+1}| < |\varepsilon_n|$, we should have,

\[|\epsilon_{n}|<\left|\frac{2f'(x_n)}{f''(\gamma_n)}\right|\]

And this is the interval that we are looking for. Am I correct? :)
 
  • #14
Sudharaka said:
Yes, I think I am getting a hold of this. So, taking absolute values, we have,

\[|\epsilon_{n+1}|=\left|\frac{f''(\gamma_n)}{2f'(x_n)}\right||\epsilon_{n}|^2\]

To get $|\varepsilon_{n+1}| < |\varepsilon_n|$, we should have,

\[|\epsilon_{n}|<\left|\frac{2f'(x_n)}{f''(\gamma_n)}\right|\]

And this is the interval that we are looking for. Am I correct? :)

Yes.
With the additional constraints that we're inside the interval $(a,b)$ and that $f'(x) \ne 0$.
Also note that $f''(\gamma_n)$ could be zero, so we have to allow for that.
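
To package the whole argument (a sketch of how the pieces fit together, not verbatim from the thread): on \([x_0-\delta,\,x_0+\delta]\) as above, put \(m=\min|f'|>0\) and \(M=\max|f''|\), which exist because \(f'\) and \(f''\) are continuous. Then
\[|\epsilon_{n+1}|\leq\frac{M}{2m}\,|\epsilon_{n}|^2.\]
If \(M=0\), the very next iterate is \(x_0\); otherwise take \(U(x_0)=(x_0-r,\,x_0+r)\) with \(r=\min\!\left(\delta,\,\frac{m}{M}\right)\). For any \(x_1\in U(x_0)\), induction gives \(|\epsilon_{n+1}|\leq\frac{1}{2}|\epsilon_{n}|\), so every \(x_n\) stays in the interval and \(\epsilon_n\to 0\), i.e. the sequence converges to \(x_0\).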
 
  • #15
I like Serena said:
Yes.
With the additional constraints that we're inside the interval $(a,b)$ and that $f'(x) \ne 0$.
Also note that $f''(\gamma_n)$ could be zero, so we have to allow for that.

Well, so one last question. We have to assume that \(f''(\gamma_n)\neq 0\). This is just something we have to assume and cannot be deduced from the given details. Am I correct? :)
 
  • #16
Sudharaka said:
Well, so one last question. We have to assume that \(f''(\gamma_n)\neq 0\). This is just something we have to assume and cannot be deduced from the given details. Am I correct? :)

Not really.
If f''(x) = 0, we have instantaneous convergence to the root.
 
  • #17
I like Serena said:
Not really.
If f''(x) = 0, we have instantaneous convergence to the root.

And that, I believe, tells us \(f\) is a straight line. Isn't it? But what's wrong with that?
 
  • #18
Sudharaka said:
And that, I believe, tells us \(f\) is a straight line. Isn't it? But what's wrong with that?

Yep and nothing's wrong with that.
And btw, it is not given that f''(x) = 0 everywhere.
What it does mean is that $|\varepsilon_{n+1}| < |\varepsilon_{n}|$ if $\varepsilon_{n} \ne 0$, which is what we wanted.
 
  • #19
I like Serena said:
Yep and nothing's wrong with that.
And btw, it is not given that f''(x) = 0 everywhere.
What it does mean is that $|\varepsilon_{n+1}| < |\varepsilon_{n}|$ if $\varepsilon_{n} \ne 0$, which is what we wanted.

Yes, I am sorry; I was too tired to see that \(f''(\gamma_n)=0\) does not mean that \(f\) is a straight line. However, if \(f''(\gamma_n)=0\) then \(x_{n+1}=x_0\), so obviously,

\[0=|\varepsilon_{n+1}| < |\varepsilon_{n}|\]

I guess this is what you meant. Am I correct? :)
 
  • #20
Sudharaka said:
Yes, I am sorry; I was too tired to see that \(f''(\gamma_n)=0\) does not mean that \(f\) is a straight line. However, if \(f''(\gamma_n)=0\) then \(x_{n+1}=x_0\), so obviously,

\[0=|\varepsilon_{n+1}| < |\varepsilon_{n}|\]

I guess this is what you meant. Am I correct?

Yes.
 
  • #21
I like Serena said:
Yes.

Thank you sooooooooooooooooooo much for all your help. I think I understood every bit and piece of the problem, though it took a considerable amount of time. :)
 
