I think there is something wrong in this exercise, which I came across by chance in a calculus and analysis book while looking for a rigorous definition of angle. It says:

Let f : (a,b) → ℝ be a differentiable function. Suppose that f' is bounded and that f has a root r in (a,b). For x ≠ r, let J_x denote the open interval between x and r, and suppose that if f(x) > 0 then f is convex on J_x, while if f(x) < 0 then f is concave on J_x. Prove that for any x0 in (a,b), the Newton sequence with initial point x0 converges to a root.

The problem here is that we can take an initial point x0 where f'(x0) = 0, so the Newton step x0 - f(x0)/f'(x0) is not even defined. For simplicity, consider f(x) := x^3 for all x in (-4,4).
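To make the concern concrete, here is a minimal sketch (my own, not from the book) of the standard Newton iteration x_{n+1} = x_n - f(x_n)/f'(x_n) applied to f(x) = x^3: from a nonzero starting point the iteration does contract toward the root 0, but wherever f'(x_n) = 0 the step is undefined.

```python
def newton(f, fprime, x0, steps=20):
    """Run the Newton iteration from x0, raising if f'(x_n) vanishes."""
    x = x0
    for _ in range(steps):
        d = fprime(x)
        if d == 0:
            raise ZeroDivisionError(f"f'({x}) = 0: Newton step undefined")
        x = x - f(x) / d
    return x

f = lambda x: x**3
fprime = lambda x: 3 * x**2

print(newton(f, fprime, 3.0))  # each step multiplies x by 2/3, so it converges toward the root 0
print(newton(f, fprime, 0.0))  # raises: f'(0) = 0, the step is 0/0
```

Note that for f(x) = x^3 the only point with f'(x0) = 0 is x0 = 0, which is itself the root, so the question is whether the statement should exclude such starting points or treat them as trivially convergent.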