[Numerical analysis] Stability and condition of Newton's method

In summary, stability and condition are related but distinct concepts in numerical analysis. Condition is a measure of how much the exact solution changes when the input changes; it is linked to the problem being solved, not to the algorithm. Stability, on the other hand, refers to the behavior of an algorithm when small changes are made to the input. It is related to the concept of "error blowing up" and can be assessed by comparing an algorithm's results with theoretical values or reference computations. The fractal in the Newton's method example does not necessarily mean the method is ill-conditioned, but it does demonstrate the sensitivity of the algorithm to small changes in the starting value.
  • #1
srn
I am confused by the concept of stability and condition. As I understand it, condition is defined by how much the output changes when the input changes. But why is it linked to the problem and not the algorithm? What if I have two algorithms that calculate the same thing but in a completely different manner, and the same change in input causes algorithm (a) to return more erroneous results than algorithm (b), can I not conclude (a) is worse conditioned than (b)? Or is that what stability is about? From Wikipedia: the difference between function and approximations, or "whether or not errors blow up". Is condition simply [itex]f(x) - f(x+\epsilon)[/itex] and stability [itex]f_{theory}(x) - f(x)[/itex]? If so I more or less understand it, but I can't apply it to Newton's method, for example.

Consider the fractal in http://mathworld.wolfram.com/NewtonsMethod.html. It is obvious that a small change in starting value can cause it to converge to a completely different root. Is the method then ill-conditioned? But condition is linked to the problem, so it would be the same for a different root-finding method; yet e.g. Müller's method has less trouble with changing start values. So it's dependent on the algorithm and hence not condition. But then what does the fractal tell you?
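To make that concrete, here is a quick Python sketch I put together (the helper names are my own, nothing official); it classifies which cube root of unity Newton's method reaches from a row of nearby starting values:

[code]
import cmath

# The three cube roots of unity, i.e. the roots of f(z) = z**3 - 1.
ROOTS = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]

def newton_root(z, steps=100):
    """Iterate Newton's method for f(z) = z**3 - 1 and report the nearest root."""
    for _ in range(steps):
        z -= (z**3 - 1) / (3 * z**2)
    return min(range(3), key=lambda k: abs(z - ROOTS[k]))

# Starting values near the fractal basin boundary on the negative real axis
# (around z = -(1/2)**(1/3), which Newton's iteration sends close to 0).
for eps in (-2e-3, -1e-3, 0.0, 1e-3, 2e-3):
    z0 = complex(-0.7937 + eps, 1e-4)
    print(f"z0 = {z0}: root #{newton_root(z0)}")
[/code]

Near the boundary, which root number gets printed can change from one eps to the next, which is exactly the fractal picture on the MathWorld page.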

Is it stability? I don't think so, because there's 'nothing wrong': I mean, it's still going to a root, it's not diverging or anything. I understand subtracting two nearly equal numbers is unstable because the error "blows up", but in this case? Speaking of that, 1) how would you quantify stability when you have no theoretical values? Would you use a function built into a computer program, assume it is implemented as accurately as possible, and then compare with that? 2) How do you even get differences in accuracy between methods? In this case you would iteratively calculate a root until the difference between consecutive terms is lower than a tolerance. If you use the same tolerance, how can two values differ? It would mean that going from the step where error > tol to the step where it is <= tol, one method "added" more than the other.
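Regarding the subtraction example, here is the kind of thing I mean, as a small Python sketch (my own, just to fix ideas):

[code]
import math

x = 1e-8

# Two mathematically equivalent ways to compute 1 - cos(x):
naive  = 1.0 - math.cos(x)             # subtracts two nearly equal numbers
stable = 2.0 * math.sin(x / 2.0) ** 2  # algebraically identical, no cancellation

print(naive)   # 0.0 -- every significant digit is lost
print(stable)  # ~5e-17, close to the true value x**2 / 2
[/code]

Both lines compute the same mathematical quantity, so the difference is purely a property of the algorithm, not of the problem.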
 
  • #3
I like Newton's method. It prevents the absurdity of having the square root of a number be equal in magnitude to its square, without breaking a sweat. That way, I'm comfortable with these neat equations:
sqrt(1) = 0.9r
and conversely
sqrt(0.9r) = 1
(I noticed that square roots of fractions have a higher numerical value than their squares, while square roots of whole numbers are lesser.)
NB: I'm not insinuating that 0.9r is any less than 1. I've seen the FAQs.
 
  • #4
I presume that by "fraction" you mean "a number between 0 and 1". Mathematically, a fraction is any number of the form a/b where a and b are integers. That is, 30/5 is a fraction and 0.5 is not (unless you use the bastard phrase "decimal fraction").

Yes, if 0 < x < 1 then, multiplying each part by the positive number x, 0 < x^2 < x, while if 1 < x, then x < x^2.
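For a concrete check: with [itex]x = 0.5[/itex] we get [itex]x^2 = 0.25 < 0.5[/itex], while with [itex]x = 3[/itex] we get [itex]x^2 = 9 > 3[/itex]; taking square roots reverses those inequalities, which is the pattern noticed above.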
 
  • #5


I can provide some clarification on the concepts of stability and condition in numerical analysis, specifically in the context of Newton's method.

Stability refers to the ability of an algorithm to produce accurate and consistent results even when small changes are made to the input. In other words, a stable algorithm should not produce drastically different results for slightly different inputs. In the case of Newton's method, the fractal you mention illustrates exactly this kind of sensitivity: near the boundaries between basins of attraction, small changes in the starting value lead to completely different roots being found. That sensitivity is a property of the iteration itself, not of the root-finding problem.

On the other hand, condition refers to the sensitivity of the exact solution to changes in the problem's input data. In numerical analysis, this is measured by the condition number, which indicates how much the output changes when the input is perturbed. For root finding, the relevant input is the function itself (e.g., its coefficients), and a root r is ill-conditioned when |f'(r)| is small, as happens with multiple or tightly clustered roots. A problem with a high condition number is hard for any algorithm, because small changes in the input lead to large changes in the answer.
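One common way to make this quantitative (my own sketch; the formula is the standard relative condition number for evaluating a function, not something specific to this thread) is [itex]\kappa(x) = |x f'(x) / f(x)|[/itex]:

[code]
import math

def cond(f, fprime, x):
    """Relative condition number of evaluating f at x: |x * f'(x) / f(x)|."""
    return abs(x * fprime(x) / f(x))

# tan is ill-conditioned near pi/2: a tiny relative change in x
# produces a huge relative change in tan(x).
x = math.pi / 2 - 1e-6
print(cond(math.tan, lambda t: 1.0 / math.cos(t) ** 2, x))  # about 1.6e6
[/code]

A value that large means roughly six decimal digits are lost to the conditioning of the problem alone, before any algorithm enters the picture.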

To address your confusion about the link between condition and the problem versus the algorithm: the condition number is specific to a particular problem and input, while the algorithm is merely the method used to solve that problem. Different algorithms may produce different results for the same problem, but the condition number remains the same for all of them. What distinguishes the algorithms is their stability: a stable algorithm solves a well-conditioned problem accurately, while an unstable algorithm can give poor answers even when the problem itself is well-conditioned.
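A classic illustration of this separation (again my own sketch, not from the thread): the small root of [itex]x^2 - 10^8 x + 1 = 0[/itex] is a perfectly well-conditioned number near [itex]10^{-8}[/itex], but one of two algebraically equivalent formulas computes it badly:

[code]
import math

# Solve x**2 - 1e8*x + 1 = 0; the exact small root is ~1e-8.
b, c = 1e8, 1.0
disc = math.sqrt(b * b - 4.0 * c)

naive = (b - disc) / 2.0          # unstable: cancels two nearly equal numbers
vieta = c / ((b + disc) / 2.0)    # stable: uses (small root) * (large root) = c

print(naive)  # wrong already in the leading digits
print(vieta)  # ~1e-08, essentially full accuracy
[/code]

The problem (and hence its condition number) is identical in both cases; only the algorithm differs, and that difference is what stability describes.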

To answer your questions: 1) Quantifying stability without exact theoretical values can be challenging, but it can be done by comparing the algorithm's results against benchmark problems with known solutions, against a higher-precision computation, or against a trusted reference implementation. 2) Differences in accuracy between methods arise because the stopping test does not measure the true error: each method checks the change between its own successive iterates, and since different methods take different steps, they stop at slightly different points inside the tolerance band. The tolerance bounds the size of the last step, not the distance to the exact root.
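To see point 2 in action, here is a small Python sketch (mine; both stopping rules are deliberately simplified) comparing Newton's method and bisection on [itex]f(x) = x^2 - 2[/itex] with the same tolerance:

[code]
def f(x):
    return x * x - 2.0

def newton(x, tol=1e-8):
    while True:
        step = f(x) / (2.0 * x)   # f'(x) = 2x
        x -= step
        if abs(step) < tol:       # stop when the last step is small
            return x

def bisect(a, b, tol=1e-8):
    while b - a >= tol:           # stop when the bracket is small
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0.0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)

print(newton(2.0))       # essentially sqrt(2) to full double precision
print(bisect(1.0, 2.0))  # agrees with sqrt(2) to only ~8-9 digits
[/code]

Both runs satisfy the same tolerance, but the tolerance bounds different quantities (last Newton step vs. bracket width), and Newton's final step happens to land far below the tolerance, so the two returned values differ.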

In summary, stability and condition are both important considerations in numerical analysis, and they are related but distinct concepts. Stability describes the reliability and consistency of an algorithm, while condition measures the sensitivity of the problem's exact solution to changes in its input. In the case of Newton's method, the conditioning comes from the problem (the function and its roots), while the sensitivity to the starting value is a property of the iteration itself.
 

FAQ: [Numerical analysis] Stability and condition of Newton's method

1. What is the stability of Newton's method and why is it important?

The stability of Newton's method refers to the ability of the algorithm to produce accurate and reliable results. It is important because if the method is not stable, the calculated solutions may be incorrect, leading to unreliable conclusions and predictions.

2. How is the stability of Newton's method measured?

The stability of Newton's method is typically assessed through its convergence behaviour: how quickly, and from how wide a range of starting values, the iterates approach the true solution. Near a simple root, Newton's method converges quadratically, meaning the number of correct digits roughly doubles at each iteration; a method that keeps this behaviour over a broad set of inputs is considered more robust.
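As a rough illustration (my own sketch), one can watch the quadratic convergence directly, e.g. for [itex]f(x) = x^2 - 2[/itex]:

[code]
import math

x, root = 3.0, math.sqrt(2.0)
for k in range(6):
    print(f"iteration {k}: error = {abs(x - root):.3e}")
    x -= (x * x - 2.0) / (2.0 * x)   # one Newton step
[/code]

Once the iterates are close to the root, the error is roughly squared at every step, i.e. the number of correct digits approximately doubles per iteration.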

3. What factors can affect the stability of Newton's method?

There are several factors that can affect the stability of Newton's method, including the initial guess, the choice of function, and the presence of multiple solutions. Additionally, the condition number of the function, which measures its sensitivity to small changes in the input, can also impact the stability of the method.

4. How does the condition of the function impact the stability of Newton's method?

The condition of the function is a crucial factor in the behaviour of Newton's method. A problem with a high condition number is more sensitive to small changes in the input, making it harder for any algorithm to pin down the true solution accurately. For root finding this shows up most clearly at multiple or clustered roots, where f'(r) is zero or tiny: convergence slows down and the attainable accuracy drops.
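A minimal sketch of that effect (my own code): at a double root, where f'(r) = 0, Newton's method degrades from quadratic to merely linear convergence:

[code]
def newton(f, fp, x, steps=30):
    for _ in range(steps):
        x -= f(x) / fp(x)
    return x

# Double root at x = 1: each Newton step only halves the error,
# so 30 steps still leave an error of about 2**-30 ~ 1e-9.
g  = lambda x: (x - 1.0) ** 2
gp = lambda x: 2.0 * (x - 1.0)
print(newton(g, gp, 2.0))   # ~1.0000000009
[/code]

For a simple root, the same 30 iterations would reach full double precision many times over.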

5. What are some strategies for improving the stability of Newton's method?

There are several strategies that can be used to improve the stability of Newton's method, including using a better initial guess, selecting a different function or variable transformation, and utilizing a different root-finding algorithm altogether. Additionally, adjusting the step size or implementing a damping factor can also help improve the stability of the method.
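As an example of the damping idea (a minimal sketch under my own assumptions, not a production implementation): shrink the Newton step until the residual |f| actually decreases:

[code]
import math

def damped_newton(f, fp, x, tol=1e-10, max_iter=100):
    """Newton's method with simple backtracking on the residual."""
    for _ in range(max_iter):
        step = f(x) / fp(x)
        t = 1.0
        # Halve the step until |f| decreases (or t gets tiny).
        while abs(f(x - t * step)) >= abs(f(x)) and t > 1e-8:
            t *= 0.5
        x -= t * step
        if abs(t * step) < tol:
            return x
    return x

# Plain Newton on f(x) = atan(x) diverges from x0 = 2; the damped
# version pulls the step back and still finds the root at 0.
print(damped_newton(math.atan, lambda x: 1.0 / (1.0 + x * x), 2.0))
[/code]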
