annie122
I have a question about a programming exercise I'm working on in C.
The problem is to "Write a program that uses Newton's method to approximate the nth root of a number to six decimal places." The problem also said to terminate after 100 trials if it failed to converge.
Q1. What does "converge" mean?
Does it mean the difference between two successive approximations can be made as small as I like?
Q2. On what condition should the program terminate?
There are two such conditions: 1) the loop has been executed 100 times; 2) the difference between the "true" answer and the approximation is less than 0.000001.
I know how to set 1), but how should I express 2)?
Right now, I am setting the condition as
|approximation - root| < 0.000001,
but I feel it's kind of cheating, because I'm not supposed to know the real answer if I'm making approximations.
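For reference, here is a sketch of what my loop looks like right now. The function and variable names (nth_root, x, c, n) are my own, and I use pow() from math.h to get the "true" root, which is the part that feels like cheating:

```c
#include <math.h>

/* Sketch of my current approach. The "true" root comes from pow(),
   which is what I suspect I'm not supposed to do. */
double nth_root(double c, int n)
{
    double x = c / n + 1.0;            /* rough initial guess */
    double root = pow(c, 1.0 / n);     /* the "true" answer */
    int i;

    for (i = 0; i < 100; i++) {        /* condition 1: at most 100 trials */
        if (fabs(x - root) < 0.000001) /* condition 2: close to the true root */
            break;
        /* Newton step for f(x) = x^n - c */
        x = x - (pow(x, n) - c) / (n * pow(x, n - 1));
    }
    return x;
}
```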
Are there any other ways to express the condition, especially one involving the function f(x) = x^n - c (whose root x is the nth root of c)?
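One idea I had, but am not sure about, is to compare two successive approximations instead (or to test |f(x)| = |x^n - c| directly), roughly like this:

```c
#include <math.h>

/* My guess at a criterion that doesn't need the true root:
   stop when two successive approximations differ by less
   than 0.000001. Again, the names are my own. */
double nth_root_guess(double c, int n)
{
    double x = c / n + 1.0;   /* same rough initial guess */
    int i;

    for (i = 0; i < 100; i++) {
        /* Newton step for f(x) = x^n - c */
        double next = x - (pow(x, n) - c) / (n * pow(x, n - 1));
        if (fabs(next - x) < 0.000001)  /* successive iterates agree */
            return next;
        x = next;
    }
    return x;  /* give up after 100 trials */
}
```

Would something like this be correct, or is the |f(x)| test the intended one?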