kandelabr
I have a set of data points that I must fit to an inverse function like this:
y(x) = a/(x+b) + c
My problem is that least-squares fitting with this equation is extremely unstable and heavily dependent on the initial guess. No matter how accurate the starting parameters are, the algorithm often goes berserk (trying 3.5e+58 instead of 1.2, etc.). It also doesn't matter which algorithm I use.
I guess there must be some mathematical preconditioning voodoo or something that I could use, but I can't find anything that works for me.
Any ideas?
Thanks!
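For reference, here's the kind of reformulation I've been trying to find, as a sketch (assuming the pole x = -b lies to the left of the data, so x + b stays positive): for a fixed b the model is linear in a and c, so those two can be solved exactly with ordinary linear least squares, and only b needs a 1-D search. That removes most of the instability because the nonlinear search is over a single bounded parameter.

```python
import numpy as np

def fit_inverse(x, y):
    """Fit y = a/(x+b) + c by profiling out the linear parameters.

    For a fixed b the model is linear in (a, c), so each candidate b
    reduces to an ordinary linear least-squares solve. Scanning b on a
    grid and refining around the winner sidesteps the unstable
    3-parameter nonlinear search.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    def profile(b):
        u = 1.0 / (x + b)                        # linearized regressor
        A = np.column_stack([u, np.ones_like(u)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        sse = float(np.sum((A @ coef - y) ** 2))
        return sse, coef

    # search window keeping x + b strictly positive (assumption:
    # the pole of the fitted curve sits left of the data range)
    b_min = -x.min() + 1e-6
    lo, hi = b_min + 1e-3, b_min + 10 * np.ptp(x)
    b = lo
    for _ in range(3):                           # coarse scan, then refine twice
        grid = np.linspace(lo, hi, 400)
        sses = [profile(bb)[0] for bb in grid]
        k = int(np.argmin(sses))
        b = grid[k]
        step = grid[1] - grid[0]
        lo, hi = max(b - step, b_min), b + step
    _, (a, c) = profile(b)
    return a, b, c
```

A quick check with synthetic, noiseless data (a = 1.2, b = 0.5, c = 2.0) recovers the parameters without any initial guess at all, which is the point of the profiling trick.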