kelly0303
Hello! Say I have some measurements ##y_i = f(x_i|a_1,a_2,...,a_n)## for different values of ##x_i##, where the ##a_i## are the parameters of the function I want to fit. For example, for a linear function I would just have ##y_i = a x_i + b = f(x_i|a,b)##. I want to see how the errors on the ##y_i## propagate into errors on the parameters of the function. What is the best way to simulate this?

My idea was to generate values of ##y_i## from the function, then replace each ##y_i## by a randomly generated number ##y_i'## drawn from a Gaussian distribution with mean ##y_i## and standard deviation ##\sigma_{y_i}##. I would then do a least-squares fit of ##y_i' = f(x_i)##, weighted by the errors ##\sigma_{y_i}##, and from that fit obtain a value for each parameter with an associated error, ##a_i \pm \sigma_{a_i}##. Then I would change the values of ##\sigma_{y_i}## and repeat the whole procedure, so that I would end up with something of the form ##\sigma_{a_i}## vs ##\sigma_{y_i}##. I am not totally sure, though, whether this is the best approach. Can someone tell me how to do it? Thank you!
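To make the procedure concrete, here is a minimal sketch of the approach described above, assuming a linear model ##f(x|a,b) = ax + b## with made-up "true" values for ##a## and ##b## and a single ##\sigma_y## shared by all points. It uses `scipy.optimize.curve_fit` for the weighted least-squares step (one of several libraries that could be used) and reads ##\sigma_{a_i}## off the diagonal of the returned covariance matrix; the loop over `sigma_y_values` is the "change ##\sigma_{y_i}## and repeat" step. One could also repeat the noise draw many times per ##\sigma_y## and look at the spread of the fitted parameters instead of the fitter's reported errors.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical linear model y = a*x + b; replace with your own f(x | a_1, ..., a_n).
def f(x, a, b):
    return a * x + b

rng = np.random.default_rng(0)

a_true, b_true = 2.0, 1.0          # assumed "true" parameter values (illustrative only)
x = np.linspace(0.0, 10.0, 20)     # measurement points x_i
y_true = f(x, a_true, b_true)      # noiseless y_i generated from the model

sigma_y_values = [0.1, 0.2, 0.5, 1.0]   # measurement uncertainties to scan over

for sigma_y in sigma_y_values:
    # Replace each y_i by a Gaussian draw with mean y_i and standard deviation sigma_y
    y_noisy = rng.normal(y_true, sigma_y)
    sigma = np.full_like(x, sigma_y)

    # Weighted least-squares fit; absolute_sigma=True so the reported parameter
    # covariance reflects the supplied sigma_y rather than a rescaled estimate
    popt, pcov = curve_fit(f, x, y_noisy, sigma=sigma, absolute_sigma=True)
    perr = np.sqrt(np.diag(pcov))   # sigma_{a_i} for each fitted parameter

    print(f"sigma_y = {sigma_y:.2f}:  a = {popt[0]:.3f} +/- {perr[0]:.3f},  "
          f"b = {popt[1]:.3f} +/- {perr[1]:.3f}")
```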