richardc
Finding the Uncertainty of the Slope Parameter of a Linear Regression
Suppose I have measurements [itex]x_i \pm \sigma_{xi}[/itex] and [itex]y_i \pm \sigma_{yi}[/itex] where [itex]\sigma[/itex] is the uncertainty in the measurement. If I use a linear regression to estimate the value of [itex]b[/itex] in [itex]y=a+bx[/itex], I'm struggling to find a straightforward way to compute the uncertainty of [itex]b[/itex] that arises from the measurement uncertainties. This seems like it should be a very common problem, so I'm not sure why I can't find a simple algorithm or formula.
Thank you for any advice.
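To make the question concrete: for the special case where only the y uncertainties matter, I know the standard weighted least-squares formulas (the Bevington-style sums), sketched below. What I can't see is how to extend this when the [itex]\sigma_{xi}[/itex] are not negligible. (The function name `wls_slope` is just illustrative.)

```python
import numpy as np

def wls_slope(x, y, sy):
    """Weighted least-squares fit of y = a + b*x, assuming negligible
    x uncertainties. Returns the slope b and its standard error,
    using the standard weighted-sums formulas."""
    w = 1.0 / sy**2                      # weights from y uncertainties
    S, Sx, Sy = w.sum(), (w * x).sum(), (w * y).sum()
    Sxx, Sxy = (w * x * x).sum(), (w * x * y).sum()
    delta = S * Sxx - Sx**2
    b = (S * Sxy - Sx * Sy) / delta      # weighted slope estimate
    sigma_b = np.sqrt(S / delta)         # its propagated uncertainty
    return b, sigma_b
```

This handles [itex]\sigma_{yi}[/itex] only; folding in the [itex]\sigma_{xi}[/itex] is exactly the part I'm stuck on.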