Sorry if I'm in the wrong subforum.
This is a rather simple and straightforward question, I hope.
I'm doing a measurement that requires me to do a linear regression on data points to get a value of the slope. The slope is the value of the actual property that I am measuring.
Assuming no uncertainty in the data points that are being fit, can I simply use the standard deviation of the slope (output by fitting software) as the uncertainty in that measurement? Is this standard practice?
I ask because the standard deviation in the slope is quite small and results in an uncertainty that, to me, seems unreasonably small.
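For reference, the quantity most fitting packages report is the ordinary least-squares standard error of the slope. Here is a minimal sketch of how it is computed by hand, using made-up data (the actual software and data in the question are unspecified):

```python
import numpy as np

# Hypothetical data standing in for the measured points in the question
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

n = len(x)
slope, intercept = np.polyfit(x, y, 1)

# Residual variance with n - 2 degrees of freedom
# (two parameters fitted: slope and intercept)
residuals = y - (slope * x + intercept)
s2 = np.sum(residuals**2) / (n - 2)

# Standard error of the slope: sqrt(s2 / sum((x - mean(x))^2))
se_slope = np.sqrt(s2 / np.sum((x - x.mean())**2))

print(slope, se_slope)
```

Note that this standard error only reflects the scatter of the points about the fitted line; it does not include any systematic uncertainty in the measured quantities themselves, which may be why it looks unreasonably small.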