bluemystic
- Homework Statement
- Suppose I measure the length of something 5 times and average the values. Each measurement has its associated uncertainty. What is the uncertainty of the average?
- Relevant Equations
$$\mathrm{SD}=\sqrt{\frac{\sum_{i=1}^{N}(x_i-\bar{x})^2}{N-1}}$$
$$\text{Standard Error}=\frac{\mathrm{SD}}{\sqrt{N}}$$
Using the above formulas, we can arrive at an unbiased estimate of the standard deviation of the sample, then divide by sqrt(N) to arrive at the standard deviation of the average. What I'm confused about is where the measurement uncertainty comes into the equation. Is it being ignored? Say I take only two measurements and they turn out to be equal. Then the sample standard deviation is zero. But the true uncertainty of the average can't be 0, because of measurement uncertainty, can it?
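To make the two-equal-measurements case concrete, here is a minimal Python sketch of the formulas above (the measurement values are hypothetical, chosen just to show the zero-SD situation):

```python
import math

# Hypothetical repeated length readings (e.g. in cm); values are made up.
measurements = [10.2, 10.2]  # two equal readings -> sample SD is zero

N = len(measurements)
mean = sum(measurements) / N

# Unbiased sample standard deviation: SD = sqrt(sum((x - mean)^2) / (N - 1))
sd = math.sqrt(sum((x - mean) ** 2 for x in measurements) / (N - 1))

# Standard error of the mean: SEM = SD / sqrt(N)
sem = sd / math.sqrt(N)

print(f"mean = {mean:.3f}, SD = {sd:.3f}, SEM = {sem:.3f}")
# Output: mean = 10.200, SD = 0.000, SEM = 0.000
# The purely statistical estimate reports zero uncertainty here, which is
# exactly the puzzle: the instrument's own uncertainty never entered.
```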
On a side note, why can't I use error propagation of measurement uncertainties to obtain the uncertainty of the average, without considering standard deviation?
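For reference, the side question has a concrete form: propagating independent per-measurement uncertainties through the average avg = (x1 + ... + xN)/N gives sigma_avg = sqrt(sigma_1^2 + ... + sigma_N^2)/N. A short sketch, assuming independent uncertainties and illustrative sigma values:

```python
import math

# Hypothetical per-measurement instrument uncertainties (same units as data).
sigmas = [0.05, 0.05]  # e.g. each reading good to +/- 0.05 cm

N = len(sigmas)

# Propagating independent uncertainties through avg = (x1 + ... + xN) / N:
# sigma_avg = sqrt(sigma_1^2 + ... + sigma_N^2) / N
sigma_avg = math.sqrt(sum(s ** 2 for s in sigmas)) / N

print(f"propagated uncertainty of the average = {sigma_avg:.4f}")
# With equal sigmas this reduces to sigma / sqrt(N) (about 0.0354 here),
# which stays nonzero even when the two readings happen to coincide.
```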