fog37
- TL;DR Summary
- Understand the relation between precision and the number of significant figures in a number that results from a measurement.
Hello,
When we measure something with a measuring instrument, the measurement is a number with a finite number of significant figures. The rightmost digit is the uncertain one, the digit affected by uncertainty. A measurement should always be reported as the best estimate ± an error range, where the best estimate is the average of multiple measurements...
The more precise the instrument, the more precise the measurement, and the more sig figs in the measurement. So far so good: precision is reflected by the number of sig figs.
However, the other definition of precision is how close multiple measurements are to each other... This definition does not seem to relate precision to sig figs... What am I missing?
Are measurements with many sig figs also likely to be numerically very close to each other?
Thanks!
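To make the "best estimate ± error range" reporting concrete, here is a minimal Python sketch. The readings are made-up example values; the error range is taken as the standard error of the mean, which is one common (but not the only) choice:

```python
import statistics

# Hypothetical repeated readings of the same length (in cm) from one instrument
readings = [12.33, 12.35, 12.31, 12.34, 12.32]

# Best estimate: the average of the repeated measurements
best_estimate = statistics.mean(readings)

# Error range: standard error of the mean = sample std dev / sqrt(N).
# How tightly the readings cluster (the "closeness" definition of precision)
# directly sets the size of this error range.
std_error = statistics.stdev(readings) / len(readings) ** 0.5

# Report the estimate to the same decimal place as the error
print(f"{best_estimate:.2f} +/- {std_error:.2f} cm")
```

Note how the two definitions meet here: readings that cluster tightly give a small standard error, which in turn justifies quoting more significant figures in the best estimate.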