heroslayer99
- Homework Statement
- The Hipparcos space telescope used stellar parallax with a precision of 9.7 × 10⁻⁴ arcseconds to determine the distance to stars. Estimate the maximum stellar distance in parsecs that could be measured using Hipparcos. Calculate the percentage uncertainty in the calculated value of the distance to Polaris A if the parallax angle is 7.5 × 10⁻³ arcseconds.
- Relevant Equations
- d = 1/p
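To set out the arithmetic I think the question intends, here is a short sketch, assuming the quoted "precision" is the absolute uncertainty in the parallax angle (which is exactly the point I'm unsure about below), and that since d = 1/p the fractional uncertainty in d equals the fractional uncertainty in p to first order:

```python
# Sketch of the intended calculation. Assumption: the quoted "precision"
# of 9.7e-4 arcsec is the absolute uncertainty in the parallax angle.

precision = 9.7e-4   # arcseconds (smallest measurable parallax, assumed)
p_polaris = 7.5e-3   # arcseconds (parallax angle of Polaris A)

# d = 1/p, with p in arcseconds and d in parsecs.
# The smallest measurable angle gives the largest measurable distance.
d_max = 1.0 / precision          # ~1031 pc

# Since d = 1/p, to first order |delta_d / d| = |delta_p / p|,
# so the percentage uncertainty in d equals that in p.
percent_uncertainty = precision / p_polaris * 100   # ~12.9 %

print(d_max, percent_uncertainty)
```

This gives roughly 1031 pc and 12.9%, but whether the uncertainty should instead be half the quoted value is the question I'm asking below.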
First off, I will set out what I think I know so that any misconceptions of mine can be put right.
Definitions:
Precision: a quality denoting the closeness of agreement between (consistency, low variability of) measured values obtained by repeated measurements
Accuracy: A quality denoting the closeness of agreement between a measured value and the true value
Uncertainty: interval within which the true value can be expected to lie
Resolution: Smallest increment on the instrument
For a single reading, the absolute uncertainty is half the resolution of the instrument. For a measurement (the difference between two readings), the absolute uncertainty is twice that of a single reading, and twice the uncertainty in one reading is clearly just the resolution. For digital devices, such as a voltmeter, approximate the absolute uncertainty as half the resolution (the same as for a single reading).
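The conventions above can be written out numerically. This is only a toy illustration, using a hypothetical analogue ruler with a 1 mm resolution:

```python
# Toy illustration of the uncertainty conventions described above,
# for a hypothetical analogue ruler with 1 mm resolution.

resolution = 1.0  # mm (smallest increment on the instrument)

single_reading_unc = resolution / 2        # half the resolution per reading
measurement_unc = 2 * single_reading_unc   # a length is the difference of
                                           # two readings, so the
                                           # uncertainties add: 1.0 mm
digital_unc = resolution / 2               # digital device: treated the
                                           # same as a single reading

print(single_reading_unc, measurement_unc, digital_unc)
```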
What confuses me greatly is that the problem states the "precision" of the telescope is 9.7 × 10⁻⁴ arcseconds, but from what I already know, precision (at least at the level I am working at, and the level the textbook is written for) cannot be quantified, so I do not know what that value means. Most likely it is the resolution, and the author has made a mistake. I am also confused as to whether the absolute uncertainty is the quoted "resolution" or half that value. Finally, if this is the resolution, then the stated value for the parallax angle should be a multiple of it; it turns out it isn't.
Please help :(