JohnayG
Hello,
I am trying to work out the total uncertainty for a measurement, so I am using total uncertainty = √(reading uncertainty² + calibration uncertainty² + random uncertainty²).
I have found in a textbook of mine that the calibration uncertainty for a digital meter (which I used to measure all my voltage results) is plus or minus 0.5% of the reading + 1 in the least significant digit. I have already made up tables with all my results for V, but does this mean I have to work out a calibration uncertainty for every single measurement made, i.e. V at 10 ms, V at 30 ms, etc.? And how would I work out a single value for the calibration uncertainty to use in the equation?
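To make my question concrete, here is a small Python sketch of how I understand the calculation so far, applied per reading. The function names, the example voltage, and the meter resolution are just made up for illustration; I am assuming the "1 in the least significant digit" part equals the meter's display resolution:

```python
import math

def calibration_uncertainty(reading, resolution, percent=0.5):
    # Digital meter spec from my textbook (assumed interpretation):
    # 0.5% of the reading plus 1 in the least significant digit,
    # where one least-significant-digit step equals `resolution`.
    return (percent / 100.0) * reading + resolution

def total_uncertainty(reading_unc, cal_unc, random_unc):
    # Combine the three independent uncertainties in quadrature.
    return math.sqrt(reading_unc**2 + cal_unc**2 + random_unc**2)

# Hypothetical example: V = 4.72 V on a meter displaying to 0.01 V
cal = calibration_uncertainty(4.72, 0.01)  # 0.005 * 4.72 + 0.01 = 0.0336 V
total = total_uncertainty(0.01, cal, 0.02)
```

Is this the right idea, i.e. a separate calibration value for each reading, or is there a way to get one representative value for the whole data set?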
Thanks very much for your time