Error of measurements taken with a multimeter

In summary, the company calculates the errors for digital multimeters by considering the tolerances of the components and the range of temperatures the meter is specified to operate over.
  • #1
MementoMori96
Hi, normally when we take a measurement with a multimeter we consider the error given by this type of table:

http://www.transcat.com/media/pdf/mete35xp.pdf

Do you know how these errors are calculated?
 
  • #3
No, I'm asking how these errors are calculated. How does the company estimate these errors?
 
  • #4
MementoMori96 said:
No, I'm asking how these errors are calculated. How does the company estimate these errors?
Presumably they are calculated based on the known tolerances of the components of the meter.
 
  • #5
MementoMori96 said:
No, I'm asking how these errors are calculated. How does the company estimate these errors?
They are by design, and subsequent design validation.

For example, during design you decide whether to use 1% tolerance resistors, or 0.1% tolerance resistors. The 0.1% resistors are more expensive, so you only use them in places that affect the overall accuracy.

You also typically would design custom ICs for such a product. And in those custom ICs there are design techniques to improve accuracy and matching of component values. The more sophisticated you make the IC, the better the accuracy specs that you can publish. But the tradeoff is that it takes extra silicon area to do all that matching (and sometimes trimming and calibrating), which makes the product more expensive.

So early in the design specification stage of a product, you figure out where you want to position such a product on the price/performance curve in the market, and then design accordingly.
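
For instance, here is a back-of-the-envelope sketch (Python, with made-up tolerance numbers, not any real meter's error budget) of how the contributions from component tolerances might be tallied for one voltage range:

```python
# Rough error budget for a DMM voltage range built from a resistive
# divider feeding an ADC.  All tolerance values are illustrative only.

# Component tolerances (fractional), chosen at design time
r_top_tol    = 0.001    # 0.1% precision resistor (more expensive than 1%)
r_bot_tol    = 0.001    # 0.1% precision resistor
vref_tol     = 0.0005   # 0.05% voltage-reference drift over temperature
adc_gain_tol = 0.0005   # ADC gain error remaining after factory trim

# Conservative worst case: assume every term contributes fully,
# in the same direction.
worst_case = r_top_tol + r_bot_tol + vref_tol + adc_gain_tol

# Less pessimistic estimate: add independent errors in quadrature (RSS).
rss = (r_top_tol**2 + r_bot_tol**2 + vref_tol**2 + adc_gain_tol**2) ** 0.5

print(f"worst case: ±{worst_case * 100:.2f}% of reading")
print(f"RSS:        ±{rss * 100:.2f}% of reading")
```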

Does that help? :smile:
 
  • #6
Thanks. Is it possible that these errors also contain systematic errors?
 
  • #7
MementoMori96 said:
Thanks. Is it possible that these errors also contain systematic errors?
As opposed to what? Random errors from thermal noise?
 
  • #8
I consider the error that I calculate from these tables to be a random error; is that a mistake?
 
  • #9
Part of the manufacturing and ongoing maintenance process is to calibrate the DMM against a known standard of superior specification, typically one with at least 4 times less error than the meter function being calibrated. I don't know if his name has resonance today, but one of the pioneers of electrical technology was Edward Weston, who developed a saturated cadmium cell that generated an extremely stable voltage and served as a NIST voltage standard up to the 1990s.
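
As a rough illustration of that 4:1 rule of thumb (hypothetical numbers, not an actual calibration procedure):

```python
# Check the rule of thumb that the calibration standard should have at
# least 4x less uncertainty than the meter spec being verified.
# Both figures below are made up for illustration.

meter_spec_uncertainty = 0.5    # % of reading, from the DMM's accuracy table
standard_uncertainty   = 0.05   # % of reading, from the reference's certificate

test_uncertainty_ratio = meter_spec_uncertainty / standard_uncertainty
print(f"TUR = {test_uncertainty_ratio:.1f}:1 "
      f"({'OK' if test_uncertainty_ratio >= 4 else 'standard not good enough'})")
```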

If you are interested in metrology in general, "Standard Cells: Their Construction, Maintenance, and Characteristics", Walter Hamer, 1965, has a history of how voltage standards were developed, although these days the NIST standard is based on Josephson junctions instead of electrochemical cells.

https://www.nist.gov/calibrations/voltage-measurements-calibrations
 
  • #10
MementoMori96 said:
I consider the error that I calculate from these tables to be a random error; is that a mistake?
It depends on the instrument and the thing being measured, but for a DVM I would think that the errors are more systematic than random. You mainly get random errors when thermal noise is significant in a measurement, like with a picoammeter. For standard ADC measurements, the errors tend to be more systematic.

For example, for a data acquisition system that I designed recently, I had offsets and gains that we calibrate at manufacturing test time, to ensure that we stay within the error tolerance numbers we publish. After that calibration procedure, I get a fairly straight line for my ADC transfer characteristic, but the line may still have a little offset and/or a little gain error (but it stays within the allowed error bands). It is not a random function within those error bands, since thermal noise is not a problem for this ADC system.
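
Roughly, the two-point offset/gain correction being described looks like this (a minimal sketch with invented ADC codes and reference voltages, not the actual production procedure):

```python
# Two-point calibration of an ADC channel: apply two known reference
# voltages at manufacturing test, solve for gain and offset, then use
# them to correct later readings.  All values are illustrative.

ref_low, ref_high   = 0.000, 4.000    # applied reference voltages (V)
code_low, code_high = 12, 52430       # raw ADC codes actually read back

# Linear model: volts = gain * code + offset
gain   = (ref_high - ref_low) / (code_high - code_low)
offset = ref_low - gain * code_low

def code_to_volts(raw_code: int) -> float:
    """Convert a raw ADC code to volts using the stored calibration."""
    return gain * raw_code + offset

print(f"{code_to_volts(26221):.4f} V")   # a reading near mid-scale
```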
 
  • #11
MementoMori96 said:
I consider the error that I calculate from these tables to be a random error; is that a mistake?
If your DMM has not been calibrated specially (you pay more for that), then the probable errors are those in your table, and they will be due to the tolerances of all the components in the instrument and the range of temperatures it is specified to operate over. A number of different instruments will have been tested to back that up. As Berkeman says, the errors are more likely to be systematic, but there will be a random element from instrument to instrument. You could calibrate your own at some chosen points; it might show, for example, that your meter consistently reads about 2% high on one particular range, even if the table tells you ±5%.
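
A quick sketch of that kind of spot check (Python, with invented readings showing a meter running about 2% high, well inside a ±5% table spec):

```python
# Spot-check a DMM against a trusted reference at a few points on one range.
# The readings below are invented for illustration.

reference_volts = [1.000, 2.000, 5.000, 10.000]
meter_readings  = [1.021, 2.039, 5.098, 10.205]

errors_pct = [100.0 * (m - r) / r for m, r in zip(meter_readings, reference_volts)]
mean_bias  = sum(errors_pct) / len(errors_pct)

for r, e in zip(reference_volts, errors_pct):
    print(f"{r:6.3f} V: {e:+.2f}%")
print(f"average bias: {mean_bias:+.2f}% (systematic, repeatable)")
```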
 

Related to Error of measurements taken with a multimeter

1. What is the "error" in a multimeter measurement?

The "error" in a multimeter measurement refers to the difference between the actual value of the quantity being measured and the value displayed on the multimeter. This difference can be caused by various factors such as instrument limitations, environmental conditions, and human error.

2. How is the error calculated in a multimeter measurement?

The error in a multimeter measurement is usually expressed as a percentage of the full-scale reading or as a percentage of the actual value being measured. It is calculated by taking the difference between the measured value and the true value, divided by the true value, and multiplied by 100.
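
Expressed as a short calculation (values chosen purely for illustration):

```python
# Percent error of a single reading relative to the true (reference) value.
true_value     = 5.000   # volts, from a trusted reference
measured_value = 5.060   # volts, shown on the multimeter

percent_error = (measured_value - true_value) / true_value * 100
print(f"{percent_error:+.1f}%")   # +1.2%
```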

3. What factors can contribute to the error in a multimeter measurement?

There are several factors that can contribute to the error in a multimeter measurement, including the accuracy and precision of the multimeter, the quality and condition of the test leads, the stability and calibration of the instrument, and the skill and technique of the person taking the measurement.

4. How can the error in a multimeter measurement be minimized?

To minimize the error in a multimeter measurement, it is important to use a high-quality, calibrated multimeter and test leads. It is also important to ensure that the instrument is stable and properly calibrated before taking measurements. Additionally, following proper measurement techniques and taking multiple readings can help reduce the error.
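
For example (hypothetical readings), averaging repeated readings and looking at their spread:

```python
# Averaging repeated readings reduces the random part of the error;
# the spread (standard deviation) indicates how noisy the measurement is.
# Note this does nothing for a systematic offset in the meter itself.
from statistics import mean, stdev

readings = [5.04, 5.06, 5.05, 5.07, 5.05]   # volts, hypothetical

print(f"mean:   {mean(readings):.3f} V")
print(f"spread: {stdev(readings):.3f} V (sample standard deviation)")
```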

5. Is there a way to eliminate the error in a multimeter measurement?

No, it is not possible to completely eliminate the error in a multimeter measurement. However, by using high-quality instruments, following proper techniques, and taking multiple readings, the error can be minimized and kept within an acceptable range for accurate measurements.
