How to select the required accuracy of an instrument?

In summary, the poster asks how to determine the required accuracy of a measuring instrument when no tolerance is specified, using a valve stem diameter as an example and mentioning the 4:1 Test Accuracy Ratio (TAR). The Guide to the Expression of Uncertainty in Measurement (GUM) is suggested as a reference, along with a possible tolerance of ±0.0005 mm. Other replies recommend looking up the relevant standards document and considering the traceability of instruments. For the valve stem example, the typical required stem-to-guide clearance is about 0.001 inch (0.025 mm). The machining industry applies standard tolerances based on the number of decimal places, with a three-decimal-place dimension implying ±0.005.
  • #1
fonz
TL;DR Summary
Instrument accuracy and precision
If I need to make a measurement to check compliance e.g. measuring component dimensions. How do I know what accuracy is required for the measuring instrument if the tolerance is not specified?

For example, during an engine rebuild, the manufacturer specifies the valve stem diameter to be 5.973mm. Clearly the instrument needs a resolution of at least 0.001mm, but is that enough?

I am aware that in most cases, where a tolerance is specified, it is typical to aim for a Test Accuracy Ratio (TAR) of 4:1. In this case the tolerance is not specified, so is it correct to assume the tolerance is +/-0.0005mm? Therefore the instrument must have an accuracy of at least +/-0.000125mm?
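For what it's worth, the TAR arithmetic being asked about is just a division. A minimal sketch, assuming (as the question does) that an unstated tolerance is taken as half the last specified digit, so ±0.0005 mm on a 5.973 mm spec:

```python
# Sketch of the Test Accuracy Ratio (TAR) arithmetic described above.
# Assumption: with no tolerance stated, the tolerance is taken as half
# the last specified digit (here +/-0.0005 mm on a 5.973 mm spec).

def required_accuracy(tolerance, tar=4.0):
    """Return the maximum instrument accuracy (+/-) for a given
    symmetric tolerance (+/-) and Test Accuracy Ratio."""
    return tolerance / tar

tolerance = 0.0005  # mm, assumed from the three-decimal-place spec
accuracy = required_accuracy(tolerance, tar=4.0)
print(f"Required instrument accuracy: +/-{accuracy} mm")  # +/-0.000125 mm
```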

Thanks
 
  • #2
Paging @Ranger Mike, he probably has more direct knowledge than the below.

This will probably answer more questions than you thought existed!

GUM: Guide to the Expression of Uncertainty in Measurement


https://www.bipm.org/documents/2012...f-3f85-4dcd86f77bd6?version=1.7&download=true

(above found with:
https://www.google.com/search?&q=G.U.M.+guide+to+measurement)

Have fun, it's 'only' 134 pages.

Cheers,
Tom

p.s. It has been years since I read that document, but your accuracy assumption seems reasonable for a 'failsafe' situation where no failure can be tolerated.
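The core GUM recipe is to combine independent standard uncertainty components in quadrature and then apply a coverage factor. A minimal sketch of that idea; the component names and values below are invented for illustration, not taken from any real uncertainty budget:

```python
import math

# Sketch of the GUM approach: combine independent standard uncertainty
# components in quadrature (root-sum-square), then expand with a
# coverage factor k. The component values here are illustrative only.

components = {
    "instrument calibration": 0.0005,  # mm, assumed
    "repeatability":          0.0003,  # mm, assumed
    "temperature effects":    0.0004,  # mm, assumed
}

u_combined = math.sqrt(sum(u**2 for u in components.values()))
k = 2  # coverage factor for roughly 95% confidence, per the GUM
print(f"Combined standard uncertainty: {u_combined:.5f} mm")
print(f"Expanded uncertainty (k={k}):  {k * u_combined:.5f} mm")
```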
 
  • #3
There may be more tolerance in one direction than in the other, too. For example, a machine screw that's slightly too small could still be tolerably loose for its intended purpose, while one that's too big by the same amount might simply not fit without damaging the threads. For engine rebuild purposes, it would be helpful if the spec sheet stated the absolute and recommended tolerances along with the target value. @Mark44 has a great deal of actual experience rebuilding engines (see this thread; I think that for him it's more a personal pursuit than a primary profession, as he's also a programming expert and professor), and he may have some insights to offer here, as he's a very insightful and helpful person. :wink:
 
  • #4
If it is a part that is supposed to adhere to a standard (or at least a technical specification), the answer is to look up what the standards document says.
Yes, you can make assumptions based purely on the precision of the stated dimension, but a full standard is also likely to specify exactly HOW the measurement should be done and in what environment (in this case temperature would probably play a role).
Also, don't forget about the traceability of the instruments.
 
  • #5
I realize your valve stem is more of an example, but... The important feature of the valve stem diameter is the clearance between the stem and the valve guide. Typical required clearance is on the order of a thousandth of an inch; say 0.001 inch or 0.025 mm. Nowhere near the 0.001 mm discussed. As @f95toli said, a diameter spec to 0.001 mm (0.00004 inch) would need a corresponding temperature value (holding the valve in your hand would warm the valve and change the diameter by more than 0.001 mm).
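A quick linear-expansion estimate backs up the hand-warming point. This sketch assumes a typical coefficient for steel and a rough 15 K temperature rise from handling; both numbers are assumptions, not measurements:

```python
# Rough check of the hand-warming claim using linear thermal expansion:
# delta_D = alpha * D * delta_T. Assumed values: steel alpha and a
# ~15 K rise from handling; both are illustrative, not measured.

alpha = 12e-6      # 1/K, approximate for steel
diameter = 5.973   # mm, the valve stem spec from post #1
delta_T = 15       # K, e.g. warming from 20 C toward body temperature

delta_D = alpha * diameter * delta_T
print(f"Diameter change: {delta_D:.4f} mm")  # ~0.0011 mm, i.e. > 0.001 mm
```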
 
  • #6
When no other tolerances are provided, the machining industry uses the following standard tolerances:

1 decimal place (.x): ±0.2"
2 decimal places (.0x): ±0.01"
3 decimal places (.00x): ±0.005"
4 decimal places (.000x): ±0.0005"
See https://en.wikipedia.org/wiki/Engineering_tolerance

So it would seem that for your case the tolerance is ±0.005 mm (applying the three-decimal-place rule to a dimension given in millimetres).
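The decimal-place convention is easy to encode. A small sketch of the lookup; the table values are the inch tolerances quoted above, and the `implied_tolerance` helper is hypothetical, not from any standard library:

```python
# Sketch of the decimal-place tolerance convention quoted above.
# Keys are the number of decimal places in the nominal dimension;
# values are the default +/- tolerances (in inches, per the table).

DEFAULT_TOLERANCES = {1: 0.2, 2: 0.01, 3: 0.005, 4: 0.0005}

def implied_tolerance(nominal: str) -> float:
    """Infer the default tolerance from how a dimension is written."""
    places = len(nominal.split(".")[1]) if "." in nominal else 0
    return DEFAULT_TOLERANCES[places]

# The poster applies the same per-digit rule to a spec given in mm:
print(implied_tolerance("5.973"))  # 0.005
```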
 

FAQ: How to select the required accuracy of an instrument?

What factors should be considered when selecting the required accuracy of an instrument?

When selecting the required accuracy of an instrument, consider the intended use of the instrument, the level of precision needed for the measurements, the environment in which the instrument will be used, and the cost of the instrument.

How do I determine the level of precision needed for my measurements?

The level of precision needed for measurements depends on the specific application and the desired level of accuracy. This can be determined by considering the tolerances and uncertainties associated with the measurements, as well as the potential consequences of any errors in the measurements.

What is the difference between accuracy and precision in an instrument?

Accuracy refers to how close a measurement is to the true or accepted value, while precision refers to the consistency and reproducibility of the measurements. An instrument can be precise but not accurate, or accurate but not precise.
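The distinction can be shown numerically: bias against a true value reflects accuracy, while the spread of repeated readings reflects precision. The readings below are invented to illustrate an instrument that is precise but not accurate:

```python
import statistics

# Invented repeated readings of a part whose true diameter is 5.973 mm.
# Low bias = accurate; low spread = precise. These readings are tightly
# grouped (precise) but offset from the true value (not accurate).

true_value = 5.973
readings = [5.981, 5.982, 5.980, 5.981, 5.982]  # mm, illustrative

bias = statistics.mean(readings) - true_value  # accuracy error
spread = statistics.stdev(readings)            # precision

print(f"Bias (accuracy):     {bias:+.4f} mm")
print(f"Std dev (precision): {spread:.4f} mm")
```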

Is it better to choose an instrument with higher accuracy or higher precision?

It depends on the specific needs of the application. In some cases, a higher level of accuracy may be more important, while in others, precision may be the top priority. It is important to consider both factors when selecting an instrument.

How can I ensure the accuracy of an instrument over time?

To ensure the accuracy of an instrument over time, regular calibration and maintenance are necessary. This involves comparing the instrument's measurements to a known standard and making any necessary adjustments. It is also important to follow the manufacturer's instructions for proper use and storage of the instrument.
