shaun_598
Hi,
I hope this is in the correct section; please feel free to move it, or advise me to move it, if it's in the wrong place.
I work as an aerospace engineer, and a sideline of what I do is maintaining the torque wrenches we use on our equipment. We have a set maintenance schedule: every 3 months we test them to a 10% tolerance on an Accratorque rig.
We're told, as I'm sure most others are, that you should return a torque wrench to its lowest setting to prevent it from becoming inaccurate. However, I've recently left some of the wrenches at their in-use settings, some at 50-75% of their maximum. On testing them at the 3-month intervals, they showed very little discrepancy, just like those returned to the low settings.
I'm curious: what is the theory behind winding them back to zero? And can anyone explain why there was no difference when I left them set?
Thanks