Troubleshooting Turbine Flowmeter Calibration

AI Thread Summary
Calibration of turbine flowmeters for fuel rigs is challenging, particularly in establishing a reliable process for determining the line of best fit and its associated uncertainty. The current method takes 20 readings (10 ascending and 10 descending) and applies a least-squares fit for calibration, with specific criteria for outliers. There is a suggestion to use Grubbs' outlier test and to calculate the mean and standard deviation for each reading to assess uncertainty properly. The process must remain straightforward for shop-floor personnel, and LabVIEW is used for data capture and analysis. The key question is whether multiple readings at each point are necessary to calculate the mean and standard deviation for uncertainty estimation.
robsmith82
Hi,

I'm having a lot of trouble trying to set up a process for calibrating turbine flowmeters for use in fuel rigs.

Basically, I have a calibrated "master" flowmeter in series with the uncalibrated turbine flowmeter. I need a process that will give me the line of best fit across the flowmeter's range, and also state its uncertainty to a given confidence level (probably 95%).

At the minute, our process isn't very good. We take 10 points going up the range and 10 coming back down, then fit a line of best fit by least squares, which gives you your gain and offset for calibration. We then say the calibration has failed if any point lies more than 0.5% of full scale from the line. We take outliers into account by allowing up to 2 points between 0.5% and 1% of full scale, provided they are not at the top or bottom limits and not consecutive. I think we should be using Grubbs' outlier test instead.
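For what it's worth, the fit-and-check arithmetic described above is only a few lines outside LabVIEW too. A minimal Python sketch of the current process (all readings, the simulated meter response, and the 100-unit full scale are made-up illustrative numbers, not real calibration data):

```python
import numpy as np

# Hypothetical 20-point run: 10 set points ascending, the same 10 descending.
master = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100,
                   100, 90, 80, 70, 60, 50, 40, 30, 20, 10], dtype=float)
dut = 0.98 * master + 0.3          # simulated device-under-test response
full_scale = 100.0                 # assumed full-scale value

# Least-squares line of best fit: dut = gain * master + offset
gain, offset = np.polyfit(master, dut, 1)

# Residuals from the fitted line, as a percentage of full scale
residuals = dut - (gain * master + offset)
pct_fs = 100.0 * np.abs(residuals) / full_scale

# Current acceptance rule: every point within 0.5% of full scale
passed = bool(np.all(pct_fs <= 0.5))
print(f"gain={gain:.4f}, offset={offset:.4f}, pass={passed}")
```

The same gain/offset then get loaded into the rig as the calibration coefficients; the 2-points-between-0.5%-and-1% concession would be an extra count on `pct_fs` rather than the simple `all()` shown here.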

What I think I have to do is find the mean and SD for each value up the range, then use the worst-case SD to calculate the uncertainty. This needs to be carried out by shop-floor personnel, so it can't be too complicated. We are using LabVIEW to capture the data, plot the line and show "pass" or "fail".
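On the Grubbs' test idea mentioned above: applied to the fit residuals, the standard two-sided test is short enough to hide behind a pass/fail light. A sketch (assumes roughly normal residuals; `alpha` and the example values are illustrative):

```python
import numpy as np
from scipy import stats

def grubbs_outlier(residuals, alpha=0.05):
    """One-pass, two-sided Grubbs' test on a set of fit residuals.

    Returns the index of the most extreme point if it fails the
    test at significance level alpha, otherwise None.
    """
    x = np.asarray(residuals, dtype=float)
    n = len(x)
    mean, sd = x.mean(), x.std(ddof=1)
    g = np.max(np.abs(x - mean)) / sd          # Grubbs' statistic
    # Critical value from the Student-t distribution (standard formula)
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
    if g > g_crit:
        return int(np.argmax(np.abs(x - mean)))
    return None
```

Note that Grubbs' test only flags one point per pass; for a 20-point run you would re-fit and re-test after removing a flagged point, and decide separately how many removals are acceptable before declaring a failed calibration.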

Help me out guys!
 


A 10-point cal is pretty standard for most flow meters. I believe you can go so far as to request viscosity calibrations as well. What exactly is it that you're having trouble with?

We use Cox flow meters (among others). They have a nice cal setup for their meters.
http://www.cox-instruments.com/calibration.html
 


What I'm really struggling with is whether I need to repeat each point a number of times to get a mean and SD for that point, so I can find and state the uncertainty, or whether a single reading per point is enough.
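If you do take repeat readings, the per-point arithmetic is small: the Type A standard uncertainty of the mean is the sample SD over the square root of the number of repeats, expanded to 95% with a Student-t coverage factor (appropriate for small repeat counts). A sketch, with made-up readings at one flow set point:

```python
import numpy as np
from scipy import stats

def point_uncertainty(readings, confidence=0.95):
    """Mean, sample SD, and expanded uncertainty of the mean for
    one calibration point from n repeat readings (Type A evaluation)."""
    x = np.asarray(readings, dtype=float)
    n = len(x)
    mean = x.mean()
    sd = x.std(ddof=1)                    # sample standard deviation
    sem = sd / np.sqrt(n)                 # standard uncertainty of the mean
    k = stats.t.ppf(0.5 + confidence / 2, n - 1)  # coverage factor
    return mean, sd, k * sem

# e.g. five repeat readings at a nominal 50-unit set point (illustrative)
mean, sd, u95 = point_uncertainty([49.8, 50.1, 50.0, 49.9, 50.2])
```

A single reading per point cannot separate the meter's scatter from the fit, which is why the worst-case per-point SD (or the residual SD of the whole fit) is what usually feeds the stated uncertainty.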
 