
CALIBRATION

CHAPTER 13
INSTRUMENTATION AND CONTROL
CALIBRATION DEFINED
- The comparison of a measurement ensemble of unverified uncertainty against a
calibration ensemble of quantified uncertainty, in order to detect or correct
any deviation from required performance specifications

- Determination of the experimental relationship between the
quantity being measured and the output of the device that
measures it, where the quantity measured is obtained through a
standardizing laboratory

- The process of comparing the response of an instrument to a
standard instrument over the measurement range so that the two agree
THE IMPORTANCE OF CALIBRATION
- Calibration is an essential element of instrument
performance that compares instrument inputs and
outputs against known standards. Because instruments
operate within a larger system in sequential loops, no
instrument within such a loop can perform its assigned
function in the process unless it receives accurate
information from the other instruments in the loop. An
instrument that is not properly calibrated cannot send
accurate information to the next instrument or control
element in the sequence.
ACCURACY AND REPEATABILITY

- Accuracy is a measure of how closely an instrument's outputs match its inputs.
Instrument accuracy is a function of the difference, or deviation, between
input and output and is typically recorded as a percentage of the signal span.

- Accurate calibration is dependent upon the comparison of
instrument input to output signals. A properly calibrated
instrument, therefore, will output a signal that is 30% of its output
span if it is sent an input signal that is 30% of the input span.

ACCURACY = (DEVIATION/SPAN) X 100%


SPAN

- Refers to the difference between upper and lower range limits.

SPAN = UPPER RANGE VALUE – LOWER RANGE VALUE


GAIN

- Gain is normally expressed as a function of signal unit per input unit
(e.g., mA/psi or mA/°C). A high gain thus restricts the input range but
does not imply a decreased ability to calibrate accurately.

GAIN = SIGNAL SPAN / INPUT SPAN

DEVIATION

- The difference between the value of a specific variable and some desired
value, usually a process set point.

DEVIATION = MEASURED OUTPUT – EXPECTED OUTPUT
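
Taken together, the span, gain, deviation, and accuracy formulas above describe a single calculation. The following short Python sketch walks through them for an assumed 4-20 mA transmitter ranged 0-100 psi; the transmitter range and the measured reading are hypothetical values used only for illustration.

# Span, gain, deviation, and accuracy for an assumed 4-20 mA transmitter
# ranged 0-100 psi. All numbers are hypothetical, for illustration only.
input_lrv, input_urv = 0.0, 100.0        # psi (lower/upper range values)
output_lrv, output_urv = 4.0, 20.0       # mA

input_span = input_urv - input_lrv       # SPAN = URV - LRV
output_span = output_urv - output_lrv
gain = output_span / input_span          # GAIN = signal span / input span (mA/psi)

applied_input = 30.0                     # psi, i.e. 30% of input span
expected_output = output_lrv + gain * (applied_input - input_lrv)
measured_output = 8.75                   # mA, hypothetical as-found reading

deviation = measured_output - expected_output        # DEVIATION
accuracy = abs(deviation) / output_span * 100.0      # ACCURACY, % of signal span

print(f"gain = {gain:.3f} mA/psi")
print(f"expected = {expected_output:.2f} mA, deviation = {deviation:+.2f} mA")
print(f"accuracy at this point = {accuracy:.2f} % of span")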


CALIBRATION CHECKLIST
• Identify the function of the instrument in the system and
the instrument specifications.
• Select the necessary test equipment
• Input measurement standard
• Output measurement standard
• Connect the test equipment to the instrument under test.
• Determine the test points for a five-point check (0%, 25%,
50%, 75%, and 100% of instrument span).
• Calculate the expected output for each test point (a worked
sketch follows this checklist).
• Perform the five point check upscale and downscale and
record the results.
• Identify the point of greatest deviation and calculate the
accuracy of the instrument. Compare to specifications to
determine whether the instrument requires calibration.
• Identify the instrument errors and, if necessary, adjust the
zero.
• Adjust the span.
• Recheck the zero and span and adjust if necessary.
• Perform another five-point check upscale and downscale
and record the results. Calculate final accuracy and
compare to instrument specifications.
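
As a worked illustration of the checklist arithmetic, the sketch below computes the expected output at each of the five test points and the resulting as-found accuracy. The 4-20 mA / 0-100 psi range and the as-found readings are assumed values, not data from a real instrument, and the sketch is not a substitute for the actual test procedure.

# Five-point check for an assumed 4-20 mA transmitter ranged 0-100 psi.
# The as-found readings are hypothetical, for illustration only.
test_points = [0, 25, 50, 75, 100]                  # % of instrument span
out_lrv, out_span = 4.0, 16.0                       # mA

def expected_output(pct):
    """Ideal output (mA) at a test point given as % of input span."""
    return out_lrv + out_span * pct / 100.0

as_found = {0: 4.05, 25: 8.02, 50: 11.98, 75: 15.94, 100: 19.90}  # mA, upscale

worst = 0.0
for pct in test_points:
    dev = as_found[pct] - expected_output(pct)      # deviation at this point, mA
    worst = max(worst, abs(dev))
    print(f"{pct:3d}%  expected {expected_output(pct):6.2f} mA  "
          f"found {as_found[pct]:6.2f} mA  deviation {dev:+.2f} mA")

print(f"as-found accuracy = +/-{worst / out_span * 100.0:.2f} % of span")

In practice the same calculation is repeated for the downscale readings and again after the zero and span adjustments.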
TEST PROCEDURES

- The purpose of the test procedures, as described herein, is to
illustrate and clarify accuracy-related terms. It is intended only that
the procedures indicate a generalized method of test.

The test procedures that follow are for the following terms:
• Measured accuracy
• Dead band
• Point drift
• Hysteresis
• Independent linearity
• Terminal based linearity
• Zero based linearity
• Repeatability
• Reproducibility
MEASURED ACCURACY

- Measured accuracy may be determined from the deviation values of a number
of calibration cycles. It is the greatest positive and negative deviation of the
recorded values (from both upscale and downscale output traverses) from
the reference or zero-deviation line. Measured accuracy may be expressed as a
plus and minus percent of ideal output span.
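
A minimal sketch of that computation, assuming hypothetical deviation values (already expressed in percent of ideal output span) recorded over the upscale and downscale traverses of several calibration cycles:

# Measured accuracy from the deviation values of several calibration cycles.
# Deviations are hypothetical, in percent of ideal output span.
upscale_devs   = [+0.20, +0.15, +0.25, -0.05, +0.10]
downscale_devs = [-0.30, -0.10, +0.05, -0.25, -0.15]

all_devs = upscale_devs + downscale_devs
print(f"measured accuracy = +{max(all_devs):.2f} % / {min(all_devs):.2f} % "
      "of ideal output span")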

DEAD BAND

- Maintaining test conditions and with the test device preconditioned, the
increment through which the input signal is varied (the difference between
steps 2 and 4 of the procedure) is the dead band. The maximum value is
reported. The dead band should be determined at a number of points to make
certain that the maximum dead band has been observed. Dead band may be
expressed as a percent of input span.
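
As a small numeric illustration, assuming hypothetical input increments (the change in input needed before the output responds) recorded at several points across an assumed 0-100 psi input span:

# Dead band from hypothetical input increments at several test points.
input_span = 100.0                              # psi, assumed
increments = {25: 0.12, 50: 0.18, 75: 0.15}     # psi needed to move the output

dead_band = max(increments.values()) / input_span * 100.0   # worst case
print(f"dead band = {dead_band:.2f} % of input span")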
POINT DRIFT

- In evaluating the results of this test it is presumed that the dead band
is either negligible or of such a nature that it will not affect the value of
drift. Point drift is the maximum change in recorded output value
observed during the test period. It is expressed in percent of ideal
output span for a specified time period.
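
A brief sketch, assuming hypothetical output readings taken at a constant input over a 24-hour test period and taking the drift as the spread between the highest and lowest readings:

# Point drift from hypothetical readings at a constant input over 24 h.
readings = [12.00, 12.02, 12.05, 12.04, 12.07, 12.06]   # mA
output_span = 16.0                                        # mA, assumed

drift = (max(readings) - min(readings)) / output_span * 100.0
print(f"point drift = {drift:.2f} % of ideal output span per 24 h")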

HYSTERESIS

- Hysteresis results from the inelastic quality of an element or device. Its effect
is combined with the effect of dead band, and the combined effect may be
determined directly from the deviation values. Hysteresis is then determined
by subtracting the value of dead band from the corresponding value of
hysteresis plus dead band for a given input. The difference may be expressed
as a percent of ideal output span.
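
A minimal sketch, assuming hypothetical upscale and downscale readings at one input point and a separately determined dead band value:

# Hysteresis = (hysteresis + dead band) - dead band, at one input point.
output_span = 16.0           # mA, assumed
upscale_reading = 11.90      # mA at 50% input, approached from below (hypothetical)
downscale_reading = 12.14    # mA at 50% input, approached from above (hypothetical)
dead_band = 0.18             # % of span, determined separately (hypothetical)

hyst_plus_db = abs(downscale_reading - upscale_reading) / output_span * 100.0
print(f"hysteresis = {hyst_plus_db - dead_band:.2f} % of ideal output span")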
INDEPENDENT LINEARITY

- Independent linearity is the maximum deviation between the
average deviation curve and the straight line. It is determined from
the deviation plots of a number of calibration cycles. It is measured in
terms of independent nonlinearity as a plus or minus percent of ideal
output span.
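
The sketch below illustrates the idea, using an ordinary least-squares fit as a simple stand-in for the reference straight line and hypothetical average outputs; terminal-based and zero-based linearity follow the same computation but fix the reference line differently.

# Independent nonlinearity: maximum deviation of the average curve from a
# reference straight line (least-squares fit used here as a stand-in).
inputs  = [0, 25, 50, 75, 100]                # % of input span
outputs = [0.0, 24.6, 49.8, 75.5, 100.0]      # hypothetical average outputs, % of span

n = len(inputs)
mean_x = sum(inputs) / n
mean_y = sum(outputs) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(inputs, outputs))
         / sum((x - mean_x) ** 2 for x in inputs))
intercept = mean_y - slope * mean_x

nonlin = max(abs(y - (slope * x + intercept)) for x, y in zip(inputs, outputs))
print(f"independent nonlinearity ~ +/-{nonlin:.2f} % of ideal output span")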

TERMINAL-BASED LINEARITY

- Terminal-based linearity is the maximum deviation between the average
deviation curve and the straight line. It is determined from the deviation plots
of a number of calibration cycles. It is measured in terms of terminal-based
nonlinearity as a plus or minus percent of ideal output span.
ZERO-BASED LINEARITY

- Zero-based linearity is the maximum deviation between the average deviation
curve and the straight line. It is determined from the deviation plots of a number
of calibration cycles. It is measured in terms of zero-based nonlinearity as a plus
or minus percent of ideal output span.

REPEATABILITY

- Repeatability may be determined directly from the deviation values of a
number of calibration cycles. It is the closeness of agreement among a number
of consecutive measurements of the output for the same value of input,
approached from the same direction. Repeatability is the maximum difference
in percent deviation observed and is expressed as a percent of output span.
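
A short sketch, assuming hypothetical deviation values recorded at the same input point, approached from the same (upscale) direction, over several consecutive cycles:

# Repeatability: spread of deviations at the same input, same direction.
upscale_devs_at_50pct = [+0.10, +0.14, +0.08, +0.12]   # % of output span (hypothetical)
repeatability = max(upscale_devs_at_50pct) - min(upscale_devs_at_50pct)
print(f"repeatability = {repeatability:.2f} % of output span")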

REPRODUCIBILITY

- Reproducibility is the maximum difference between recorded output values
(both upscale and downscale) for a given input value. The difference is
expressed as a percent of output span per specified time interval.
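
A brief sketch, assuming hypothetical output readings (both upscale and downscale) collected at the same input value over an assumed 30-day interval:

# Reproducibility: spread of outputs at one input, both directions, over time.
readings = [11.95, 12.10, 12.02, 12.18, 11.98]   # mA, hypothetical
output_span = 16.0                                # mA, assumed

reproducibility = (max(readings) - min(readings)) / output_span * 100.0
print(f"reproducibility = {reproducibility:.2f} % of output span per 30 days")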
