ME 341 (3+0)
Objectives of today's lecture
Explain error and its types in measurement.
Explain calibration.
Calibration
Calibration is a comparison between two measurements: one
of known magnitude or correctness made with one device,
and another made in as similar a way as possible with a
second device. The device with the known or assigned
correctness is called the standard; the second device is
simply the test instrument.
In general use, calibration is often taken to include the
process of adjusting the output or indication of a
measurement instrument to agree with the value of the
applied standard, within a specified accuracy.
Calibration methods
Calibration procedures involve a comparison of the
particular instrument with one of the following:
1. A primary standard
2. A secondary standard with a higher accuracy
than the instrument to be calibrated
3. A known input source
Error
Error in measurement: The difference between the
measured value and the true value of the measured
quantity (the measurand) is termed the error in
measurement.
Error = measured value - true value
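As a quick illustration, the error formula can be evaluated directly. The numbers below are hypothetical, not from the lecture:

```python
# Hypothetical example: a thermometer reads 99.2 degrees C in boiling water
# whose true (standard) temperature is 100.0 degrees C.
measured_value = 99.2
true_value = 100.0

# Error = measured value - true value
error = measured_value - true_value
print(round(error, 1))  # negative error: the instrument reads low
```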
Uncertainty
(or Uncertainty Interval):
Uncertainty is an estimate of the limits of the error in
a measurement. It is always stated with some level of
confidence.
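A minimal sketch of how an uncertainty interval might be estimated from repeated readings, assuming normally distributed random errors and the common engineering convention of a coverage factor k = 2 for roughly 95% confidence (the readings themselves are hypothetical):

```python
import statistics

readings = [9.8, 10.1, 9.9, 10.2, 10.0]  # hypothetical repeated readings

mean = statistics.mean(readings)
# Standard uncertainty of the mean = sample standard deviation / sqrt(n)
u = statistics.stdev(readings) / len(readings) ** 0.5
k = 2  # coverage factor: ~95% confidence for normally distributed errors

print(f"{mean:.2f} +/- {k * u:.2f}")
```

The result is reported as mean ± expanded uncertainty, matching the idea that uncertainty is stated with a level of confidence.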
Types of error
Gross errors: largely human errors, such as misreading
of instruments, incorrect adjustment or improper
application of instruments, and computational mistakes.
Random error (precision error or noise): error caused
by factors that randomly affect measurement of the
variable across the sample. Random errors are
characterized by a lack of repeatability in the output of
the measuring system. Random error adds to the variability
of the data but does not affect the average of the group.
Random error = reading - average of readings
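The relation above can be checked numerically with hypothetical repeated readings: each reading's random error is its deviation from the average, and these deviations cancel out, which is why random error adds variability but does not shift the group average:

```python
import statistics

readings = [4.97, 5.02, 5.00, 5.05, 4.96]  # hypothetical repeated readings
avg = statistics.mean(readings)

# Random error of each reading = reading - average of readings
random_errors = [r - avg for r in readings]

print(round(avg, 2))
print([round(e, 2) for e in random_errors])
print(round(abs(sum(random_errors)), 10))  # deviations sum to (essentially) zero
```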