Basic Principles of Instrument Calibration

Last Updated on January 27, 2023 by Electricalvolt

This article describes the basic principles of instrument calibration. In the field of instrumentation, calibration plays a very important role: every instrument has to be calibrated at regular intervals to ensure that it produces accurate outputs whenever required.

Definition of calibration

Instrument calibration is the process in which the output of an instrument, such as a transmitter or switch, is checked against a known input. The input to the instrument under calibration is applied using a standard Test and Calibration Instrument, and the instrument's output is then compared against the Test and Calibration Instrument's reading as the reference.

The error between the applied input and the observed output is then calculated. If this error exceeds the acceptable error defined by the process requirement, the instrument is calibrated by means of zero calibration and span calibration.
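The error check described above can be sketched as follows. This is a minimal illustration, not from the article: the tolerance value, the 4-20 mA figures, and the readings are assumed for the example.

```python
# Sketch: checking a transmitter's output against an acceptable error band.
# Tolerance and readings below are illustrative assumptions, not from the article.

def percent_error(expected, measured, span):
    """Error expressed as a percentage of span, a common convention."""
    return abs(measured - expected) / span * 100.0

# Example: a 4-20 mA transmitter (span = 16 mA) should output 12 mA at 50% input
error = percent_error(expected=12.0, measured=12.12, span=16.0)
tolerance = 0.5  # percent of span, assumed process requirement
needs_calibration = error > tolerance
print(f"error = {error:.2f}% of span, needs calibration: {needs_calibration}")
```

If the computed error exceeds the tolerance, zero and span calibration would follow, as the article describes.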

After zero calibration and span calibration, the instrument's response should be within tolerance. If it is not, zero and span calibration can be repeated once more. If the response is still not acceptable, the instrument should be discarded and replaced with a new one.

(Additional information: the Test and Calibration Instrument, also termed the TCI, is a standard instrument used in any industry for calibrating other field instruments. The TCI itself is calibrated against reference standards at national test laboratories.)

Why do we need to calibrate instruments?

All the instruments installed in the field are used either for measuring a process parameter or for transmitting it. Hence, every instrument has a mechanical component as well as electronics.

The mechanical component of the instrument senses the process parameter, while the electronics convert the sensed value into an output, such as a 4 mA to 20 mA signal, required by the control system.

Over a period of time, the mechanical component can develop hysteresis and the electronics can develop error, changing the response of the instrument. A changed response gives an incorrect output, which can lead to a process upset or damage to equipment. Hence, all instruments should be calibrated at regular intervals to keep their outputs accurate.

When should we calibrate instruments?

Every industry has its own philosophy of calibrating instruments at regular intervals, depending on how critical the process is and how accurate or pure the product must be. However, as a rule of thumb, instruments used for critical processes are calibrated once a year, critical analyzers every six months, and instruments used only for monitoring parameters once every two or three years.

Some common terms related to Instrument Calibration:

Calibration Records

One of the instrument engineer’s main responsibilities is performing the calibration of every instrument, and every calibrated instrument should have a calibration record. The calibration record created after calibration contains details such as the date of calibration, readings before and after calibration (the as-found and as-left data), instrument details (tag number and the area in which the instrument is installed), the Lower Range Value and Upper Range Value of the instrument, and the TCI used for the calibration.

Calibration Range

The calibration range of an instrument is the range over which the instrument is set up to measure the quantity. Suppose a pressure transmitter is installed to measure pressure from 0 kg/cm2 to 10 kg/cm2; then the calibration range of the pressure transmitter is 0 kg/cm2 to 10 kg/cm2.

Span

The span of any instrument is the absolute difference between the upper range value and the lower range value of that particular instrument. For example, if a temperature transmitter measures temperature from -100 deg C to 600 deg C, then the span of this temperature transmitter is (600 - (-100)) = 700 deg C.
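The span calculation above is trivial to express in code; a minimal sketch using the temperature transmitter figures from the text:

```python
# Sketch: span is the absolute difference between URV and LRV.

def span(lrv, urv):
    """Span of an instrument given its lower and upper range values."""
    return abs(urv - lrv)

print(span(-100, 600))  # temperature transmitter example from the text -> 700
```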

Instrument Range

The range of any instrument is the minimum and maximum values of the physical quantity that the instrument is capable of measuring. The calibration range of the instrument always lies within the instrument range. The instrument range can be found on the nameplate or in the manual provided by the vendor. For example: instrument range 0 to 100 deg C; output 4-20 mA.
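The nameplate example above implies a linear scaling from the calibration range to the 4-20 mA output. A minimal sketch of that scaling, assuming a simple linear transmitter (the function name and defaults are illustrative):

```python
# Sketch: converting a process value to the standard 4-20 mA output signal
# over the calibration range (0-100 deg C here, matching the nameplate example).

def to_milliamps(value, lrv, urv, out_lo=4.0, out_hi=20.0):
    """Linear scaling of a process value to a 4-20 mA signal."""
    fraction = (value - lrv) / (urv - lrv)
    return out_lo + fraction * (out_hi - out_lo)

print(to_milliamps(50.0, 0.0, 100.0))  # mid-range -> 12.0 mA
```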

Zero and Span Calibration

Zero calibration and span calibration are the two adjustments used for calibrating an instrument. Whenever the LRV is found disturbed, zero calibration is performed; whenever the URV is found disturbed, span calibration is performed.

One important point to note is that during calibration, if we adjust the zero of the instrument, the span may also get disturbed, and if we adjust the span, the zero may also get disturbed. Hence it is recommended to check both the zero and the span of the instrument after every adjustment and make the necessary corrections.

This holds true for instruments with zero and span adjustment screws. Modern smart instruments need only a one-time zero or span calibration if either is found disturbed: performing zero calibration on a smart transmitter will not affect its span, and performing span calibration will not affect its zero.

Five Point Calibration

To check the calibration of an instrument, we apply inputs through the TCI and observe the instrument's output. Generally, the 5-point calibration method is used: inputs of 0%, 25%, 50%, 75%, and 100% of the span are applied to the instrument and the output is noted at each point. This is done upscale as well as downscale.
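The five test points and their expected outputs can be tabulated as in this minimal sketch, assuming a linear 4-20 mA transmitter (the mA figures are illustrative of that convention, not taken from the article):

```python
# Sketch of a 5-point calibration check: expected 4-20 mA outputs at
# 0/25/50/75/100% of span, taken upscale and then downscale so that
# hysteresis shows up as a difference between the two passes.

def expected_ma(percent):
    """Expected output in mA for a given percent of span on a 4-20 mA loop."""
    return 4.0 + percent / 100.0 * 16.0

points = [0, 25, 50, 75, 100]
upscale = [(p, expected_ma(p)) for p in points]
downscale = [(p, expected_ma(p)) for p in reversed(points)]
print(upscale)   # [(0, 4.0), (25, 8.0), (50, 12.0), (75, 16.0), (100, 20.0)]
```

In practice the measured outputs at each point would be recorded next to these expected values as the as-found data.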

Two Point Calibration

2-point calibration is generally used for analyzers such as pH, conductivity, CO2, and O2 analyzers.
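A 2-point calibration fits a straight-line correction through two reference points. The sketch below illustrates the idea with a pH analyzer checked against two buffer solutions; the buffer values and raw readings are assumed for the example:

```python
# Sketch of 2-point calibration: derive a linear correction from two
# reference readings, e.g. pH buffers. Values below are illustrative.

def two_point_fit(raw1, true1, raw2, true2):
    """Return (slope, offset) so that corrected = slope * raw + offset."""
    slope = (true2 - true1) / (raw2 - raw1)
    offset = true1 - slope * raw1
    return slope, offset

slope, offset = two_point_fit(raw1=4.1, true1=4.0, raw2=7.2, true2=7.0)
corrected = slope * 7.2 + offset
print(round(corrected, 2))  # reading in the pH 7 buffer now reads -> 7.0
```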

As Found Data and As Left Data

The readings (0%, 25%, 50%, 75%, and 100%) taken from the instrument before calibration are termed the AS FOUND DATA. If the instrument needs calibration, one more set of readings (0%, 25%, 50%, 75%, and 100%) is taken after calibration; these readings are the AS LEFT DATA.

Field calibration

In field calibration, the instrument is not removed from its mounting; it is tested and calibrated in situ. Most field instruments have isolating valve manifolds, and before the test and calibration signal is applied, the instrument is first vented to the atmosphere. Calibration under field conditions is more difficult than calibration under shop conditions, and the results of the two may differ.

In-shop or Bench Calibration

In bench calibration, the instrument is removed from the field and tested and calibrated on the bench, using calibration devices to simulate the process. The instrument must be cleaned before it is calibrated on the bench.

Calibrators

Calibrators are devices used to calibrate field instruments. Calibrators vary in form and function according to the device they are designed to calibrate. The following types of calibrators are most commonly used.

  • A fluidized bath and block calibrator is used for the calibration of RTDs and thermocouples.
  • Signal reference generators generate electrical quantities, such as voltage, current, and frequency. The signal generators are used to calibrate the panel meters. The output of the signal generator is fed to the instrument and the recorded value is adjusted according to the signal generator’s reading.
  • The simulator is a special type of signal generator that generates and reads the signal.
  • A pneumatic calibrator regulates the pressure required for testing pressure measurement instruments. These are generally used with a pressure source.

Traceability

Traceability ensures that a measurement takes all uncertainties into account and is an accurate representation of the object being measured. It shows that the result of a measurement, or the value of a standard, can be traced back to a recognized reference.

Traceability is achieved when we calibrate the instrument by a higher-level reference standard. All calibrations must be traceable to nationally or internationally recognized standards.

About Satyadeo Vyas

Satyadeo Vyas, M.Tech, M.B.A., is an electrical engineer and has more than 36 years of industrial experience in the operation, maintenance, and commissioning of electrical and instrumentation projects. He has good knowledge of electrical, electronics, and instrumentation engineering.


