Definition of Accuracy, Precision, Resolution, Range

In the field of Instrumentation and Control, a few terms like accuracy, precision, resolution, and range play a very important role. Many engineers get confused about the actual meaning of these words, so let us have a detailed look at each of them.

Accuracy: The accuracy of an instrument indicates how close the instrument's measured value is to the actual value.

Precision: The precision of an instrument indicates how close the measured values are to one another when the same quantity is measured repeatedly at intervals.

Resolution: The resolution of an instrument is the smallest incremental or decremental change in the measured value that the instrument can detect.

Range: The range of an instrument is the difference between the minimum and the maximum values that the instrument can measure.


Now, we will discuss each of these terms in detail.

Accuracy

Accuracy, in simple terms, is how near the values measured by the instrument are to the actual values. More accuracy means the measured value will be near the actual value; less accuracy means the measured value will be far from the actual value.

For any instrument, accuracy is the most important specification. Whenever we select an instrument, we first check its accuracy, because if the instrument does not have the accuracy required by the process, the measured parameter's value will not be accurate. This can disturb the process, or the product of the process will not be good.

The accuracy of an instrument is expressed as a percentage, with the actual value as the reference. Generally, in the instrumentation and control field, the accuracy of field instruments is 1% or better. For example, if a pressure transmitter has a span of 100 psi and an accuracy of +/-1% of span, then for an actual pressure of 100 psi the measured value will lie between 99 psi and 101 psi. To verify their accuracy, field instruments are calibrated at a defined interval as per the industry's standard.
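
To make the calculation concrete, here is a minimal Python sketch (the function name measurement_band is ours, purely for illustration) that converts a percent-of-span accuracy specification into the band within which the measured value will lie:

def measurement_band(actual: float, span: float, accuracy_pct: float):
    """Band within which the measured value lies for a +/- percent-of-span accuracy."""
    error = span * accuracy_pct / 100.0   # absolute error in engineering units
    return actual - error, actual + error

# Example from the text: 100 psi span, +/-1% accuracy, actual pressure 100 psi
low, high = measurement_band(actual=100.0, span=100.0, accuracy_pct=1.0)
print(f"Measured value lies between {low} psi and {high} psi")  # 99.0 and 101.0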

Even the Test and Calibration Instruments (TCI) used for calibrating the field instruments are calibrated every year.

When we select a higher range, the accuracy of the instrument automatically decreases, and when we select a lower range, the accuracy increases. This is because the error band is a fixed percentage of the selected span, so a wider span means a larger absolute error.
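
A quick numeric sketch (the 500 psi span is an assumed example, not from the discussion above) shows the effect:

# Same +/-1% of span accuracy applied to two different span selections
for span in (100.0, 500.0):
    error = span * 1.0 / 100.0   # absolute error = span x accuracy%
    print(f"span {span:5.0f} psi -> measured value within +/- {error} psi of actual")
# span   100 psi -> measured value within +/- 1.0 psi of actual
# span   500 psi -> measured value within +/- 5.0 psi of actual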

Precision

The precision of an instrument is how close the measured values are to each other when the same quantity is measured by the same instrument at different intervals of time. It is also a very important characteristic of an instrument, because a less precise instrument will produce different values when given the same input. This can be more dangerous than an instrument that is merely less accurate, since a consistent inaccuracy can be corrected by calibration, whereas random scatter cannot.

Suppose we measure the same temperature using 2 different temperature transmitters at different intervals of time. The data collected are shown below:

Actual Value | Temperature by Transmitter 1 | Temperature by Transmitter 2
100          | 100.56                       | 100.01
100          | 100.57                       | 100.2
100          | 100.55                       | 99.7
100          | 100.56                       | 99.9

From the above table, we can conclude that the temperature measured by temperature transmitter 1 is more precise than the temperature measured by temperature transmitter 2. Note, however, that transmitter 2's readings are closer to the actual value, so transmitter 2 is more accurate even though transmitter 1 is more precise.
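
One common way to quantify precision, sketched below in Python, is the standard deviation of the repeated readings; the values are taken from the table above:

from statistics import mean, stdev

transmitter_1 = [100.56, 100.57, 100.55, 100.56]
transmitter_2 = [100.01, 100.2, 99.7, 99.9]

for name, readings in (("Transmitter 1", transmitter_1),
                       ("Transmitter 2", transmitter_2)):
    print(f"{name}: mean = {mean(readings):.3f}, spread = {stdev(readings):.3f}")

# Transmitter 1: mean ~ 100.560, spread ~ 0.008  (more precise, less accurate)
# Transmitter 2: mean ~  99.952, spread ~ 0.209  (less precise, more accurate)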

Resolution

The resolution of an instrument is simply the smallest change that the instrument can detect. Suppose an instrument can detect a minimum change of 0.1 psi; then the resolution of that instrument is 0.1 psi. For a given instrument, the finer the resolution, the more detail the readings show (though fine resolution alone does not guarantee accuracy). Let us understand this with an example.

Suppose the resolution of transmitter 1 is 0.1 psi and that of transmitter 2 is 0.01 psi. Below is a table of readings of the same actual pressures taken by the two transmitters:

Actual Value | Pressure by Transmitter 1 | Pressure by Transmitter 2
20           | 20.1                      | 20.12
25           | 25.3                      | 25.33
30           | 30.1                      | 30.11
35           | 35.0                      | 35.00

From the above table, it is very clear that the instrument with the finer resolution reports the measured value in greater detail, which generally makes it the better choice.
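
The effect of resolution can be illustrated by quantizing a true value to the nearest step the instrument can report. A minimal sketch (the quantize helper is our own, illustrative only):

def quantize(true_value: float, resolution: float) -> float:
    """Round a true value to the nearest multiple of the instrument's resolution."""
    return round(true_value / resolution) * resolution

true_pressure = 25.333
print(f"{quantize(true_pressure, 0.1):.1f}")    # Transmitter 1 (0.1 psi)  -> 25.3
print(f"{quantize(true_pressure, 0.01):.2f}")   # Transmitter 2 (0.01 psi) -> 25.33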

Range

The range of an instrument is the difference between the maximum and minimum values that the instrument can measure. When we talk of range, two main terms come into the picture: the Upper Range Value and the Lower Range Value. The Upper Range Value is the maximum value that the instrument can measure, while the Lower Range Value is the minimum value that the instrument can measure. The ranges and corresponding resolutions of a typical digital multimeter are given below:

Range    | Resolution
200.0 mV | 0.1 mV
5.000 V  | 0.001 V (1 mV)
50.00 V  | 0.01 V (10 mV)
100.0 V  | 0.1 V (100 mV)
500 V    | 1 V (1000 mV)
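
As a rough illustration, an auto-ranging meter (a common feature, though not discussed above) would pick the smallest range that still covers the input, since the smaller range offers the finer resolution. A Python sketch using the table above:

DMM_RANGES = [            # (full-scale value in volts, resolution in volts)
    (0.2,   0.0001),      # 200.0 mV range, 0.1 mV resolution
    (5.0,   0.001),
    (50.0,  0.01),
    (100.0, 0.1),
    (500.0, 1.0),
]

def select_range(voltage: float):
    """Return (full_scale, resolution) of the smallest range covering the input."""
    for full_scale, resolution in DMM_RANGES:
        if abs(voltage) <= full_scale:
            return full_scale, resolution
    raise ValueError("input exceeds the meter's maximum range")

full_scale, resolution = select_range(3.7)
print(f"Use the {full_scale} V range (resolution {resolution} V)")  # 5.0 V range, 0.001 V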

We can select the range of a transmitter, which we also call the span (strictly, the span is the difference between the Upper and Lower Range Values). When we select a lower range, the accuracy of the instrument automatically increases, as discussed earlier.
