Accuracy, Precision, Resolution, and Sensitivity: What Do They Mean?

What Do Data Acquisition Specifications Actually Mean?

I got my first taste of data acquisition in a student lab at NASA Glenn Research Center. While working on a fiber optic temperature sensor project, I was tasked with using a custom-designed A/D board installed in an IBM AT computer running an Intel 286 microprocessor. The board used analog switches to select channels, which were routed to an instrumentation amplifier and then to the A/D converter for measurement. There was no datasheet for it, and sending it out for calibration was out of the question. This truly was the Wild West of data acquisition.

I didn’t know it at the time, but this single A/D board would give direction to my career, resulting in 30+ years of experience with data acquisition products. Over those 30 years, I noticed something about the specifications published with new products: they were beginning to include much more detail. A/D accuracy of ±1 or 2 bits was replaced by a percentage of reading alongside an offset. Inherent noise, which plays into sensitivity, followed, as did other specifications like linearity and drift. In this article, I’d like to explain these specifications in a little more detail, to give you the context I wish I’d had throughout my career. I’ll focus on four general areas: accuracy, precision, resolution, and sensitivity.

Accuracy

Accuracy is how well a measurement matches up to a known standard, ultimately traceable to NIST, the National Institute of Standards and Technology. NIST is the organization that knows how to produce an exact volt accurate to a single nanovolt or better. To verify accuracy, use a voltage source and a voltmeter that have recently been calibrated with equipment traceable back to NIST. It’s best practice to use a voltage calibrator to dial in different values; voltage calibrators output a stable, low-noise voltage. Then use a good voltmeter to verify the output value.

Resolution

Next, let’s discuss resolution. Resolution refers to how many values can be represented on a scale. Think of a scale like a ruler. A finely divided ruler (e.g., millimeter markings) that is of equal length to a coarsely divided ruler (e.g., centimeter markings) can be said to have a higher resolution than the latter by virtue of having more markings on it. These markings, the lines drawn on the ruler, are also referred to as “counts”. Absolute resolution refers to what each “step”, the actual physical distance between two ruler markings, represents. The absolute resolution also depends on the range of the device: the length of the entire ruler.

For example, a 16-bit A/D has 65,536 (2 to the power of 16) steps in its scale. If you wanted to verify the performance of a 16-bit A/D device, you’d need another device with a higher absolute resolution in order to measure the same signal and compare. A 6½ digit voltmeter, which has a 2,000,000-step scale, would typically be recommended.
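To make the arithmetic concrete, here is a minimal Python sketch of the step-size calculation. The ±10 V range is just an assumed example, not a property of any particular device:

```python
# Absolute resolution of an ideal A/D: the voltage that one step
# (one count) represents. Range and bit depth below are examples.

def step_size(bits: int, span_volts: float) -> float:
    """Return the volts-per-count of an ideal A/D converter."""
    counts = 2 ** bits
    return span_volts / counts

# A 16-bit A/D spanning -10 V to +10 V (a 20 V total span):
print(step_size(16, 20.0))  # ~0.000305 V, about 305 microvolts per count
```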

Similarly, a device’s theoretical (or ideal) accuracy refers to the percentage of the total range that a single step on the scale takes up. It’s the reciprocal of the number of counts, expressed as a percentage; for example, a 12-bit A/D (4,096 counts) has an ideal accuracy of 1/4,096, or about 0.024%. If you happen to be one of those who thinks in decibels, these percentages can also be translated directly into decibels: a 16-bit A/D converter has an ideal accuracy of 0.0015% of the input range, which translates to about -96 decibels (20 * log(0.000015)).
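The same calculations in code, again covering only the ideal (error-free) case:

```python
import math

def ideal_accuracy_percent(bits: int) -> float:
    """One step as a percentage of full scale: 100 / 2**bits."""
    return 100.0 / (2 ** bits)

def ratio_db(percent: float) -> float:
    """Express the same step-to-range ratio in decibels."""
    return 20.0 * math.log10(percent / 100.0)

print(ideal_accuracy_percent(12))            # ~0.0244 (%), the 12-bit case
print(ratio_db(ideal_accuracy_percent(16)))  # ~ -96.3 dB, the 16-bit case
```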

Notably, higher resolution A/D devices help because both the circuitry inside your instrument and the circuitry you are testing can add inaccuracy to the measurement. While it may be possible to approach ideal accuracy by injecting a perfect test voltage directly into the A/D converter chip, there usually isn’t a direct path to the exact point you want to measure. More often, analog buffers, level-shift circuitry, and amplifiers precede the A/D converter. Similarly, if a device has a small measurement range, it might be expanded through the addition of amplifiers and level-shifting circuits, converting a not-so-practical range into a usable ±5 or 0 to 10 volt range. But the extra circuitry adds error, so the accuracy (or inaccuracy) specification is no longer 0.0015%, for example, but some larger value. This is one of many tradeoffs inherent in the design of signal acquisition devices.
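Data sheets commonly fold this added error into a spec of the form “±(percentage of reading + offset)”, the format mentioned at the start of this article. A rough worked example, with invented numbers, might look like this:

```python
# Hypothetical error budget for a spec like "±(0.05% of reading + 2 mV)".
# Both numbers are made up for illustration, not taken from a real device.

def worst_case_error(reading_volts: float,
                     gain_error_percent: float,
                     offset_volts: float) -> float:
    """Worst-case absolute error for a gain-plus-offset spec."""
    return abs(reading_volts) * gain_error_percent / 100.0 + offset_volts

# Measuring 5 V on such a device:
print(worst_case_error(5.0, 0.05, 0.002))  # 0.0045 V, i.e. ±4.5 mV
```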

Sensitivity

Sensitivity is the degree to which a change in the input signal is reflected in the data. A device that has a good amount of internal noise on the signal path will respond poorly to small changes, because a small signal change is lost in the noise; conversely, a signal path with little to no noise will show changes easily. Averaging the data will reduce the noise, but that trades away speed and may not be a feature of the device or software. Most modern data acquisition devices feature low-noise circuitry, but even ±6 counts can look large when attempting to detect a signal change of a few hundred microvolts.
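A quick simulation illustrates the averaging tradeoff. The signal step, noise level, and sample count below are invented for illustration; averaging N samples of uncorrelated noise reduces it by roughly the square root of N, at the cost of N times the acquisition time:

```python
import random

def noisy_sample(true_volts: float, noise_volts: float) -> float:
    """One reading with Gaussian noise standing in for A/D input noise."""
    return true_volts + random.gauss(0.0, noise_volts)

def averaged_reading(true_volts: float, noise_volts: float, n: int) -> float:
    """Average n consecutive readings to suppress the noise."""
    return sum(noisy_sample(true_volts, noise_volts) for _ in range(n)) / n

# A 200 microvolt signal buried in 1 mV of noise: nearly invisible in a
# single sample, much clearer after averaging 100 samples (~10x less noise).
print(noisy_sample(0.000200, 0.001))
print(averaged_reading(0.000200, 0.001, 100))
```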

Precision

Aren’t precision and accuracy the same? Not quite. An A/D converter is said to be precise if it returns consistent measurements. They may not be accurate to the value applied, but they consistently land on the same value. Think of it like a group of arrows in a target: if the grouping is tight, it is said to be precise, even if it is left of the center of the bullseye.

An instrument with input noise will produce a group of measurements. The measured value is the center, or average, of that group and should be the same value every time. Precision, therefore, is concerned with measurement quality, not with how well the result matches a known value. Another term for this is repeatability: can it produce the same value every time? A precise instrument can often be calibrated into an accurate instrument. Ultimately, what you want is a device that has good resolution, that returns precise, accurate, low-noise measurements, and that is affordable.
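A simple repeatability check might look like the sketch below; the readings are invented example data, not measurements from any real device:

```python
import statistics

# Apply a fixed, known input (say 5.0000 V), take repeated readings,
# and examine the spread. A tight standard deviation means a precise
# device, even if the mean is offset from the applied value.

readings = [4.9981, 4.9983, 4.9980, 4.9982, 4.9981, 4.9983]  # volts

mean = statistics.mean(readings)
spread = statistics.stdev(readings)
print(f"mean = {mean:.4f} V, spread = {spread * 1e6:.0f} microvolts")
# Tight spread but mean != 5.0000 V: precise, not yet accurate.
# A calibration offset could correct the mean.
```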

Data acquisition devices now include signal-to-noise ratio (SNR) specifications that provide a clue to their noise-free resolution, often referred to as the ‘effective number of bits’, or ENOB. The MCC USB-1808 data sheet includes dynamic performance specifications with SNR and ENOB among others, a far cry from what I had to work with in my first A/D board encounter.
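For reference, the standard conversion between a converter’s dynamic performance and ENOB is ENOB = (SINAD - 1.76) / 6.02, where SINAD is the signal-to-noise-and-distortion ratio in dB (SNR is often substituted when distortion is negligible). The 90 dB figure below is an assumed example, not a USB-1808 specification:

```python
def enob(sinad_db: float) -> float:
    """Effective number of bits from SINAD (dB): (SINAD - 1.76) / 6.02."""
    return (sinad_db - 1.76) / 6.02

# A converter with 90 dB SINAD resolves about 14.7 noise-free bits:
print(enob(90.0))  # ~14.66
```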

