Introduction
Have you ever wondered what the accuracy and uncertainty specifications in our calibrations mean? If you require an ISO/IEC 17025 certificate for your instrument (e.g., thermal chuck, temperature wafer, oven), you will receive a certificate that states something like “calibrated at k=2”. You may also wonder why the uncertainty value is larger than the accuracy value, or why uncertainty seems to matter more than accuracy. This post addresses these three potential sources of confusion and will help you make the right decision when assessing your temperature calibration needs.
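To make the “k=2” notation concrete, here is a minimal sketch of how an expanded uncertainty is formed: the combined standard uncertainty is multiplied by a coverage factor k, and k=2 corresponds to roughly 95% coverage for a normally distributed result. The numbers below are illustrative assumptions, not values from any actual certificate.

```python
# Hypothetical example of how a "k=2" expanded uncertainty is formed.
u_c = 0.05  # combined standard uncertainty, in deg C (illustrative value)
k = 2       # coverage factor; k=2 gives ~95% coverage for a normal distribution

U = k * u_c  # expanded uncertainty, as reported on the certificate
print(f"Expanded uncertainty U = {U:.2f} deg C (k={k})")
```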
Uncertainty vs. Accuracy
First, let’s define accuracy in the temperature calibration context. Accuracy may be defined as “a measure of a calibration product’s performance and quality” (Bucy, 2019). While accuracy may be the most common quality indicator, it is often not the best term to use; uncertainty is frequently the better measure of the quality of a calibration.
These definitions raise the question: should the two terms be used interchangeably? The short answer is no. Accuracy is how close a reading is to its true value, whereas uncertainty accounts for the outliers and anomalies that may skew accuracy readings (Bucy, 2019).
Why Are Uncertainty & Accuracy Important?
Uncertainty is the degree of statistical dispersion of the temperature points one measures. Temperature is not the only parameter subject to uncertainty; every measured parameter carries some. So why do accuracy and uncertainty both matter so much?
As mentioned above, accuracy tells us how close a measurement is to its true value, whereas uncertainty accounts for the outliers and anomalies that skew accuracy readings. These outliers are products of “anomalies, adjustment, or other factors” (Bucy, 2019) and are not folded directly into the instrument’s accuracy figure, to avoid misleading the reader. For a better indicator of an instrument’s real-world performance, the individual uncertainty contributions should be combined into a single value and considered alongside the accuracy specification.
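As a sketch of what “taking the uncertainty values as a whole” can look like, the example below combines independent uncertainty components by root-sum-of-squares, the combination commonly used in uncertainty budgets, and then expands the result at k=2. The component names and values are illustrative assumptions, not an actual Sigma Sensors budget.

```python
import math

# Hypothetical uncertainty budget for one temperature calibration point.
# Each entry is a standard uncertainty (k=1) in deg C; names and values
# are illustrative only.
components = {
    "reference probe": 0.020,
    "readout resolution": 0.006,
    "temperature non-uniformity": 0.030,
    "repeatability": 0.015,
}

# Combine independent components by root-sum-of-squares,
# then expand with k=2 for ~95% coverage.
u_c = math.sqrt(sum(u**2 for u in components.values()))
U = 2 * u_c
print(f"Combined standard uncertainty u_c = {u_c:.3f} deg C")
print(f"Expanded uncertainty U (k=2)     = {U:.3f} deg C")
```

Because the combined value folds in contributions beyond the instrument itself (the reference, the method, the environment), it is normal for the reported uncertainty to exceed the instrument’s accuracy specification.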
In addition, one must calculate the deviation of a reading to better determine measurement uncertainty. The deviation is the difference between the measured value and the actual or expected value. For instance, the limited resolution of a display, or an error in reading it, contributes to the measurement uncertainty. Deviation captures both the random and systematic components of a measurement. Because uncertainty grows with deviation, the greater the deviation, the higher the measurement uncertainty, and the less accurately the instrument performs.
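Here is a minimal sketch of how those two components might be separated from repeated readings against a known reference point: the mean deviation estimates the systematic part, and the spread of the readings estimates the random part. The readings and reference value are invented for illustration.

```python
import statistics

# Hypothetical repeated readings of an instrument at a 100.0 deg C
# reference point (invented values, for illustration only).
reference = 100.0
readings = [100.04, 100.06, 100.03, 100.07, 100.05]

# Deviation: measured value minus the reference (systematic component).
mean_deviation = statistics.mean(r - reference for r in readings)

# Dispersion: spread of the readings (random component).
spread = statistics.stdev(readings)

print(f"Mean deviation: {mean_deviation:+.3f} deg C")
print(f"Standard deviation of readings: {spread:.3f} deg C")
```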
At Sigma Sensors, we measure both uncertainty and accuracy, and our Lab2Go program lets you do so remotely. Learn more about Lab2Go here!