Decoding Humidity Accuracy

May 31, 2019

Humidity accuracy specifications are a necessary consideration when selecting a relative humidity transmitter. Accuracy, by definition, is "the degree to which the result of a measurement, calculation, or specification conforms to the correct value or standard" (Oxford English Dictionary). When comparing the accuracy specifications of different relative humidity sensors, it is crucial to understand which factors those specifications include. A complete accuracy specification should account for non-linearity, hysteresis, repeatability, and calibration uncertainty. These factors can be defined as:

  • Non-linearity: the relationship of a calibration curve to a specified straight line through its end points
  • Hysteresis: the maximum difference in output at any humidity value within the specified range when that value is approached with increasing and then decreasing humidity
  • Repeatability: the ability of a transducer to reproduce output readings when the same humidity value is applied to it consecutively, under the same conditions, and from the same direction
  • Calibration uncertainty: accounts for the reference equipment used for calibrating the relative humidity transmitter

The majority of manufacturers either break these factors out and list them separately, or report an accuracy figure that includes non-linearity, hysteresis, and repeatability but excludes calibration uncertainty. Reporting accuracy this way makes the stated error band look small, while the total error in practice could be significantly larger. In contrast, Setra includes all of these factors in our calculations to provide a more complete accuracy specification.
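To illustrate how excluding a factor flatters the headline number, the sketch below combines independent error components by root-sum-square (RSS), a common convention in measurement uncertainty analysis. The component values are illustrative assumptions, not figures for any actual transmitter, and the source does not specify the exact method any particular manufacturer uses.

```python
import math

def total_accuracy_rss(components):
    """Combine independent error components (all in the same units,
    e.g. %RH) by root-sum-square, a common uncertainty convention."""
    return math.sqrt(sum(c ** 2 for c in components))

# Illustrative (made-up) error components for a humidity transmitter, in %RH:
non_linearity = 1.0
hysteresis = 0.8
repeatability = 0.5
calibration_uncertainty = 1.0

# Accuracy as often advertised (calibration uncertainty excluded)
base = total_accuracy_rss([non_linearity, hysteresis, repeatability])
# Accuracy with all four factors included
full = total_accuracy_rss(
    [non_linearity, hysteresis, repeatability, calibration_uncertainty]
)

print(f"Excluding calibration uncertainty: +/-{base:.2f} %RH")
print(f"Including calibration uncertainty: +/-{full:.2f} %RH")
```

With these example values, the spec sheet figure would read roughly ±1.37 %RH, while the complete figure is closer to ±1.70 %RH, which is why it matters to ask what a stated accuracy actually includes.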

It is also important to consider the temperature at which the accuracy testing is performed. Accuracy testing within a temperature range does not mean the sensor cannot be operated outside that range; rather, it indicates the accuracy at those temperatures, and operating the device outside them can result in a different accuracy.

Topics: Humidity, HVAC/R