Sensor data: Accuracy vs. resolution

Distinguishing between accuracy and resolution is important when determining system needs. An application note provides examples and illustrations to show the difference.
By Control Engineering Staff January 15, 2009

Accuracy is a measurement device’s degree of absolute correctness, whereas resolution is the smallest increment it can display or record. Resolution relates to overall accuracy, but the two are not the same, and the distinction is often misinterpreted when determining system needs.

Accuracy compares a measured value to the actual value.
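The distinction can be made concrete with a small sketch. The figures below (a 12-bit converter, a 0-10 V span, and a ±0.1%-of-full-scale accuracy spec) are assumptions for illustration only, not values from the application note:

```python
# Hypothetical example: a 12-bit A/D converter over an assumed 0-10 V span.
full_scale = 10.0                        # volts (assumed)
bits = 12
resolution = full_scale / (2 ** bits)    # smallest step it can record
print(f"Resolution: {resolution * 1000:.2f} mV per count")

# Accuracy is a separate specification, e.g. +/-0.1% of full scale
# (assumed), bounding how far any reading may sit from the true value.
accuracy = 0.001 * full_scale
print(f"Accuracy:   +/-{accuracy * 1000:.2f} mV")

# Fine resolution does not imply accuracy: this converter resolves
# ~2.44 mV steps yet may still read up to 10 mV away from the truth.
```

In other words, the device can display a very small number while every reading is still wrong by a much larger amount.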

During development of system error budgets as part of system design, engineers must determine the necessary levels of accuracy for system elements such as field sensors, actuators, signal conditioning modules (SCMs), and controlling units (PCs and PLCs). In addition, software algorithm integrity and operating system compatibility, or the degree of software “openness,” must be included in error budget considerations. Resolution requires attention as it relates to overall accuracy, and the distinction between the two is often misinterpreted in determining system needs. (See “What’s the difference between accuracy and precision?” — the 9/29/08 entry in the Control Engineering Ask Charlie blog.)
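One common way to combine element errors in such a budget is to compare a worst-case sum against a root-sum-square (RSS) estimate, which assumes the errors are independent. The element names and error fractions below are hypothetical, chosen only to illustrate the arithmetic:

```python
import math

# Hypothetical error budget: each element's worst-case error
# expressed as a fraction of full scale (assumed values).
errors = {
    "field sensor": 0.0025,
    "signal conditioning module": 0.0010,
    "A/D conversion": 0.0005,
}

# Worst case: every element errs in the same direction at once.
worst_case = sum(errors.values())

# RSS: statistically likely total, assuming independent errors.
rss = math.sqrt(sum(e ** 2 for e in errors.values()))

print(f"Worst case: {worst_case:.4%} of full scale")
print(f"RSS:        {rss:.4%} of full scale")
```

The RSS total is smaller than the worst-case sum, which is why the choice of combination method itself belongs in the error-budget discussion.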
To help engineers clarify the difference, Dataforth Corp. has published a three-page application note (Dataforth Application Note AN114), appropriately entitled “Accuracy versus Resolution.”
C.G. Masi , senior editor
Control Engineering News Desk