How To Calculate Test Accuracy Ratios

Many industries require exacting precision in their measurements. Whether in a national laboratory or a machining workshop, operators need to know how reliable their tools' measurements are. Organizations such as the National Conference of Standards Laboratories and the National Institute of Standards and Technology describe the accuracy of a tool's calibration – how precisely the tool's own accuracy has been measured – using test accuracy ratios (TARs), sometimes referred to as test uncertainty ratios. Learning how to calculate test accuracy ratios allows you to ensure that you calibrate your equipment to industry standards.
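Expressed as a formula (with both tolerances stated in the same units), the ratio computed in the steps below is simply:

\[
\text{TAR} = \frac{\text{tolerance of the tool being calibrated}}{\text{tolerance of the calibration standard}}
\]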

Step 1

Determine the tolerance of the tool being calibrated. Consult the manufacturer's technical literature to find the tool's accuracy. For example, a manufacturer may specify that the alignment of a saw is accurate to within 1/10 inch (0.1 inch).

Step 2

Locate the tolerance of the calibration standard. Refer to the standard's technical literature if you do not have the tolerance readily available. For example, a laser distance meter used as the standard might have an accuracy of 6/1,000 inch (0.006 inch).

Step 3

Calculate the ratio of the tool's tolerance to the calibration standard's tolerance. Divide the accuracy of the tool being calibrated by the accuracy of the calibration standard. For example, 0.1 divided by 0.006 equals 16.667. Express the result as the test accuracy ratio, 16.667:1.
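If you want to automate the check, the same arithmetic fits in a few lines of code. The following is a minimal Python sketch, not part of the original article; the function name and the example values (0.1 inch for the saw, 0.006 inch for the laser distance meter) are simply the figures from the steps above.

def test_accuracy_ratio(tool_tolerance, standard_tolerance):
    # Both tolerances must be in the same units (e.g., inches).
    # Returns the tool's tolerance divided by the calibration standard's tolerance.
    return tool_tolerance / standard_tolerance

# Example from the steps above: saw alignment accurate to 0.1 inch,
# laser distance meter accurate to 0.006 inch.
tar = test_accuracy_ratio(0.1, 0.006)
print(f"Test accuracy ratio: {tar:.3f}:1")   # prints "Test accuracy ratio: 16.667:1"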
