Many industries require exacting precision in their measurements. Whether in a national laboratory or a machining workshop, operators need to know how reliable their tools' measurements are. Organizations such as the National Conference of Standards Laboratories and the National Institute of Standards and Technology describe the quality of a tool's calibration using test accuracy ratios (TARs), sometimes referred to as test uncertainty ratios: the ratio of the tool's tolerance to the accuracy of the standard used to calibrate it. Learning how to calculate test accuracy ratios allows you to ensure that you calibrate your equipment to industry standards.
Determine the tolerance of the tool. Consult the manufacturer's technical literature to find the tool's stated tolerance. For example, a manufacturer may specify that the alignment of a saw is accurate to within 1/10 inch (0.1 inch).
Locate the accuracy of the calibration standard. Refer to the standard's technical literature if you do not have this figure readily available. For example, a laser distance meter might have an accuracy of 6/1000 inch (0.006 inch).
Calculate the ratio of tool tolerance to calibration standard accuracy. Divide the tolerance of the tool being calibrated by the accuracy of the calibration standard. For example, 0.1 divided by 0.006 equals 16.667. Express the result as the test accuracy ratio, such as 16.667:1.
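The calculation is a single division, so it is easy to check by hand or in a few lines of code. The following Python sketch is illustrative only; the function name test_accuracy_ratio and the sample values are assumptions taken from the example above, not part of any standard library.

```python
def test_accuracy_ratio(tool_tolerance, standard_accuracy):
    """Return the TAR: tool tolerance divided by calibration standard accuracy."""
    return tool_tolerance / standard_accuracy

# Example from the steps above: saw alignment tolerance of 0.1 inch,
# calibration standard (laser distance meter) accuracy of 0.006 inch.
ratio = test_accuracy_ratio(0.1, 0.006)
print(f"Test accuracy ratio: {ratio:.3f}:1")  # prints "Test accuracy ratio: 16.667:1"
```

Both values must be in the same units (inches in this example) for the ratio to be meaningful.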