Errors can arise from several causes in math and science, such as faulty instruments, flawed premises, or mistaken observations. Calculating the percentage of error expresses how accurate your calculations have been. You need to know two values: the estimated or predicted value and the known or observed value. Subtract the known value from the estimated value, divide the result by the known value, and convert that figure into a percentage. In this formula, Y1 represents the estimated value and Y2 the known value: [(Y1 - Y2) / Y2] x 100 percent.
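The formula can be sketched as a short Python function; the names `percent_error`, `estimated`, and `known` are illustrative, not from the article:

```python
def percent_error(estimated, known):
    """Percentage of error: (estimated - known) / known * 100.

    A negative result means the estimate was too low (an underestimate);
    a positive result means it was too high (an overestimate).
    """
    return (estimated - known) / known * 100
```

For example, `percent_error(110, 100)` returns `10.0`, a 10 percent overestimate.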
Applying the Formula
The University of Iowa Department of Physics and Astronomy’s lab manual provides a historical example of percentage of error: Ole Romer’s calculation of the speed of light. Romer estimated the speed of light as 220,000 kilometers per second, although the actual constant is much higher, 299,800 kilometers per second. Using the formula above, subtract the actual value from Romer’s estimate to get minus-79,800; dividing that result by the actual value gives minus-0.26618, which equates to minus-26.618 percent, the negative sign indicating an underestimate. More mundane applications of the formula include predicting high temperatures for a week, then comparing the predictions to the actual, observed temperatures. Social scientists and marketers may also use the formula; for instance, you might predict that 5,000 people will attend a public event, then compare that to the 4,550 people who actually attended. The percentage of error in this case would be (5,000 - 4,550) / 4,550 x 100, or roughly 9.9 percent.
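The two worked examples above can be checked with a few lines of arithmetic in Python (values taken from the text):

```python
# Romer's speed-of-light estimate versus the actual constant, in km/s
estimated, known = 220_000, 299_800
romer_error = (estimated - known) / known * 100
print(round(romer_error, 3))  # -26.618, i.e. a 26.618 percent underestimate

# Attendance example: predicted 5,000 attendees, 4,550 actually attended
attendance_error = (5000 - 4550) / 4550 * 100
print(round(attendance_error, 1))  # 9.9, i.e. roughly a 9.9 percent overestimate
```

Keeping the sign of the result preserves the direction of the error; take the absolute value if you only care about its magnitude.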