How to Do a Standard Deviation with Difference of Means

By Ireland Wolfe
To compute the standard deviation, you will need to find each result's deviation from the mean.

Standard deviation is a statistic that measures how far the results in a data set typically fall from the average, or mean. To calculate it, you first find the variance (the average of the squared deviations from the mean) and then take its square root. Many scientific calculators have an option to compute the standard deviation, but working it out by hand helps you understand the spread of your data.

Figure out the mean for each set of data. To do this, find the sum of the data set and divide by the number of results. For example, if you are trying to find the average grade on a math test, add all of the grades together and divide this sum by the number of people who took the test.
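This first step can be sketched in a few lines of Python; the grades here are made-up example data, not values from the article:

```python
# Hypothetical test grades (example data for illustration)
grades = [90, 85, 92, 88, 85]

# Step 1: sum the results and divide by how many there are
mean = sum(grades) / len(grades)
print(mean)  # the average grade
```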

Find the deviation for each result. To do this, subtract the mean of the data set from each individual result. If the mean of your test grades was 88, subtract 88 from each grade to find its deviation.
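Continuing the sketch with the same made-up grades, the deviations are each result minus the mean; note that some deviations are negative and they always sum to zero:

```python
grades = [90, 85, 92, 88, 85]
mean = sum(grades) / len(grades)  # 88.0 for this example

# Step 2: deviation of each result from the mean
deviations = [g - mean for g in grades]
```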

Square each individual deviation and add up the squares.
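Squaring makes every deviation positive, so results above and below the mean both add to the total. With the same example data:

```python
grades = [90, 85, 92, 88, 85]
mean = sum(grades) / len(grades)
deviations = [g - mean for g in grades]

# Step 3: square each deviation and add the squares
sum_sq = sum(d ** 2 for d in deviations)
```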

Take the number of elements in the data set and subtract one. For instance, if you had 20 grades that you were using to compute the average, you would use 19. Dividing by one less than the sample size corrects for the fact that you are working with a sample rather than the whole population.

Divide the total of the squared deviations (from Step 3) by the number from Step 4, then take the square root of the result. This gives the standard deviation for the data set.
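Putting all five steps together, a minimal sketch of the whole calculation looks like this (again using the made-up grades from above):

```python
import math

grades = [90, 85, 92, 88, 85]  # hypothetical example data
n = len(grades)

# Steps 1-3: mean, deviations, sum of squared deviations
mean = sum(grades) / n
sum_sq = sum((g - mean) ** 2 for g in grades)

# Steps 4-5: divide by n - 1, then take the square root
std_dev = math.sqrt(sum_sq / (n - 1))
```

Python's standard library also provides this calculation directly as `statistics.stdev`, which you can use to check your hand computation.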