Statistician and evolutionary biologist Ronald Fisher developed ANOVA, or analysis of variance, as a means to an end: it helps you find out whether the results of an experiment, survey or study support a hypothesis. More precisely, ANOVA tells you whether the differences you observe between group means are large enough to reject the null hypothesis that all of the groups share the same population mean.
What Is ANOVA?
ANOVA is a collection of statistical models, and their related estimation procedures, used to evaluate the differences among group means in a sample. Rather than simply measuring the variation between two known data groups, it partitions the total variation in the data into variation between the groups and variation within them. It offers a statistical test of whether the population means of several sets of data are actually equal, and in doing so it generalizes the t-test, an analysis of two population means, to more than two groups. A t-test shows whether there is a significant difference between two means (or between a sample mean and a hypothesized value); the t-value is the size of that difference relative to the variation in the sample data.
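For exactly two groups, the connection between the two tests is concrete: the one-way ANOVA F statistic equals the square of the pooled two-sample t statistic. A minimal sketch in Python using only the standard library (the data values here are invented for illustration):

```python
from statistics import mean, variance  # variance() is the sample variance (n - 1 divisor)

# Two invented groups of equal size
x = [4.0, 5.0, 6.0, 7.0]
y = [6.0, 7.0, 8.0, 9.0]
n = len(x)  # both groups have the same number of data points here

# Pooled two-sample t statistic
sp2 = ((n - 1) * variance(x) + (n - 1) * variance(y)) / (2 * n - 2)
t = (mean(x) - mean(y)) / (sp2 * (1 / n + 1 / n)) ** 0.5

# One-way ANOVA F statistic for the same two groups
grand = mean(x + y)
ssb = n * ((mean(x) - grand) ** 2 + (mean(y) - grand) ** 2)  # between groups
ssw = (n - 1) * variance(x) + (n - 1) * variance(y)          # within groups
F = (ssb / 1) / (ssw / (2 * n - 2))

print(t ** 2, F)  # the two values match
```

With two groups the two tests always agree, which is why ANOVA only becomes necessary once a third group enters the comparison.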
One Way or Two Way?
The number of independent variables in the analysis of variance test that you use determines if the ANOVA is one or the other. A one-way test has a single independent variable with two or more levels (with exactly two levels, it reduces to a t-test). A two-way analysis of variance test has two independent variables, and each of them can have a multitude of levels. An example of a one-way would be comparing several brands of jelly on a single measure. A two-way would compare brands of jelly crossed with a second factor, such as calorie, fat, sugar or carbohydrate level.
The levels are the different groups within the same independent variable. Replication means collecting more than one observation for each combination of groups. A two-way analysis of variance with replication measures multiple individuals in every combination of the two factors; without replication, each combination contains only a single observation. Two-way ANOVA tests can be completed either way.
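To make the two-way case concrete, here is a sketch of a balanced two-way ANOVA with replication in plain Python, using an invented 2 x 2 layout: two jelly brands crossed with two sugar levels, with two replicate measurements per combination (all numbers are illustrative only):

```python
from statistics import mean

# Invented data: brands (A1, A2) crossed with sugar levels (B1, B2),
# two replicate measurements in every cell.
data = {
    ("A1", "B1"): [3.0, 5.0], ("A1", "B2"): [6.0, 8.0],
    ("A2", "B1"): [4.0, 6.0], ("A2", "B2"): [9.0, 11.0],
}
a_levels = ["A1", "A2"]
b_levels = ["B1", "B2"]
r = 2  # replicates per cell

all_values = [v for cell in data.values() for v in cell]
grand = mean(all_values)
a_mean = {a: mean([v for (ai, _), vs in data.items() if ai == a for v in vs]) for a in a_levels}
b_mean = {b: mean([v for (_, bi), vs in data.items() if bi == b for v in vs]) for b in b_levels}
cell_mean = {k: mean(vs) for k, vs in data.items()}

# Sums of squares for each source of variation
ss_a = len(b_levels) * r * sum((a_mean[a] - grand) ** 2 for a in a_levels)
ss_b = len(a_levels) * r * sum((b_mean[b] - grand) ** 2 for b in b_levels)
ss_ab = r * sum((cell_mean[(a, b)] - a_mean[a] - b_mean[b] + grand) ** 2
                for a in a_levels for b in b_levels)
ss_err = sum((v - cell_mean[k]) ** 2 for k, vs in data.items() for v in vs)

# Degrees of freedom for each source
df_a = len(a_levels) - 1
df_b = len(b_levels) - 1
df_ab = df_a * df_b
df_err = len(a_levels) * len(b_levels) * (r - 1)

# One F statistic per effect, each tested against the within-cell error
f_a = (ss_a / df_a) / (ss_err / df_err)     # main effect of brand
f_b = (ss_b / df_b) / (ss_err / df_err)     # main effect of sugar level
f_ab = (ss_ab / df_ab) / (ss_err / df_err)  # brand-by-sugar interaction
print(f_a, f_b, f_ab)
```

Note that replication is what makes the interaction term testable: with a single observation per cell, there is no within-cell error left to test it against.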
How to Do ANOVA by Hand
Statistical software is available that can quickly and easily compute ANOVA, but there is a benefit to calculating ANOVA by hand. It allows you to understand the individual steps involved and how each one contributes to showing the differences between the groups.
Gather the basic summary statistics of the data that you have collected. These are the individual data points for the first group, labeled “x,” the individual data points for the second group, labeled “y,” and the number of data points in each group, labeled “n.” (This calculation assumes two groups of equal size.)
Add up the data points for the first group; this sum is labeled “SX.” The sum of the second group’s data points is “SY.”
To calculate the correction factor, use the formula C = (SX + SY)^2 / (2n).
Calculate the sum of squares between the groups, SSB = [(SX^2 + SY^2) / n] – C.
Square every individual data point in both groups, then sum all of the squares into a final total, “D.”
Next, calculate the sum of squares total, SST = D – C.
Use the formula SSW = SST – SSB to find the sum of squares within groups.
Figure the degrees of freedom for between the groups, “dfb,” and within the groups, “dfw.”
With two groups, the formula for between groups is dfb = 2 – 1 = 1, and for within groups it is dfw = 2n – 2.
Compute the mean squares by dividing each sum of squares by its degrees of freedom: MSB = SSB / dfb and MSW = SSW / dfw.
Finally, compute the F statistic, F = MSB / MSW.
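The steps above translate directly into a few lines of Python. This sketch assumes two groups of equal size, and the data values are invented for illustration:

```python
# Two-group, equal-n ANOVA computed by hand, following the steps above.
x = [4.0, 5.0, 6.0, 7.0]  # first group
y = [6.0, 7.0, 8.0, 9.0]  # second group
n = len(x)                # data points per group (equal in both groups)

SX = sum(x)               # sum of the first group
SY = sum(y)               # sum of the second group

C = (SX + SY) ** 2 / (2 * n)       # correction factor
SSB = (SX ** 2 + SY ** 2) / n - C  # sum of squares between groups
D = sum(v ** 2 for v in x + y)     # every data point squared, then summed
SST = D - C                        # sum of squares total
SSW = SST - SSB                    # sum of squares within groups

dfb = 1          # degrees of freedom between groups (2 groups - 1)
dfw = 2 * n - 2  # degrees of freedom within groups

MSB = SSB / dfb  # mean square between groups
MSW = SSW / dfw  # mean square within groups
F = MSB / MSW    # the F statistic

print(F)
```

A larger F means the variation between the group means is large compared to the variation within the groups, which is the evidence ANOVA uses against the null hypothesis of equal means.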