Degrees of Freedom in a Chi-Square Test


Statistics uses probability to determine how likely an event is to occur. There are many different statistical tests, one of the best known being the Chi-Square test. Like any statistical test, the Chi-Square test must take degrees of freedom into account before a statistical decision is made.

Goodness of Fit

The Chi-Square test compares two types of data: observed data and expected data. It measures what is called the "goodness of fit," the difference between what you would expect and what you actually observed. For example, statistically speaking, if you flip a fair coin 50 times, you should expect 25 heads and 25 tails. Suppose, however, that you actually flip a coin 50 times and it lands on heads 19 times and on tails 31 times. Using this data, a statistician could theorize about why these differences occurred.
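As a minimal sketch, the coin-flip comparison above can be computed directly in Python; the variable names here are illustrative, not from any particular library:

```python
# Chi-Square statistic for the coin-flip example:
# observed 19 heads and 31 tails vs. an expected 25/25 split.
observed = [19, 31]
expected = [25, 25]

# Sum of (observed - expected)^2 / expected over all categories.
chi_square = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

print(chi_square)  # 2.88
```

The larger this statistic, the worse the fit between what was expected and what was observed; a value of 0 would mean the coin landed exactly as predicted.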

Degrees of Freedom

Degrees of freedom measure the number of values in a statistic that are free to vary without influencing the result. Statistical tests, including the Chi-Square, are based on estimates built from various pieces of information, and statisticians use these estimates in formulas that calculate the final result of a statistical analysis. The information used may vary, but there must always be at least one fixed category of information; the remaining categories are free to vary, and their number is the degrees of freedom. This is important because although statistics is a mathematical science, it often rests on hypotheses that are hard to compute exactly.


Calculating degrees of freedom in the Chi-Square test is simple: count the categories in your statistical analysis and subtract one. For example, imagine you are comparing the expected birth rates of elephants with the observed birth rates. The categories are the age of the mother, the age of the father and the sex of the calf, giving you three categories in your study. Subtract one from that to get two degrees of freedom. In general, the more categories a study has, the more degrees of freedom there are to work with in later statistical analysis.
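The subtraction above can be sketched in Python. The category names follow the elephant example in the text, and the critical values are the standard Chi-Square table entries at the 0.05 significance level:

```python
# Degrees of freedom for a Chi-Square test: number of categories minus one.
categories = ["age of mother", "age of father", "sex of calf"]
df = len(categories) - 1
print(df)  # 2

# Standard Chi-Square critical values at the 0.05 significance level,
# indexed by degrees of freedom (taken from a Chi-Square table).
critical_values = {1: 3.841, 2: 5.991, 3: 7.815}
print(critical_values[df])  # 5.991
```

A Chi-Square statistic larger than the critical value for the test's degrees of freedom suggests the difference between observed and expected results is statistically significant.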


Degrees of freedom are important in the Chi-Square test because observed results often differ significantly from expected results, and the degrees of freedom are needed to test different hypothetical explanations. In practice, you can take the data gathered for one analysis and reuse it in another statistical analysis; these follow-up studies may help explain more fully why the expected and observed results differ.
