How To Interpret Hierarchical Regression
Hierarchical regression is a statistical method for exploring relationships among, and testing hypotheses about, a dependent variable and several independent variables. Linear regression requires a numeric dependent variable; the independent variables may be numeric or categorical. In hierarchical regression, the independent variables are not entered into the regression all at once but in steps (also called stages or blocks). For example, a hierarchical regression of depression (measured on some numeric scale) might enter demographic variables (such as age, sex and ethnic group) in the first stage and other variables (such as scores on other tests) in a second stage.
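The article does not name any particular software, so the following is a minimal sketch of a two-stage setup using Python's pandas and statsmodels; the variable names and data are hypothetical and exist only to give the later steps something concrete to point at.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: a depression score, demographics, and another test score.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "age": rng.integers(18, 80, n),
    "sex": rng.choice(["Female", "Male"], n),
    "ethnic_group": rng.choice(["White", "Black", "Other"], n),
    "test_score": rng.normal(50, 10, n),
})
df["depression"] = 0.1 * df["age"] + 0.2 * df["test_score"] + rng.normal(0, 5, n)

# List "White" first so it becomes the reference level when dummy-coded.
df["ethnic_group"] = pd.Categorical(df["ethnic_group"],
                                    categories=["White", "Black", "Other"])

# Dummy-code the categorical predictors; the dropped level is the reference level.
X = pd.get_dummies(df[["age", "sex", "ethnic_group", "test_score"]],
                   columns=["sex", "ethnic_group"], drop_first=True, dtype=float)
y = df["depression"]

# Stage 1: demographics only.
stage1_cols = [c for c in X.columns if c != "test_score"]
stage1 = sm.OLS(y, sm.add_constant(X[stage1_cols])).fit()

# Stage 2: demographics plus the score on the other test.
stage2 = sm.OLS(y, sm.add_constant(X)).fit()
```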
Interpret the First Stage of the Regression
Step 1
Look at the unstandardized regression coefficient (which may be called B on your output) for each independent variable. For continuous independent variables, this represents the change in the dependent variable for each one-unit change in the independent variable, holding the other predictors constant. In the example, if age had a regression coefficient of 2.1, it would mean that the predicted value of depression increases by 2.1 units for each additional year of age.
For categorical variables, the output should show a regression coefficient for each level of the variable except one; the omitted level is called the reference level. Each coefficient represents the difference on the dependent variable between that level and the reference level. In the example, if the reference ethnic group is "White" and the unstandardized coefficient for "Black" is -1.2, it would mean that the predicted depression score for the "Black" group is 1.2 units lower than for the "White" group, again holding the other predictors constant.
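Continuing the sketch above, the unstandardized coefficients for the first stage can be read from the fitted model's parameters (the names are the hypothetical ones used there, and the numbers will differ from the article's example because the data are made up):

```python
# Unstandardized coefficients (B) for the stage-one model.
print(stage1.params)

# "age": predicted change in depression per additional year of age, holding
# the other predictors constant.
# "ethnic_group_Black": predicted difference in depression between the "Black"
# group and the reference group ("White"), holding the other predictors constant.
```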
Step 2
Look at the standardized coefficients (which may be labeled with the Greek letter beta). A standardized coefficient gives the predicted change in the dependent variable, in standard deviations, for a one-standard-deviation change in the independent variable. Because this puts all the predictors on a common scale, standardized coefficients make it easier to compare the independent variables with one another.
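Many programs print standardized coefficients directly; statsmodels does not, but (continuing the sketch) they can be obtained from the unstandardized ones by rescaling each B by the standard deviations of its predictor and of the dependent variable:

```python
# Standardized coefficients: beta = B * sd(x) / sd(y).
X1 = X[stage1_cols]
betas = stage1.params[stage1_cols] * X1.std() / y.std()
print(betas)

# Each beta is the predicted change in depression, in standard deviations of
# depression, per one-standard-deviation change in that predictor.  (For dummy
# variables this rescaling is less natural, since a one-standard-deviation
# change in a 0/1 variable is not a meaningful unit.)
```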
Step 3
Look at the significance levels, or p-values, for each coefficient (these may be labeled "Pr >" or something similar). These tell you whether the associated variable is statistically significant. This has a very particular meaning that is often misrepresented: the p-value is the probability of obtaining a coefficient this far from 0, in a sample of this size, if the true coefficient in the population from which the sample is drawn were 0. A small p-value (commonly below 0.05) is taken as evidence against that possibility.
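Continuing the sketch, the p-values for the first-stage coefficients are stored alongside the coefficients themselves:

```python
# p-values for the stage-one coefficients.
print(stage1.pvalues)

# A small p-value (commonly below 0.05) means a coefficient this far from zero
# would be unlikely in a sample of this size if the true population coefficient were 0.
```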
Step 4
Look at R squared. This shows what proportion of the variation in the dependent variable is accounted for by the model.
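In the sketch, R squared for the first stage is a single attribute of the fitted model:

```python
# Proportion of the variation in depression accounted for by the stage-one model.
print(stage1.rsquared)

# Adjusted R squared penalizes the model for the number of predictors it uses.
print(stage1.rsquared_adj)
```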
Interpret Later Stages of the Regression, the Change, and the Overall Result
Step 1
Repeat the above for each later stage of the regression.
Step 2
Compare the unstandardized coefficients, standardized coefficients, significance levels and R squared values in each stage with those in the previous stage. These might appear in separate sections of the output or in separate columns of a table. This comparison shows how adding the variables in the second (or later) stage changes the relationships estimated in the first stage; in particular, the change in R squared from one stage to the next shows how much additional variation the newly added variables account for, as in the sketch below.
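Continuing the sketch, one common way to compare stages is to look at the change in R squared and, because the stage-one model is nested in the stage-two model, to test that change with an F-test; a side-by-side table of coefficients shows how the first-stage relationships shift:

```python
# Change in R squared when the stage-two variables are added.
print(stage2.rsquared - stage1.rsquared)

# F-test of whether the added variables significantly improve the model
# (both models must be fit to the same observations).
f_value, p_value, df_diff = stage2.compare_f_test(stage1)
print(f_value, p_value, df_diff)

# Coefficients from both stages, side by side.
print(pd.concat({"stage 1": stage1.params, "stage 2": stage2.params}, axis=1))
```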
Step 3
Look at the entire model, including all the stages. Look at the unstandardized and standardized coefficients and the significance levels for each variable and the R squared for the whole model.
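For the model with all stages entered, the full summary of the stage-two fit in the sketch gathers the coefficients, significance levels and R squared in one table:

```python
# Full regression table for the final model: coefficients, standard errors,
# p-values and R squared.
print(stage2.summary())
```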
Warning
This is a very complex subject.