Definition

In a multiple linear regression model, adjusted R square measures the proportion of the variation in the dependent variable accounted for by the explanatory variables. Unlike R square, adjusted R square adjusts for the degrees of freedom associated with the sums of squares: the residual sum of squares is divided by n - k - 1 (for k explanatory variables) and the total sum of squares by n - 1. Consequently, although the residual sum of squares never increases when a new explanatory variable is added, the residual variance can increase, so adjusted R square can fall when a variable adds little explanatory power. For this reason, adjusted R square is generally considered a more accurate goodness-of-fit measure than R square.
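The relationship described above follows from the usual formula, adjusted R square = 1 - (1 - R²)(n - 1)/(n - k - 1). The sketch below computes both measures for an ordinary least squares fit and shows the penalty at work; the data and the `adjusted_r2` helper are illustrative, not part of any particular library.

```python
import numpy as np

def adjusted_r2(y, X):
    """Return (R^2, adjusted R^2) for an OLS fit of y on X with an intercept."""
    X = np.atleast_2d(np.asarray(X, dtype=float).T).T  # ensure 2-D (n, k)
    Z = np.column_stack([np.ones(len(y)), X])          # add intercept column
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    n, k = len(y), X.shape[1]
    # Adjusted R^2 divides each sum of squares by its degrees of freedom:
    adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)
    return r2, adj

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
noise = rng.normal(size=n)            # irrelevant predictor, unrelated to y
y = 2.0 * x1 + rng.normal(size=n)

r2_a, adj_a = adjusted_r2(y, x1)
r2_b, adj_b = adjusted_r2(y, np.column_stack([x1, noise]))

# R^2 never decreases when a predictor is added, even a useless one...
print(f"one predictor:  R2={r2_a:.4f}  adj R2={adj_a:.4f}")
# ...but adjusted R^2 charges for the lost degree of freedom.
print(f"two predictors: R2={r2_b:.4f}  adj R2={adj_b:.4f}")
```

Because the second model adds a predictor unrelated to the response, its R square ticks up while the degrees-of-freedom penalty pulls adjusted R square back, which is exactly why the adjusted measure is preferred for comparing models of different sizes.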

Points to note when using adjusted R square:

- If adjusted R square is markedly lower than R square, the explanatory variables account for little of the variation relative to their number: either important explanatory variable(s) are missing, or some of the included variables contribute little and cost degrees of freedom.
- When comparing two models with this measure, make sure they use the same dependent variable; adjusted R square values are not comparable across different response variables or transformations of them.