olsrr (version 0.5.3)

ols_coll_diag: Collinearity diagnostics

Description

Variance inflation factor, tolerance, eigenvalues and condition indices.

Usage

ols_coll_diag(model)

ols_vif_tol(model)

ols_eigen_cindex(model)

Value

ols_coll_diag returns an object of class "ols_coll_diag", which is a list containing the following components:

vif_t

tolerance and variance inflation factors

eig_cindex

eigenvalues and condition indices
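
For example, the individual components can be pulled out of the returned list in the usual way (a small sketch, assuming the mtcars model used in the Examples section):

library(olsrr)

# fit the model from the Examples section and run the full diagnostics
model <- lm(mpg ~ disp + hp + wt + drat, data = mtcars)
k <- ols_coll_diag(model)

k$vif_t       # tolerance and variance inflation factors
k$eig_cindex  # eigenvalues and condition indices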

Arguments

model

An object of class lm.

Details

Collinearity implies that two variables are near-perfect linear combinations of one another. Multicollinearity involves more than two variables. In the presence of multicollinearity, regression estimates are unstable and have high standard errors.

Tolerance

The percentage of variance in a predictor that cannot be accounted for by the other predictors.

Steps to calculate tolerance (a hand-worked sketch follows the list):

  • Regress the kth predictor on the rest of the predictors in the model.

  • Compute \(R^2\) - the coefficient of determination from the regression in the above step.

  • \(Tolerance = 1 - R^2\)
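
As a sketch of these steps, assuming the mtcars model used in the Examples section and taking wt as the kth predictor:

# auxiliary regression: wt on the remaining predictors
aux <- lm(wt ~ disp + hp + drat, data = mtcars)
r2  <- summary(aux)$r.squared   # coefficient of determination
1 - r2                          # tolerance for wt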

Variance Inflation Factor

Variance inflation factors measure the inflation in the variances of the parameter estimates due to collinearities that exist among the predictors. It is a measure of how much the variance of the estimated regression coefficient \(\beta_k\) is inflated by the existence of correlation among the predictor variables in the model. A VIF of 1 means that there is no correlation between the kth predictor and the remaining predictor variables, and hence the variance of \(\beta_k\) is not inflated at all. The general rule of thumb is that VIFs exceeding 4 warrant further investigation, while VIFs exceeding 10 are signs of serious multicollinearity requiring correction.

Steps to calculate VIF (a sketch follows the list):

  • Regress the kth predictor on the rest of the predictors in the model.

  • Compute \(R^2\) - the coefficient of determination from the regression in the above step.

  • \(VIF = 1 / (1 - R^2) = 1 / Tolerance\)
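
Continuing the same sketch (again using wt from the mtcars model), VIF is simply the reciprocal of tolerance:

# VIF for wt: reciprocal of its tolerance
aux <- lm(wt ~ disp + hp + drat, data = mtcars)
r2  <- summary(aux)$r.squared
1 / (1 - r2)                    # should match the VIF column of ols_vif_tol(model)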

Condition Index

Most multivariate statistical approaches involve decomposing a correlation matrix into linear combinations of variables. The linear combinations are chosen so that the first combination has the largest possible variance (subject to some restrictions), the second has the next largest variance subject to being uncorrelated with the first, the third has the next largest variance subject to being uncorrelated with the first and second, and so forth. The variance of each of these linear combinations is called an eigenvalue. The condition index of a dimension is the square root of the ratio of the largest eigenvalue to the eigenvalue of that dimension. Collinearity is spotted by finding two or more variables that have large proportions of variance (0.50 or more) corresponding to a large condition index. A rule of thumb is to label as large those condition indices of about 30 or more.
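
The following sketch shows one way to obtain eigenvalues and condition indices by hand, assuming Belsley-style scaling of the model matrix (columns scaled to unit length, intercept included); ols_eigen_cindex() reports analogous quantities:

model <- lm(mpg ~ disp + hp + wt + drat, data = mtcars)
X  <- model.matrix(model)                                   # includes the intercept column
X  <- scale(X, center = FALSE, scale = sqrt(colSums(X^2)))  # scale columns to unit length
ev <- eigen(crossprod(X))$values                            # eigenvalues of the scaled X'X
sqrt(max(ev) / ev)                                          # condition indices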

References

Belsley, D. A., Kuh, E., and Welsch, R. E. (1980). Regression Diagnostics: Identifying Influential Data and Sources of Collinearity. New York: John Wiley & Sons.

Examples

library(olsrr)

# model
model <- lm(mpg ~ disp + hp + wt + drat, data = mtcars)

# vif and tolerance
ols_vif_tol(model)

# eigenvalues and condition indices
ols_eigen_cindex(model)

# collinearity diagnostics
ols_coll_diag(model)
