sjPlot (version 2.0.0)

sjp.poly: Plot polynomials for (generalized) linear regression

Description

This function plots a scatter plot of a term poly.term against a response variable x and adds, depending on the number of values in poly.degree, one or more polynomial curves. A loess-smoothed line can be added to check which of the polynomial curves fits the data best.

Usage

sjp.poly(x, poly.term, poly.degree, poly.scale = FALSE, fun = NULL,
  axis.title = NULL, scatter.plot = TRUE, show.loess = TRUE,
  show.loess.ci = TRUE, show.p = TRUE, geom.colors = NULL,
  geom.size = 0.8, loessLineColor = "#808080", point.color = "#404040",
  point.alpha = 0.2, prnt.plot = TRUE)
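
As a quick orientation for the signature above, here is a minimal call sketch with explicit argument names; it uses the efc sample data from sjmisc, as in the Examples below, and is not taken from the package documentation itself.

library(sjPlot)
library(sjmisc)
data(efc)

# Same call as the third example below, but with named arguments to make
# clear which argument is the response (x) and which is the polynomial
# term (poly.term).
sjp.poly(x = efc$c160age, poly.term = efc$quol_5, poly.degree = 1:4,
         scatter.plot = FALSE)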

Arguments

Value

(Invisibly) returns the ggplot object of the plot.

Details

For each polynomial degree, a simple linear regression on x (resp. the extracted response, if x is a fitted model) is performed, where only the polynomial term poly.term is included as independent variable. Thus, lm(y ~ x + I(x^2) + ... + I(x^i)) is computed for each value in poly.degree, and the predicted values of the response are plotted against the raw values of poly.term. If x is a fitted model, other covariates are ignored when finding the best-fitting polynomial.

This function evaluates raw polynomials, not orthogonal polynomials. Polynomials are computed using the poly function with argument raw = TRUE.

To find out which polynomial degree fits the data best, a loess-smoothed line (in dark grey) can be added (with show.loess = TRUE). The polynomial curve that comes closest to the loess-smoothed line should be the best fit to the data.
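
The sketch below illustrates this computation manually; it is not sjPlot's implementation, but a minimal base-R reproduction of the idea, assuming the efc data and the variables c160age and quol_5 from the Examples below. One raw-polynomial regression is fitted per degree and plotted against a loess smoother for comparison.

library(sjmisc)
data(efc)

# response (x) and polynomial term (poly.term), as in the Examples below
d   <- na.omit(data.frame(y = efc$c160age, term = efc$quol_5))
ord <- order(d$term)

# scatter plot of the polynomial term against the response
plot(d$term, d$y, pch = 16, col = "#40404033",
     xlab = "quol_5 (poly.term)", ylab = "c160age (response)")

# loess-smoothed reference line (dark grey)
lo <- loess(y ~ term, data = d)
lines(d$term[ord], fitted(lo)[ord], col = "#808080", lwd = 2)

# one simple linear regression per polynomial degree, using raw
# (non-orthogonal) polynomials, i.e. lm(y ~ x + I(x^2) + ... + I(x^i))
for (deg in 1:4) {
  fit <- lm(y ~ poly(term, degree = deg, raw = TRUE), data = d)
  lines(d$term[ord], fitted(fit)[ord], col = deg + 1, lwd = 1.5)
}

The polynomial curve that tracks the grey loess line most closely is the candidate degree to use in the actual model.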

See Also

To plot marginal effects of polynomial terms, call sjp.lm with type = "poly" (or sjp.lmer for linear mixed models).

Examples

library(sjmisc)
data(efc)
# linear fit. loess-smoothed line indicates a more
# or less cubic curve
sjp.poly(efc$c160age, efc$quol_5, 1)

# quadratic fit
sjp.poly(efc$c160age, efc$quol_5, 2)

# linear to cubic fit
sjp.poly(efc$c160age, efc$quol_5, 1:4, scatter.plot = FALSE)

library(sjmisc)
data(efc)
# fit sample model
fit <- lm(tot_sc_e ~ c12hour + e17age + e42dep, data = efc)
# inspect relationship between predictors and response
sjp.lm(fit, type = "slope", show.loess = TRUE, scatter.plot = FALSE)
# "e17age" does not seem to be linear correlated to response
# try to find appropiate polynomial. Grey line (loess smoothed)
# indicates best fit. Looks like x^4 has the best fit,
# however, only x^3 has significant p-values.
sjp.poly(fit, "e17age", 2:4, scatter.plot = FALSE)

# fit new model
fit <- lm(tot_sc_e ~ c12hour + e42dep + e17age + I(e17age^2) + I(e17age^3),
          data = efc)
# plot marginal effects of polynomial term
sjp.lm(fit, type = "poly", poly.term = "e17age")
