MachineShop (version 2.8.0)

GLMBoostModel: Gradient Boosting with Linear Models

Description

Gradient boosting for optimizing arbitrary loss functions where component-wise linear models are utilized as base-learners.

Usage

GLMBoostModel(
  family = NULL,
  mstop = 100,
  nu = 0.1,
  risk = c("inbag", "oobag", "none"),
  stopintern = FALSE,
  trace = FALSE
)

Arguments

family

optional Family object. Set automatically according to the class type of the response variable.

mstop

number of initial boosting iterations.

nu

step size or shrinkage parameter between 0 and 1.

risk

method to use in computing the empirical risk for each boosting iteration.

stopintern

logical indicating whether the boosting algorithm stops internally when the out-of-bag risk increases at a subsequent iteration.

trace

logical indicating whether status information is printed during the fitting process.
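For illustration, the constructor can be called with non-default values of the arguments above and the result passed to fit(); this is only a sketch, and the particular values are arbitrary rather than recommendations:

## Sketch only: argument values chosen for illustration, not tuned
library(MachineShop)

model <- GLMBoostModel(mstop = 200, nu = 0.05, risk = "oobag", trace = TRUE)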

Value

MLModel class object.

Details

Response Types:

binary factor, BinomialVariate, NegBinomialVariate, numeric, PoissonVariate, Surv

Automatic Tuning of Grid Parameters:

mstop

Default values for the NULL arguments and further model details can be found in the source links below.
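As a minimal sketch of the automatic tuning of mstop, assuming MachineShop's TunedModel() interface and using an arbitrary example dataset:

## Sketch: wrap the model in TunedModel() so the mstop grid is tuned automatically
library(MachineShop)
data(Pima.tr, package = "MASS")

tuned_fit <- fit(type ~ ., data = Pima.tr, model = TunedModel(GLMBoostModel))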

See Also

glmboost, Family, fit, resample
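
Resampled performance can be estimated along the following lines; this is a sketch assuming the resample() and CVControl() interfaces referenced above, with the dataset chosen only for illustration:

## Sketch: 10-fold cross-validation estimate of predictive performance
library(MachineShop)
data(Pima.tr, package = "MASS")

res <- resample(type ~ ., data = Pima.tr, model = GLMBoostModel,
                control = CVControl(folds = 10))
summary(res)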

Examples

## Requires prior installation of suggested package mboost to run

data(Pima.tr, package = "MASS")

fit(type ~ ., data = Pima.tr, model = GLMBoostModel)