MachineShop (version 3.8.0)

GAMBoostModel: Gradient Boosting with Additive Models

Description

Gradient boosting for optimizing arbitrary loss functions, in which component-wise base learners, e.g., smoothing procedures, are combined additively.

Usage

GAMBoostModel(
  family = NULL,
  baselearner = c("bbs", "bols", "btree", "bss", "bns"),
  dfbase = 4,
  mstop = 100,
  nu = 0.1,
  risk = c("inbag", "oobag", "none"),
  stopintern = FALSE,
  trace = FALSE
)

Value

MLModel class object.

Arguments

family

optional Family object. Set automatically according to the class type of the response variable.

baselearner

character specifying the component-wise base learner to be used.

dfbase

global degrees of freedom for P-spline base learners ("bbs").

mstop

number of initial boosting iterations.

nu

step size or shrinkage parameter between 0 and 1.

risk

method to use in computing the empirical risk for each boosting iteration.

stopintern

logical indicating whether the boosting algorithm should stop internally when the out-of-bag risk increases at a subsequent iteration.

trace

logical indicating whether status information is printed during the fitting process.
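As a sketch of how these arguments fit together (assuming the MachineShop package and its suggested mboost package are installed), a model with a smaller step size and more boosting iterations might be specified as:

```r
## Hypothetical illustration: a GAMBoost model with non-default arguments
## (assumes MachineShop and mboost are installed)
library(MachineShop)

model <- GAMBoostModel(
  baselearner = "bbs",  # P-spline base learners
  dfbase = 4,           # degrees of freedom for each P-spline
  mstop = 200,          # more boosting iterations than the default 100
  nu = 0.05,            # smaller step size (shrinkage)
  risk = "inbag"        # empirical risk computed on the learning sample
)

data(Pima.tr, package = "MASS")
fit(type ~ ., data = Pima.tr, model = model)
```

Note that passing the unquoted constructor name, as in the Examples section below, fits the model with all default argument values.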

Details

Response types:

binary factor, BinomialVariate, NegBinomialVariate, numeric, PoissonVariate, Surv

Automatic tuning of grid parameter:

mstop

Default argument values and further model details can be found in the source links in the See Also section below.
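Because mstop is an automatically tuned grid parameter, it can be optimized over a grid rather than fixed. A hedged sketch (assuming MachineShop and mboost are installed, and using MachineShop's TunedModel wrapper) might look like:

```r
## Hypothetical sketch: tune the mstop grid parameter automatically
## (assumes MachineShop and mboost are installed)
library(MachineShop)

data(Pima.tr, package = "MASS")

## TunedModel selects mstop by resampled performance over its
## automatic grid; grid = 3 requests 3 candidate values
tuned_fit <- fit(
  type ~ .,
  data = Pima.tr,
  model = TunedModel(GAMBoostModel, grid = 3)
)
```

The tuned number of iterations can then be inspected with summary() or as.MLModel() on the fitted object.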

See Also

gamboost, Family, baselearners, fit, resample

Examples

## Requires prior installation of suggested package mboost to run

data(Pima.tr, package = "MASS")

fit(type ~ ., data = Pima.tr, model = GAMBoostModel)