Gradient boosting for optimization of arbitrary loss functions, in which component-wise linear models are used as base learners.
Usage:

GLMBoostModel(
  family = NULL,
  mstop = 100,
  nu = 0.1,
  risk = c("inbag", "oobag", "none"),
  stopintern = FALSE,
  trace = FALSE
)
Value:

MLModel class object.
Arguments:

family: optional Family object. Set automatically according to the class type of the response variable.

mstop: number of initial boosting iterations.

nu: step size or shrinkage parameter between 0 and 1.

risk: method to use in computing the empirical risk for each boosting iteration.

stopintern: logical indicating whether the boosting algorithm stops internally when the out-of-bag risk increases at a subsequent iteration.

trace: logical indicating whether status information is printed during the fitting process.
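A hypothetical sketch of specifying non-default arguments: a smaller step size (nu) typically calls for more boosting iterations (mstop). This assumes the MachineShop package is installed; the argument values are illustrative, not recommendations.

```r
## Load the modeling package (assumed installed)
library(MachineShop)

## Construct an MLModel with a slower learning rate and more iterations
## than the defaults (mstop = 100, nu = 0.1)
model <- GLMBoostModel(mstop = 500, nu = 0.01)

## Fit to the Pima.tr dataset from the MASS package
data(Pima.tr, package = "MASS")
model_fit <- fit(type ~ ., data = Pima.tr, model = model)
```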
Response types: binary factor, BinomialVariate, NegBinomialVariate, numeric, PoissonVariate, Surv
Default argument values and further model details can be found in the source links in the See Also section below.
See Also: glmboost, Family, fit, resample
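Since resample is listed among the related functions, a hedged sketch of estimating the model's predictive performance with cross-validation may be useful. This assumes the MachineShop and mboost packages are installed; the choice of 5 folds is illustrative.

```r
## Estimate predictive performance via cross-validation (assumes
## MachineShop and the suggested mboost package are installed)
library(MachineShop)

data(Pima.tr, package = "MASS")

## 5-fold cross-validation resampling of GLMBoostModel
res <- resample(type ~ ., data = Pima.tr, model = GLMBoostModel,
                control = CVControl(folds = 5))

## Summarize resampled performance metrics
summary(res)
```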
Examples:

## Requires prior installation of suggested package mboost to run

data(Pima.tr, package = "MASS")

fit(type ~ ., data = Pima.tr, model = GLMBoostModel)