Boosted additive trees. This is a lower-level function than the s.* functions.

Usage:
addtboost(x, y, x.valid = NULL, y.valid = NULL, resid = NULL,
boost.obj = NULL, mod.params = list(), case.p = 1,
learning.rate = 0.1, max.iter = 10, init = mean(y),
cxrcoef = FALSE, print.progress.every = 5,
print.error.plot = "final", base.verbose = FALSE, verbose = TRUE,
trace = 0, prefix = NULL, print.plot = TRUE,
plot.theme = "darkgrid", plot.type = "l", ...)
Arguments:

  x                Data frame: Input features
  y                Vector: Output
  mod              Algorithm to boost; for options, see modSelect
  mod.params       Named list of arguments for mod
  learning.rate    Float (0, 1]: Learning rate for the additive steps
  max.iter         Integer: Maximum number of iterations (additive
                   steps) to perform. Default = 10
  init             Float: Initial value for prediction.
                   Default = mean(y)
  cxrcoef          Logical: If TRUE, pass cxr = TRUE, cxrcoef = TRUE
                   to predict.addTreeRaw
  print.error.plot String or Integer: "final" plots a training and
                   validation (if available) error curve at the end of
                   training. If an integer, plot the training and
                   validation error curves every this many iterations
                   during training, for each base learner
  base.verbose     Logical: verbose argument passed to the base learner
  verbose          Logical: If TRUE, print summary to console
  trace            Integer: If > 0, print diagnostic info to console
  print.plot       Logical: If TRUE, produce plot using mplot3.
                   Takes precedence over plot.fitted and plot.predicted
  plot.theme       String: "zero", "dark", "box", or "darkbox"
  ...              Additional parameters to be passed to the learner

  Two further stopping criteria are described but their argument names
  do not appear in the usage above:
                   Float: If training error <= this value, training
                   stops
                   Float: If validation error <= this value, training
                   stops

Value:

  addtboost object
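The additive-step arguments above (learning.rate, max.iter, init = mean(y), and the error-based stopping criteria) describe a standard boosting loop: start from a constant prediction, repeatedly fit the base learner to the current residuals, and add a fraction of its predictions. The following is a minimal, language-agnostic sketch of that loop, not the rtemis implementation; all names here (addtboost_sketch, fit_base_learner, tolerance) are illustrative.

```python
import statistics

def addtboost_sketch(x, y, fit_base_learner,
                     learning_rate=0.1, max_iter=10,
                     init=None, tolerance=0.0):
    """Sketch of an additive boosting loop.

    fit_base_learner(x, resid) is assumed to return a callable
    model: model(x) -> predictions. Illustrative only, not the
    rtemis API.
    """
    if init is None:
        init = statistics.mean(y)          # default init = mean(y)
    pred = [init] * len(y)                 # constant initial prediction
    models = []
    for _ in range(max_iter):              # at most max.iter additive steps
        resid = [yi - pi for yi, pi in zip(y, pred)]
        mod = fit_base_learner(x, resid)   # fit base learner to residuals
        # take a small step toward the base learner's predictions
        pred = [p + learning_rate * s for p, s in zip(pred, mod(x))]
        models.append(mod)
        train_mse = statistics.mean(
            (yi - pi) ** 2 for yi, pi in zip(y, pred))
        if train_mse <= tolerance:         # stop early if training error
            break                          # falls below the threshold
    return init, models, pred
```

Prediction on new data would then sum init plus learning_rate times each stored model's output, mirroring how the boosting object accumulates base learners.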