Train an Additive Tree for Regression
addtreenow(x, y, max.depth = 5, alpha = 0, lambda = 1,
lambda.seq = NULL, minobsinnode = 2, minobsinnode.lin = 10,
learning.rate = 1, part.minsplit = 2, part.xval = 0,
part.max.depth = 1, part.cp = 0, part.minbucket = 5,
init = mean(y), lin.type = c("glmnet", "cv.glmnet", "lm.ridge",
"glm"), cv.glmnet.nfolds = 5, cv.glmnet.lambda = "lambda.min",
verbose = FALSE, trace = 0, n.cores = rtCores, ...)
x: data.frame of features
y: Numeric vector of outcome, i.e. the dependent variable
max.depth: Integer: Maximum depth of the additive tree. Default = 5
lambda: Float: lambda parameter for MASS::lm.ridge. Default = .01
minobsinnode: Integer: Minimum number of observations needed in a node before considering splitting. Default = 2
part.max.depth: Integer: Maximum depth for each tree model within the additive tree. Default = 1
init: Initial value. Default = mean(y)
verbose: Logical: If TRUE, print summary to screen. Default = FALSE
trace: Integer: If greater than 0, print more information to the console. Default = 0
Note that lambda is treated differently by glmnet::glmnet and MASS::lm.ridge.
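A minimal usage sketch, assuming addtreenow() is available with the signature shown above; the data here is synthetic and the chosen settings are illustrative, not recommended defaults:

```r
## Hypothetical example: fit an additive tree regressor on synthetic data.
## addtreenow() is assumed to be loaded as documented above.
set.seed(2024)
x <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
y <- 2 * x$x1 - x$x2 + rnorm(100, sd = 0.5)  # linear signal plus noise

fit <- addtreenow(x, y,
                  max.depth = 4,        # limit additive tree depth
                  lin.type = "glmnet",  # penalized linear fits in nodes
                  verbose = TRUE)
```

Since lin.type selects the linear solver, remember that the lambda argument is interpreted by whichever backend is chosen (glmnet::glmnet vs. MASS::lm.ridge), per the note above.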