Description:

Cross-validated estimation of the empirical multi-class loss, which can
be used for tuning parameter selection.
Usage:

cv.rmbst(x, y, balance = FALSE, K = 10, cost = NULL,
         rfamily = c("thinge", "closs"), learner = c("tree", "ls", "sm"),
         ctrl = bst_control(), type = c("loss", "error"),
         plot.it = TRUE, main = NULL, se = TRUE, n.cores = 2, ...)
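A minimal sketch of a call, not taken from the package documentation
(the data sizes, seed and mstop value below are illustrative
assumptions):

    library(bst)
    set.seed(123)
    x <- data.frame(matrix(rnorm(100 * 5), ncol = 5))
    y <- sample(1:3, 100, replace = TRUE)  # class labels must be coded 1..C
    cvm <- cv.rmbst(x, y, K = 5, rfamily = "thinge", learner = "ls",
                    ctrl = bst_control(mstop = 25), type = "loss",
                    plot.it = FALSE, n.cores = 1)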
Value:

An object with components:

residmat: empirical risks in each cross-validation fold at the boosting
iterations.

fraction: abscissa values at which the CV curve is computed.

cv: the CV curve at each value of fraction.

cv.error: the standard error of the CV curve.

...
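A short sketch of inspecting the returned object, reusing cvm from the
call above and assuming the component names listed here:

    ## boosting iteration at which the cross-validated loss is smallest
    best.iter <- cvm$fraction[which.min(cvm$cv)]
    ## one-standard-error band around the CV curve
    band <- cbind(lower = cvm$cv - cvm$cv.error,
                  upper = cvm$cv + cvm$cv.error)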
Arguments:

x: a data frame containing the variables in the model.

y: vector of responses; y must take integer values from 1 to C for a
C-class problem.

balance: logical value. If TRUE, the K parts are roughly balanced,
ensuring that the classes are distributed proportionally among the K
parts.

K: number of folds for K-fold cross-validation.

cost: price to pay for a false positive, 0 < cost < 1; the price of a
false negative is 1 - cost.

rfamily: rfamily = "thinge" for the truncated multi-class hinge loss.
Implements the negative gradient corresponding to the loss function to
be minimized.

learner: a character string specifying the component-wise base learner
to be used: "ls" for linear models, "sm" for smoothing splines, "tree"
for regression trees.

ctrl: an object of class bst_control.

type: whether to compute the cross-validated loss value or the
misclassification error.

plot.it: a logical value; if TRUE, plot the cross-validated estimate of
the loss or error.

main: title of the plot.

se: a logical value; if TRUE, plot with standard errors.

n.cores: the number of CPU cores to use; the cross-validation loop
attempts to send different CV folds to different cores.

...: additional arguments.
Author(s):

Zhu Wang
See Also:

rmbst
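Examples:

A hedged end-to-end sketch of the typical workflow on synthetic data;
it assumes rmbst accepts the same x, y, rfamily, learner and ctrl
arguments as cv.rmbst, and uses the component names listed under Value:

    library(bst)
    set.seed(1)
    x <- data.frame(matrix(rnorm(150 * 4), ncol = 4))
    y <- sample(1:3, 150, replace = TRUE)  # 3-class responses, coded 1..C
    ## cross-validate the truncated multi-class hinge loss
    cvres <- cv.rmbst(x, y, K = 5, rfamily = "thinge", learner = "tree",
                      ctrl = bst_control(mstop = 40), type = "loss",
                      plot.it = FALSE, n.cores = 1)
    ## pick the boosting iteration minimizing the cross-validated loss
    best <- cvres$fraction[which.min(cvres$cv)]
    ## refit on the full data with the selected number of iterations
    fit <- rmbst(x, y, rfamily = "thinge", learner = "tree",
                 ctrl = bst_control(mstop = best))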