Cross-validated estimation of the empirical misclassification error for boosting parameter selection.
Usage:

cv.mhingeova(x, y, balance=FALSE, K=10, cost = NULL, nu=0.1,
    learner=c("tree", "ls", "sm"), maxdepth=1, m1=200, twinboost = FALSE,
    m2=200, trace=FALSE, plot.it = TRUE, se = TRUE, ...)
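For illustration, a minimal call might look like the sketch below. The toy three-class data and the parameter values are purely illustrative, and the sketch assumes the bst package (which provides cv.mhingeova and mhingeova) is installed.

library(bst)                                            # assumed to provide cv.mhingeova
set.seed(1)
x <- as.data.frame(matrix(rnorm(150 * 5), ncol = 5))    # 150 observations, 5 toy predictors
y <- sample(1:3, 150, replace = TRUE)                   # integer labels 1..C (here C = 3)
cvfit <- cv.mhingeova(x, y, K = 5, nu = 0.1, learner = "tree",
                      maxdepth = 1, m1 = 50, plot.it = TRUE, se = TRUE)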
Value:

An object with the following components:

empirical risks in each cross-validation fold at the boosting iterations
abscissa values at which the CV curve is computed
the CV curve at each value of fraction
the standard error of the CV curve
...
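Continuing the toy sketch under Usage above, the CV curve can be used to choose the number of boosting iterations. The component names cv and fraction used below are assumptions inferred from the descriptions above and are not verified against the package.

## Assumed component names (not verified): cvfit$cv holds the CV curve and
## cvfit$fraction the abscissa (boosting iterations) at which it was evaluated.
best.m1 <- cvfit$fraction[which.min(cvfit$cv)]   # iteration minimizing the CV risk

The selected value could then be passed as m1 to mhingeova for the final fit.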
Arguments:

x: a data frame containing the variables in the model.
y: vector of multi-class responses; y must be an integer vector taking values 1 to C for a C-class problem.
balance: logical value. If TRUE, the K parts are roughly balanced, ensuring that the classes are distributed proportionally among the K parts.
K: number of folds for K-fold cross-validation.
cost: price to pay for a false positive, 0 < cost < 1; the price of a false negative is 1 - cost.
nu: a small number (between 0 and 1) defining the step size or shrinkage parameter.
learner: a character string specifying the component-wise base learner to be used: "ls" linear models, "sm" smoothing splines, "tree" regression trees.
maxdepth: tree depth used when learner="tree".
m1: number of boosting iterations.
twinboost: logical: use twin boosting?
m2: number of twin boosting iterations.
trace: if TRUE, iteration results are printed out.
plot.it: a logical value; if TRUE, plot the estimated risks.
se: a logical value; if TRUE, plot with standard errors.
...: additional arguments.
See Also: mhingeova