Performs the usual k-fold cross-validation procedure on a given data set, parameter grid and learner.
CV(data, learner, params, fold = 5, verbose = TRUE)
data: The data set as a CVST.data object (a sketch of constructing one by hand follows this list).
learner: The learner as a CVST.learner object.
params: The parameter grid as a CVST.params object.
fold: The number of folds that should be generated for each set of parameters.
verbose: Should the procedure report the performance for each model?
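For data that does not come from one of the package's toy generators, the CVST.data object can be assembled directly. The lines below are a minimal sketch only, assuming constructData(x, y) accepts a feature matrix x and a matching label vector y; the labels here are purely illustrative:

library(CVST)
x = matrix(rnorm(200), ncol = 2)     # 100 observations, 2 features
y = factor(sign(x[, 1] * x[, 2]))    # illustrative binary labels
d = constructData(x, y)              # CVST.data object, as expected by CV()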
Returns the optimal parameter settings as determined by k-fold cross-validation.
M. Stone. Cross-validatory choice and assessment of statistical predictions. Journal of the Royal Statistical Society. Series B, 36(2):111--147, 1974.
Sylvain Arlot and Alain Celisse. A survey of cross-validation procedures for model selection. Statistics Surveys, 4:40--79, 2010.
fastCV, constructData, constructLearner, constructParams
ns = noisySine(100)
svm = constructSVMLearner()
params = constructParams(kernel="rbfdot", sigma=10^(-3:3), nu=c(0.05, 0.1, 0.2, 0.3))
opt = CV(ns, svm, params)
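As a follow-up to the example above, the selected setting can be used to fit a final model on the full data set. This is a sketch only, assuming the returned value is a list of winning parameter settings (so opt[[1]] is a single setting) and that a CVST.learner, as produced by constructLearner, exposes $learn(data, params) and $predict(model, newData):

model = svm$learn(ns, opt[[1]])   # retrain on all data with the selected parameters
test = noisySine(1000)            # fresh data from the same toy generator
pred = svm$predict(model, test)
mean(pred == test$y)              # rough out-of-sample accuracy (assumes labels are stored in $y)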