Usage

train(x, ...)

## S3 method for class 'default':
train(x, y, method = "rf", ...,
      metric = ifelse(is.factor(y), "Accuracy", "RMSE"),
      trControl = trainControl(), tuneGrid = NULL,
      tuneLength = 3)
Arguments

x           a data frame of training data; samples are in rows and
            predictors are in columns.

y           a numeric or factor vector containing the outcome for each
            sample.

method      a string specifying which classification or regression model
            to use. Possible values include lm, rda, lda, gbm, rf, nnet,
            multinom, gpls and lvq; the table below gives the full list.

...         arguments passed to the classification or regression routine
            (such as randomForest). Errors will occur if values for tuning
            parameters are passed here.

metric      a string that specifies what summary metric will be used to
            select the optimal model; the default is "Accuracy" for factor
            outcomes and "RMSE" otherwise (see the usage above).

trControl   a list of values that define how this function acts. See
            trainControl. (NOTE: If given, this argument must be named.)

tuneGrid    a data frame with possible tuning values; see createGrid in
            this package. (NOTE: If given, this argument must be named.)

tuneLength  an integer denoting the number of levels for each tuning
            parameter that should be generated by createGrid. (NOTE: If
            given, this argument must be named.)

Value

A list is returned of class train.
Details

train can be used to tune models by picking the complexity parameters that are associated with the optimal resampling statistics. For each model, a grid of tuning parameters (if any) is created and the model is trained on slightly different data for each candidate combination of tuning parameters. Across each data set, the performance of held-out samples is calculated and the mean and standard deviation are summarized for each combination. The combination with the optimal resampling statistic is chosen as the final model, and the entire training set is used to fit the final model.

Currently, the train function does not support model specification via a formula. It assumes that all of the predictors are numeric (perhaps generated by model.matrix).
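For instance (a minimal sketch with a made-up data frame, not taken from this page), model.matrix can expand factor predictors into numeric dummy columns:

# toy data with one factor predictor; model.matrix() expands the
# factor into numeric dummy columns that train() can accept
df <- data.frame(x1 = rnorm(8), grp = factor(rep(c("a", "b"), 4)))
model.matrix(~ . - 1, df)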
A variety of models are currently available. The table below enumerates the models and the values of the method argument, as well as the complexity parameters used by train.
Model                                     method Value      Package       Tuning Parameter(s)
Recursive partitioning                    rpart             rpart         maxdepth
                                          ctree             party         mincriterion
Boosted Trees                             gbm               gbm           interaction.depth, n.trees, shrinkage
                                          blackboost        mboost        maxdepth, mstop
                                          ada               ada           maxdepth, iter, nu
Boosted regression models                 glmboost          mboost        mstop
                                          gamboost          mboost        mstop
Random forests                            rf                randomForest  mtry
                                          cforest           party         mtry
Bagged Trees                              treebag           ipred         None
Neural networks                           nnet              nnet          decay, size
Partial least squares                     pls               pls           ncomp
Support Vector Machines (RBF)             svmradial         kernlab       sigma, C
Support Vector Machines (polynomial)      svmpoly           kernlab       scale, degree, C
Linear least squares                      lm                stats         None
Multivariate adaptive regression splines  earth             earth         degree, nprune
Bagged MARS                               bagEarth          caret         degree, nprune
Elastic Net                               enet              elasticnet    lambda, fraction
The Lasso                                 enet              elasticnet    fraction
Linear discriminant analysis              lda               MASS          None
Stepwise diagonal discriminant analysis   sddaLDA, sddaQDA  SDDA          None
Multinomial regression                    multinom          nnet          decay
Regularized discriminant analysis         rda               klaR          lambda, gamma
Flexible discriminant analysis (MARS)     fda               mda           degree, nprune
Bagged FDA                                bagFDA            caret         degree, nprune
k nearest neighbors                       knn3              caret         k
Nearest shrunken centroids                pam               pamr          threshold
Naive Bayes                               nb                klaR          usekernel
Generalized partial least squares         gpls              gpls          K.prov
Learned vector quantization               lvq               class         k
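For instance, picking the regularized discriminant analysis row from the table (an illustrative sketch, not from this page; it assumes the klaR package is installed):

data(iris)
# rda has two tuning parameters, lambda and gamma; tuneLength = 3
# requests three candidate values for each
rdaFit <- train(iris[, 1:4], iris[, 5], "rda", tuneLength = 3)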
By default, the function createGrid is used to define the candidate values of the tuning parameters. The user can also specify their own. To do this, a data frame is created with columns for each tuning parameter in the model. The column names must be the same as those listed in the table above with a leading dot. For example, ncomp would have the column heading .ncomp. This data frame can then be passed to train via the tuneGrid argument.
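For example, a hand-built grid for k nearest neighbors might look like this (an illustrative sketch; it uses the same "knn" method string as the examples below):

data(iris)
# one column per tuning parameter, named with a leading dot
knnGrid <- data.frame(.k = seq(5, 13, by = 2))
knnFit <- train(iris[, 1:4], iris[, 5], "knn",
                tuneGrid = knnGrid,
                trControl = trainControl(method = "cv"))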
In some cases, models may require control arguments. These can be passed via the three dots argument. Note that some models can specify tuning parameters in the control objects; if specified, these values will be superseded by those given in the tuneGrid argument.
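For instance, randomForest's ntree argument (which is not a tuning parameter) can be supplied this way (a minimal sketch):

data(iris)
# ntree is passed through to randomForest(); it is not a tuning
# parameter, so it does not clash with the tuning grid
rfFit <- train(iris[, 1:4], iris[, 5], "rf", ntree = 500)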
The vignette entitled "caret Manual -- Model Building" has more details and examples related to this function.
See Also

createGrid, createFolds

Examples
data(iris)
TrainData <- iris[, 1:4]
TrainClasses <- iris[, 5]

# k nearest neighbors, tuned over 10 values of k with cross-validation
knnFit1 <- train(TrainData, TrainClasses, "knn", tuneLength = 10,
                 trControl = trainControl(method = "cv"))

# the same model, tuned with bootstrap resampling
knnFit2 <- train(TrainData, TrainClasses, "knn", tuneLength = 10,
                 trControl = trainControl(method = "boot"))

# extra arguments (trace, maxit) are passed through to nnet()
library(MASS)
nnetFit <- train(TrainData, TrainClasses, "nnet",
                 tuneLength = 2, trace = FALSE, maxit = 100)

library(mlbench)
data(BostonHousing)

# for illustration, converting factors to numeric dummy variables
trainX <- model.matrix(medv ~ . - 1, BostonHousing)
lmFit <- train(trainX, BostonHousing$medv, "lm")

library(rpart)
rpartFit <- train(trainX, BostonHousing$medv, "rpart", tuneLength = 9)