caret (version 3.16)

train: Fit Predictive Models over Different Tuning Parameters

Description

This function sets up a grid of tuning parameters for a number of classification and regression routines, fits each model and calculates a resampling based performance measure.

Usage

train(x, ...)

## S3 method for class 'default':
train(x, y, method = "rf", ...,
      metric = ifelse(is.factor(y), "Accuracy", "RMSE"),
      trControl = trainControl(), tuneGrid = NULL,
      tuneLength = 3)

Arguments

x
a data frame containing training data where samples are in rows and features are in columns.
y
a numeric or factor vector containing the outcome for each sample.
method
a string specifying which classification or regression model to use. Possible values include lm, rda, lda, gbm, rf, nnet, multinom, gpls and lvq; see the table in the Details section for the complete list of models and their method values.
...
arguments passed to the classification or regression routine (such as randomForest). Errors will occur if values for tuning parameters are passed here.
metric
a string that specifies what summary metric will be used to select the optimal model. Possible values are "RMSE" and "Rsquared" for regression and "Accuracy" and "Kappa" for classification. (NOTE: If given, this argument must be named.)
trControl
a list of values that define how this function acts. See trainControl. (NOTE: If given, this argument must be named.)
tuneGrid
a data frame with possible tuning values. The columns are named the same as the tuning parameters in each method, preceded by a period (e.g. .decay, .lambda). See the function createGrid in this package.
tuneLength
an integer denoting the number of levels for each tuning parameter that should be generated by createGrid. (NOTE: If given, this argument must be named.) An illustrative call using these named arguments is sketched below.
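
The following is a minimal sketch, not taken from the package examples, of a call that supplies metric, trControl and tuneLength by name. It assumes the randomForest package is installed; the data set and every setting shown are illustrative choices only.

data(iris)

rfFit <- train(iris[, 1:4], iris$Species,
               method     = "rf",
               metric     = "Kappa",
               trControl  = trainControl(method = "boot", number = 25),
               tuneLength = 3)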

Value

  • A list is returned of class train containing the components listed below; a short sketch of accessing them follows the list.
  • modelType: an identifier of the model type.
  • results: a data frame of the training error rate and the values of the tuning parameters.
  • call: the (matched) function call with dots expanded.
  • dots: a list containing any ... values passed to the original call.
  • metric: a string that specifies what summary metric was used to select the optimal model.
  • trControl: the list of control parameters.
  • finalModel: a fit object using the best parameters.
  • trainingData: a data frame of the training data.
  • resample: a data frame with columns for each performance metric; each row corresponds to one resample. If leave-group-out cross-validation or out-of-bag estimation methods are requested, this will be NULL.
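
As a hedged illustration, assuming rfFit is the object created in the sketch under Arguments, the components above can be examined directly:

rfFit$modelType   # identifier of the model type
rfFit$results     # resampled performance for each candidate tuning parameter value
rfFit$metric      # the summary metric used to select the final model
rfFit$finalModel  # the model fit on the entire training set with the best parameters
rfFit$resample    # per-resample performance, or NULL for LGOCV/out-of-bag estimates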

Details

train can be used to tune models by picking the complexity parameters that are associated with the optimal resampling statistics. For each model, a grid of tuning parameters (if any) is created and the model is trained on slightly different data for each candidate combination of tuning parameters. For each data set, the performance of held-out samples is calculated, and the mean and standard deviation are summarized for each combination. The combination with the optimal resampling statistic is chosen for the final model, and the entire training set is used to fit it.

Currently, the train function does not support model specification via a formula. It assumes that all of the predictors are numeric (perhaps generated by model.matrix).
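
As a brief sketch of that requirement (the data frame dat and its columns are hypothetical, invented for illustration), a factor predictor can be expanded into numeric dummy variables with model.matrix before calling train; the BostonHousing example below does the same.

dat <- data.frame(x1  = rnorm(50),
                  grp = factor(rep(c("a", "b"), 25)),
                  y   = rnorm(50))

# expand the factor into numeric dummy columns, dropping the intercept column
datX <- model.matrix(y ~ ., dat)[, -1]

lmSketch <- train(datX, dat$y, "lm")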

A variety of models are currently available. The table below enumerates the models and the values of the method argument, as well as the complexity parameters used by train.

Model                                       method Value   Package          Tuning Parameter(s)
Recursive partitioning                      rpart          rpart            maxdepth
                                            ctree          party            mincriterion
Boosted Trees                               gbm            gbm              interaction.depth, n.trees, shrinkage
                                            blackboost     mboost           maxdepth, mstop
                                            ada            ada              maxdepth, iter, nu
Boosted regression models                   glmboost       mboost           mstop
                                            gamboost       mboost           mstop
Random forests                              rf             randomForest     mtry
                                            cforest        party            mtry
Bagged Trees                                treebag        ipred            None
Neural networks                             nnet           nnet             decay, size
Partial least squares                       pls            pls, caret       ncomp
Support Vector Machines (RBF)               svmradial      kernlab          sigma, C
Support Vector Machines (polynomial)        svmpoly        kernlab          scale, degree, C
Linear least squares                        lm             stats            None
Multivariate adaptive regression splines    earth          earth            degree, nk
Bagged MARS                                 bagEarth       caret, earth     degree, nk
Elastic Net                                 enet           elasticnet       lambda, fraction
The Lasso                                   enet           elasticnet       fraction
Linear discriminant analysis                lda            MASS             None
Logistic/multinomial regression             multinom       nnet             decay
Regularized discriminant analysis           rda            klaR             lambda, gamma
Flexible discriminant analysis (MARS)       fda            mda, earth       degree, nk
Bagged FDA                                  bagFDA         caret, earth     degree, nk
k nearest neighbors                         knn3           caret            k
Nearest shrunken centroids                  pam            pamr             threshold
Naive Bayes                                 nb             klaR             usekernel
Generalized partial least squares           gpls           gpls             K.prov
Learned vector quantization                 lvq            class            k

By default, the function createGrid is used to define the candidate values of the tuning parameters. The user can also specify their own. To do this, a data frame is created with columns for each tuning parameter in the model. The column names must be the same as those listed in the table above with a leading dot. For example, ncomp would have the column heading .ncomp. This data frame can then be passed to train via the tuneGrid argument.
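
For illustration, a sketch of a user-specified grid for a boosted tree (gbm) model, assuming the gbm package is installed; the simulated data and the grid values are arbitrary and only demonstrate the leading-dot naming and the tuneGrid argument.

library(gbm)

set.seed(1)
simX <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
simY <- simX$x1 + 2 * simX$x2 + rnorm(100)

# one column per tuning parameter, each name carrying a leading dot
gbmGrid <- expand.grid(.interaction.depth = c(1, 3),
                       .n.trees           = c(100, 300),
                       .shrinkage         = 0.1)

gbmFit <- train(simX, simY, "gbm",
                tuneGrid  = gbmGrid,
                trControl = trainControl(method = "cv"))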

In some cases, models may require control arguments. These can be passed via the three dots argument. Note that some models can specify tuning parameters in their control objects; if specified there, these values will be superseded by those in the tuning grid (whether generated by createGrid or supplied via tuneGrid).
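
A small sketch of passing a non-tuning argument through the three dots: here ntree (an arbitrary illustrative value) is forwarded to randomForest, while the tuning parameter mtry is still handled by the tuning grid. The randomForest package is assumed to be installed.

data(iris)

rfNtree <- train(iris[, 1:4], iris$Species, "rf",
                 ntree = 1000,     # passed through ... to randomForest
                 tuneLength = 3)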

The vignette entitled "caret Manual -- Model Building" has more details and examples related to this function.

See Also

createGrid, createFolds

Examples

data(iris)
TrainData <- iris[,1:4]
TrainClasses <- iris[,5]

# tune k over 10 candidate values using cross-validation
knnFit1 <- train(TrainData, TrainClasses, "knn", tuneLength = 10,
   trControl = trainControl(method = "cv"))

# the same model, tuned using bootstrap resampling
knnFit2 <- train(TrainData, TrainClasses, "knn", tuneLength = 10,
   trControl = trainControl(method = "boot"))


library(nnet)
# trace and maxit are passed through ... to nnet()
nnetFit <- train(TrainData, TrainClasses, "nnet",
   tuneLength = 2, trace = FALSE, maxit = 100)

library(mlbench)
data(BostonHousing)

# for illustration, convert the factor predictor (chas) to a numeric dummy variable
trainX <- model.matrix(medv ~ . - 1, BostonHousing)

lmFit <- train(trainX, BostonHousing$medv, "lm")

library(rpart)
# tune maxdepth over 9 candidate values
rpartFit <- train(trainX, BostonHousing$medv, "rpart", tuneLength = 9)
