
caret (version 4.69)

train: Fit Predictive Models over Different Tuning Parameters

Description

This function sets up a grid of tuning parameters for a number of classification and regression routines, fits each model and calculates a resampling based performance measure.

Usage

train(x, ...)

## S3 method for class 'default':
train(x, y, method = "rf", preProcess = NULL, ..., weights = NULL,
      metric = ifelse(is.factor(y), "Accuracy", "RMSE"),
      maximize = ifelse(metric == "RMSE", FALSE, TRUE),
      trControl = trainControl(), tuneGrid = NULL, tuneLength = 3)

## S3 method for class 'formula':
train(form, data, ..., weights, subset, na.action, contrasts = NULL)

Arguments

x
a data frame containing training data where samples are in rows and features are in columns.
y
a numeric or factor vector containing the outcome for each sample.
form
A formula of the form y ~ x1 + x2 + ...
data
Data frame from which variables specified in formula are preferentially to be taken.
weights
a numeric vector of case weights. This argument will only affect models that allow case weights.
subset
An index vector specifying the cases to be used in the training sample. (NOTE: If given, this argument must be named.)
na.action
A function to specify the action to be taken if NAs are found. The default action is for the procedure to fail. An alternative is na.omit, which leads to rejection of cases with missing values on any required variable. (NOTE: If given, this argument must be named.)
contrasts
a list of contrasts to be used for some or all of the factors appearing as variables in the model formula.
method
a string specifying which classification or regression model to use. Possible values include ada, bag, bagEarth, bagFDA, blackboost, cforest, ctree, ctree2 and many more; see the listing in the Details section for the complete set.
...
arguments passed to the classification or regression routine (such as randomForest). Errors will occur if values for tuning parameters are passed here.
preProcess
a string vector that defines a pre-processing of the predictor data. Current possibilities are center, scale, spatialSign, pca, ica, and knnImpute. See preProcess.
metric
a string that specifies what summary metric will be used to select the optimal model. By default, possible values are "RMSE" and "Rsquared" for regression and "Accuracy" and "Kappa" for classification. If custom performance metrics are used (via the summaryFunction argument in trainControl), the value of metric should correspond to one of the names produced by that function.
maximize
a logical: should the metric be maximized or minimized?
trControl
a list of values that define how this function acts. See trainControl. (NOTE: If given, this argument must be named.)
tuneGrid
a data frame with possible tuning values. The columns are named the same as the tuning parameters in each method preceded by a period (e.g. .decay, .lambda). Also, a function can be passed to tuneGrid with arguments called len and data; its output should have the same format as the output of createGrid.
tuneLength
an integer denoting the number of levels for each tuning parameter that should be generated by createGrid. (NOTE: If given, this argument must be named.)

Value

A list of class train is returned, containing:

  • modelType: an identifier of the model type.
  • results: a data frame of the training error rate and the values of the tuning parameters.
  • call: the (matched) function call with dots expanded.
  • dots: a list containing any ... values passed to the original call.
  • metric: a string that specifies what summary metric will be used to select the optimal model.
  • trControl: the list of control parameters.
  • finalModel: a fit object using the best parameters.
  • trainingData: a data frame with the data used for fitting.
  • resample: a data frame with columns for each performance metric. Each row corresponds to one resample. If leave-one-out cross-validation or out-of-bag estimation methods are requested, this will be NULL. The returnResamp argument of trainControl controls how much of the resampled results are saved.
  • perfNames: a character vector of the performance metrics that are produced by the summary function.
  • maximize: a logical recycled from the function arguments.
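
These components can be inspected directly on the returned object. A minimal sketch, assuming a fitted object such as knnFit1 from the Examples section below:

knnFit1$modelType   # "Classification" or "Regression"
knnFit1$results     # resampled performance for each candidate tuning value
knnFit1$finalModel  # model fit to the full training set with the best parameters
knnFit1$resample    # per-resample performance (subject to returnResamp)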

Details

train can be used to tune models by picking the complexity parameters that are associated with the optimal resampling statistics. For each model, a grid of parameters (if any) is created and the model is trained on slightly different data for each candidate combination of tuning parameters. For each resampled data set, the performance of the held-out samples is calculated, and the mean and standard deviation are summarized for each combination. The combination with the optimal resampling statistic is chosen, and the entire training set is used to fit the final model with those values.
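
The following is a minimal sketch of that procedure (for illustration only; it is not train's internal code), tuning k for knn3 over bootstrap resamples. The candidate grid and the helper accuracyFor are invented for this example:

library(caret)   # for knn3()
data(iris)

candidateK <- c(1, 3, 5, 7, 9)   # hypothetical grid for the single tuning parameter
set.seed(2)
resamples <- lapply(1:10, function(i) sample(nrow(iris), replace = TRUE))

## fit on a bootstrap sample, measure accuracy on the held-out rows
accuracyFor <- function(k, index) {
  heldOut <- setdiff(seq_len(nrow(iris)), unique(index))
  fit <- knn3(Species ~ ., data = iris[index, ], k = k)
  mean(predict(fit, iris[heldOut, ], type = "class") == iris$Species[heldOut])
}

## mean and standard deviation of held-out accuracy per candidate value
perf <- t(sapply(candidateK, function(k) {
  acc <- sapply(resamples, function(idx) accuracyFor(k, idx))
  c(k = k, Accuracy = mean(acc), AccuracySD = sd(acc))
}))
perf[which.max(perf[, "Accuracy"]), ]   # the chosen value of k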

A variety of models are currently available. The listing below enumerates the models, the value of the method argument, the package(s) used and the tuning parameter(s) used by train, in the format model type: method value (package; tuning parameter(s)).

Generalized linear model: glm (stats; none), glmStepAIC (MASS; none)
Generalized additive model: gam (mgcv; select, method), gamLoess (gam; span, degree), gamSpline (gam; df)
Recursive partitioning: rpart (rpart; maxdepth), ctree (party; mincriterion), ctree2 (party; maxdepth)
Boosted trees: gbm (gbm; interaction.depth, n.trees, shrinkage), blackboost (mboost; maxdepth, mstop), ada (ada; maxdepth, iter, nu)
Boosted regression models: glmboost (mboost; mstop), gamboost (mboost; mstop), logitBoost (caTools; nIter)
Random forests: rf (randomForest; mtry), parRF (randomForest, foreach; mtry), cforest (party; mtry)
Bagging: treebag (ipred; none), bag (caret; vars), logicBag (logicFS; ntrees, nleaves)
Other trees: nodeHarvest (nodeHarvest; maxinter, node), partDSA (partDSA; cut.off.growth, MPD)
Logic regression: logreg (LogicReg; ntrees, treesize)
Elastic net (glm): glmnet (glmnet; alpha, lambda)
Neural networks: nnet (nnet; decay, size), neuralnet (neuralnet; layer1, layer2, layer3), pcaNNet (caret; decay, size)
Projection pursuit regression: ppr (stats; nterms)
Principal component regression: pcr (pls; ncomp)
Independent component regression: icr (caret; n.comp)
Partial least squares: pls (pls, caret; ncomp)
Sparse partial least squares: spls (spls, caret; K, eta, kappa)
Support vector machines: svmLinear (kernlab; C), svmRadial (kernlab; sigma, C), svmPoly (kernlab; scale, degree, C)
Relevance vector machines: rvmLinear (kernlab; none), rvmRadial (kernlab; sigma), rvmPoly (kernlab; scale, degree)
Least squares support vector machines: lssvmRadial (kernlab; sigma)
Gaussian processes: gaussprLinear (kernlab; none), gaussprRadial (kernlab; sigma), gaussprPoly (kernlab; scale, degree)
Linear least squares: lm (stats; none), lmStepAIC (MASS; none)
Robust linear regression: rlm (MASS; none)
Multivariate adaptive regression splines: earth (earth; degree, nprune)
Bagged MARS: bagEarth (caret, earth; degree, nprune)
Rule-based regression: M5Rules (RWeka; pruned)
Penalized linear models: penalized (penalized; lambda1, lambda2), enet (elasticnet; lambda, fraction), lars (lars; fraction), lars2 (lars; steps), enet (elasticnet; fraction), foba (foba; lambda, k)
Supervised principal components: superpc (superpc; n.components, threshold)
Quantile regression forests: qrf (quantregForest; mtry)
Linear discriminant analysis: lda (MASS; none), Linda (rrcov; none)
Quadratic discriminant analysis: qda (MASS; none), QdaCov (rrcov; none)
Stabilized linear discriminant analysis: slda (ipred; none)
Heteroscedastic discriminant analysis: hda (hda; newdim, lambda, gamma)
Stepwise discriminant analysis: stepLDA (klaR; maxvar, direction), stepQDA (klaR; maxvar, direction)
Stepwise diagonal discriminant analysis: sddaLDA (SDDA; none), sddaQDA (SDDA; none)
Shrinkage discriminant analysis: sda (sda; diagonal)
Sparse linear discriminant analysis: sparseLDA (sparseLDA; NumVars, lambda)
Regularized discriminant analysis: rda (klaR; lambda, gamma)
Mixture discriminant analysis: mda (mda; subclasses)
Sparse mixture discriminant analysis: smda (sparseLDA; NumVars, R, lambda)
Penalized discriminant analysis: pda (mda; lambda), pda2 (mda; df)
High dimensional discriminant analysis: hdda (HDclassif; model, threshold)
Flexible discriminant analysis (MARS): fda (mda, earth; degree, nprune)
Bagged FDA: bagFDA (caret, earth; degree, nprune)
Logistic/multinomial regression: multinom (nnet; decay)
Penalized logistic regression: plr (stepPlr; lambda, cp)
Rule-based classification: J48 (RWeka; C), OneR (RWeka; none), PART (RWeka; threshold, pruned), JRip (RWeka; NumOpt)
Logic forests: logforest (LogicForest; none)
Bayesian multinomial probit model: vbmpRadial (vbmp; estimateTheta)
k nearest neighbors: knn3 (caret; k)
Nearest shrunken centroids: pam (pamr; threshold), scrda (rda; alpha, delta)
Naive Bayes: nb (klaR; usekernel)
Generalized partial least squares: gpls (gpls; K.prov)
Learned vector quantization: lvq (class; size, k)
ROC curves: rocc (rocc; xgenes)

By default, the function createGrid is used to define the candidate values of the tuning parameters. The user can also specify their own. To do this, a data frame is created with a column for each tuning parameter in the model. The column names must be the same as those listed above with a leading dot. For example, ncomp would have the column heading .ncomp. This data frame can then be passed to train via the tuneGrid argument.
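
A minimal sketch of a user-specified grid (the candidate values here are arbitrary), using the pls model and the BostonHousing data from the Examples section:

library(mlbench)
data(BostonHousing)

## one column per tuning parameter, named with a leading dot
plsGrid <- data.frame(.ncomp = 1:5)

plsFit <- train(medv ~ ., data = BostonHousing,
                method = "pls",
                tuneGrid = plsGrid)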

In some cases, models may require control arguments. These can be passed via the three dots argument. Note that some models can specify tuning parameters in their control objects. If specified, these values will be superseded by those given in the tuneGrid argument.
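
For example (an illustrative sketch, not one of the original examples), a non-tuning argument such as ntree for randomForest can be passed straight through the three dots:

library(mlbench)
data(BostonHousing)

## ntree is not a tuning parameter for method = "rf" (only mtry is),
## so it is passed through ... to randomForest()
rfFit <- train(medv ~ ., data = BostonHousing,
               method = "rf",
               tuneLength = 3,
               ntree = 1000)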

The vignette entitled "caret Manual -- Model Building" has more details and examples related to this function.

train can be used with "explicit parallelism", where different resamples (e.g. cross-validation groups) can be split up and run on multiple machines or processors. By default, train will use a single processor on the host machine. To use more, the computeFunction and computeArgs arguments in trainControl can be used. computeFunction is used to pass a function that takes arguments named X and FUN. Internally, train will pass the data and modeling functions through using these arguments. By default, train uses lapply. Alternatively, any function that emulates lapply but distributes jobs across multiple machines or processors can be used. Arguments to such a function can be passed (if needed) via the computeArgs argument in trainControl. Examples are given below using the Rmpi package (via snow) and NetworkSpaces (via the nws package).
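
On a single multi-core machine, any lapply-compatible function works the same way. A hedged sketch (not one of the original examples) using mclapply from the multicore package, which takes arguments named X and FUN:

library(multicore)   # provides mclapply(X, FUN, ...); not available on Windows
library(mlbench)
data(BostonHousing)

mcControl <- trainControl(workers = 4,
                          number = 50,
                          computeFunction = mclapply,
                          computeArgs = list(mc.cores = 4))

set.seed(1)
usingMC <- train(medv ~ ., data = BostonHousing,
                 "glmboost",
                 trControl = mcControl)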

References

Kuhn, M. (2008), ``Building Predictive Models in R Using the caret Package,'' Journal of Statistical Software, 28(5) (http://www.jstatsoft.org/v28/i05/)

See Also

trainControl, createGrid, createFolds

Examples

#######################################
## Classification Example

data(iris)
TrainData <- iris[,1:4]
TrainClasses <- iris[,5]

knnFit1 <- train(TrainData, TrainClasses,
                 method = "knn",
                 preProcess = c("center", "scale"),
                 tuneLength = 10,
                 trControl = trainControl(method = "cv"))

knnFit2 <- train(TrainData, TrainClasses,
                 method = "knn",
                 preProcess = c("center", "scale"),
                 tuneLength = 10, 
                 trControl = trainControl(method = "boot"))


library(MASS)
nnetFit <- train(TrainData, TrainClasses,
                 method = "nnet",
                 preProcess = c("center", "scale"), 
                 tuneLength = 2,
                 trace = FALSE,
                 maxit = 100)

#######################################
## Regression Example

library(mlbench)
data(BostonHousing)

lmFit <- train(medv ~ . + rm:lstat,
               data = BostonHousing, 
               "lm")

library(rpart)
rpartFit <- train(medv ~ .,
                  data = BostonHousing,
                  "rpart",
                  tuneLength = 9)

#######################################
## Example with a custom metric

madSummary <- function (data,
                        lev = NULL,
                        model = NULL) 
{
  out <- mad(data$obs - data$pred, 
             na.rm = TRUE)  
  names(out) <- "MAD"
  out
}

robustControl <- trainControl(summaryFunction = madSummary)
marsGrid <- expand.grid(.degree = 1,
                        .nprune = (1:10) * 2)

earthFit <- train(medv ~ .,
                  data = BostonHousing, 
                  "earth",
                  tuneGrid = marsGrid,
                  metric = "MAD",
                  maximize = FALSE,
                  trControl = robustControl)

#######################################
## Parallel Processing Example via MPI

## A function to emulate lapply in parallel
mpiCalcs <- function(X, FUN, ...)
  {
    theDots <- list(...)
    parLapply(theDots$cl, X, FUN)
  }

library(snow)
cl <- makeCluster(5, "MPI")

## 50 bootstrap models distributed across 5 workers
mpiControl <- trainControl(workers = 5,
                           number = 50,
                           computeFunction = mpiCalcs,
                           computeArgs = list(cl = cl))
set.seed(1)
usingMPI <-  train(medv ~ .,
                   data = BostonHousing, 
                   "glmboost",
                   trControl = mpiControl)

################################################
## Parallel Random Forest using foreach and doMPI

library(doMPI)
cl <- startMPIcluster(count = 5, verbose = TRUE)
registerDoMPI(cl)

rfMPI <- train(medv ~ .,
               data = BostonHousing, 
               "parRF")

closeCluster(cl)

#######################################
## Parallel Processing Example via NWS
nwsCalcs <- function(X, FUN, ...)
  {
    theDots <- list(...)
    eachElem(theDots$sObj,
             fun = FUN,
             elementArgs = list(X))
  }

library(nws)
sObj <- sleigh(workerCount = 5)

nwsControl <- trainControl(workers = 5,
                           number = 50,
                           computeFunction = nwsCalcs,
                           computeArgs = list(sObj = sObj))
set.seed(1)
usingNWS <-  train(medv ~ .,
                   data = BostonHousing, 
                   "glmboost",
                   trControl = nwsControl)

close(sObj)


#######################################
## Parallel Random Forest Models using
## the foreach package and MPI

library(doMPI)
cl <- startMPIcluster(2)
registerDoMPI(cl)

set.seed(1)
parallelRF <-  train(medv ~ .,
                     data = BostonHousing, 
                     "parRF")
closeCluster(cl)
