These models are included in the package via wrappers for train(). Custom models can also be created; see "Using your own model in train" (linked at the end of this list).
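As a quick orientation (not part of the list itself), the sketch below shows how an entry maps onto a train() call: the quoted method value goes to train()'s method argument, and the tuneGrid column names must match the parameter names listed for that method (here k for 'knn'). In this and the later examples, the data sets, candidate values, and resampling choices are illustrative only, and each example assumes the listed packages are installed.

    library(caret)

    # 'knn' is listed below with a single numeric tuning parameter, k.
    # The tuneGrid column name must match that parameter name exactly.
    set.seed(1)
    fit <- train(Species ~ ., data = iris,
                 method = "knn",
                 tuneGrid = data.frame(k = c(3, 5, 7, 9)))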
AdaBoost Classification Trees (method = 'adaboost')
For classification using package fastAdaboost with tuning parameters:
Number of Trees (nIter, numeric)
Method (method, character)
AdaBoost.M1 (method = 'AdaBoost.M1')
For classification using packages adabag and plyr with tuning parameters:
Number of Trees (mfinal, numeric)
Max Tree Depth (maxdepth, numeric)
Coefficient Type (coeflearn, character)
Adaptive Mixture Discriminant Analysis (method = 'amdai')
For classification using package adaptDA with tuning parameters:
Model Type (model, character)
Adaptive-Network-Based Fuzzy Inference System (method = 'ANFIS')
For regression using package frbs with tuning parameters:
Number of Fuzzy Terms (num.labels, numeric)
Max. Iterations (max.iter, numeric)
Adjacent Categories Probability Model for Ordinal Data (method = 'vglmAdjCat')
For classification using package VGAM with tuning parameters:
Parallel Curves (parallel, logical)
Link Function (link, character)
Bagged AdaBoost (method = 'AdaBag')
For classification using packages adabag and plyr with tuning parameters:
Number of Trees (mfinal, numeric)
Max Tree Depth (maxdepth, numeric)
Bagged CART (method = 'treebag')
For classification and regression using packages ipred, plyr and e1071 with no tuning parameters
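A minimal sketch for entries marked "with no tuning parameters", such as treebag above: no tuneGrid is needed, and only the resampling scheme is chosen. The data set and 5-fold cross-validation are illustrative assumptions.

    library(caret)

    # treebag has no tuning parameters, so only trControl is specified.
    set.seed(1)
    fit_bag <- train(Species ~ ., data = iris,
                     method = "treebag",
                     trControl = trainControl(method = "cv", number = 5))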
Bagged FDA using gCV Pruning (method = 'bagFDAGCV')
For classification using package earth with tuning parameters:
Product Degree (degree, numeric)
Bagged Flexible Discriminant Analysis (method = 'bagFDA')
For classification using packages earth and mda with tuning parameters:
Product Degree (degree, numeric)
Number of Terms (nprune, numeric)
Bagged Logic Regression (method = 'logicBag')
For classification and regression using package logicFS with tuning parameters:
Maximum Number of Leaves (nleaves, numeric)
Number of Trees (ntrees, numeric)
Bagged MARS (method = 'bagEarth')
For classification and regression using package earth with tuning parameters:
Number of Terms (nprune, numeric)
Product Degree (degree, numeric)
Bagged MARS using gCV Pruning (method = 'bagEarthGCV')
For classification and regression using package earth with tuning parameters:
Product Degree (degree, numeric)
Bagged Model (method = 'bag')
For classification and regression using package caret with tuning parameters:
Number of Randomly Selected Predictors (vars, numeric)
Bayesian Additive Regression Trees (method = 'bartMachine')
For classification and regression using package bartMachine with tuning parameters:
Number of Trees (num_trees, numeric)
Prior Boundary (k, numeric)
Base Terminal Node Hyperparameter (alpha, numeric)
Power Terminal Node Hyperparameter (beta, numeric)
Degrees of Freedom (nu, numeric)
Bayesian Generalized Linear Model (method = 'bayesglm')
For classification and regression using package arm with no tuning parameters
Bayesian Regularized Neural Networks (method = 'brnn')
For regression using package brnn with tuning parameters:
Number of Neurons (neurons, numeric)
Bayesian Ridge Regression (method = 'bridge')
For regression using package monomvn with no tuning parameters
Bayesian Ridge Regression (Model Averaged) (method = 'blassoAveraged')
For regression using package monomvn with no tuning parameters
Binary Discriminant Analysis (method = 'binda')
For classification using package binda with tuning parameters:
Shrinkage Intensity (lambda.freqs, numeric)
Boosted Classification Trees (method = 'ada')
For classification using packages ada and plyr with tuning parameters:
Number of Trees (iter, numeric)
Max Tree Depth (maxdepth, numeric)
Learning Rate (nu, numeric)
Boosted Generalized Additive Model (method = 'gamboost')
For classification and regression using packages mboost and plyr with tuning parameters:
Number of Boosting Iterations (mstop, numeric)
AIC Prune? (prune, character)
Boosted Generalized Linear Model (method = 'glmboost')
For classification and regression using packages plyr and mboost with tuning parameters:
Number of Boosting Iterations (mstop, numeric)
AIC Prune? (prune, character)
Boosted Linear Model (method = 'BstLm')
For classification and regression using packages bst and plyr with tuning parameters:
Number of Boosting Iterations (mstop, numeric)
Shrinkage (nu, numeric)
Boosted Logistic Regression (method = 'LogitBoost')
For classification using package caTools with tuning parameters:
Number of Boosting Iterations (nIter, numeric)
Boosted Smoothing Spline (method = 'bstSm')
For classification and regression using packages bst and plyr with tuning parameters:
Number of Boosting Iterations (mstop, numeric)
Shrinkage (nu, numeric)
Boosted Tree (method = 'blackboost')
For classification and regression using packages party, mboost and plyr with tuning parameters:
Number of Trees (mstop, numeric)
Max Tree Depth (maxdepth, numeric)
Boosted Tree (method = 'bstTree')
For classification and regression using packages bst and plyr with tuning parameters:
Number of Boosting Iterations (mstop, numeric)
Max Tree Depth (maxdepth, numeric)
Shrinkage (nu, numeric)
C4.5-like Trees (method = 'J48')
For classification using package RWeka with tuning parameters:
Confidence Threshold (C, numeric)
Minimum Instances Per Leaf (M, numeric)
C5.0 (method = 'C5.0')
For classification using packages C50 and plyr with tuning parameters:
Number of Boosting Iterations (trials, numeric)
Model Type (model, character)
Winnow (winnow, logical)
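A sketch of a grid for the C5.0 entry above, mixing the three parameter types it lists: numeric (trials), character (model) and logical (winnow). The candidate values and data set are illustrative assumptions.

    library(caret)

    # Columns of the grid must match the listed parameter names and types.
    c50_grid <- expand.grid(trials = c(1, 10, 20),
                            model  = c("tree", "rules"),
                            winnow = c(TRUE, FALSE),
                            stringsAsFactors = FALSE)
    set.seed(1)
    fit_c50 <- train(Species ~ ., data = iris,
                     method = "C5.0",
                     tuneGrid = c50_grid,
                     trControl = trainControl(method = "cv", number = 5))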
CART (method = 'rpart')
For classification and regression using package rpart with tuning parameters:
Complexity Parameter (cp, numeric)
CART (method = 'rpart1SE')
For classification and regression using package rpart with no tuning parameters
CART (method = 'rpart2')
For classification and regression using package rpart with tuning parameters:
Max Tree Depth (maxdepth, numeric)
CART for Ordinal Responses (method = 'rpartScore')
For classification using packages rpartScore and plyr with tuning parameters:
Complexity Parameter (cp, numeric)
Split Function (split, character)
Pruning Measure (prune, character)
Chi-squared Automated Interaction Detection (method = 'chaid')
For classification using package CHAID with tuning parameters:
Merging Threshold (alpha2, numeric)
Splitting former Merged Threshold (alpha3, numeric)
Splitting former Merged Threshold (alpha4, numeric)
Conditional Inference Random Forest (method = 'cforest')
For classification and regression using package party with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Conditional Inference Tree (method = 'ctree')
For classification and regression using package party with tuning parameters:
1 - P-Value Threshold (mincriterion, numeric)
Conditional Inference Tree (method = 'ctree2')
For classification and regression using package party with tuning parameters:
Max Tree Depth (maxdepth, numeric)
1 - P-Value Threshold (mincriterion, numeric)
Continuation Ratio Model for Ordinal Data (method = 'vglmContRatio')
For classification using package VGAM with tuning parameters:
Parallel Curves (parallel, logical)
Link Function (link, character)
Cost-Sensitive C5.0 (method = 'C5.0Cost')
For classification using packages C50 and plyr with tuning parameters:
Number of Boosting Iterations (trials, numeric)
Model Type (model, character)
Winnow (winnow, logical)
Cost (cost, numeric)
Cost-Sensitive CART (method = 'rpartCost')
For classification using package rpart with tuning parameters:
Complexity Parameter (cp, numeric)
Cost (Cost, numeric)
Cubist (method = 'cubist')
For regression using package Cubist with tuning parameters:
Number of Committees (committees, numeric)
Number of Instances (neighbors, numeric)
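An illustrative regression fit for the Cubist entry above; committees and neighbors are the two listed parameters (neighbors is restricted to 0 through 9 by the Cubist package). The mtcars data and grid values are assumptions for the sketch.

    library(caret)

    # Regression methods take a numeric outcome on the left of the formula.
    set.seed(1)
    fit_cubist <- train(mpg ~ ., data = mtcars,
                        method = "cubist",
                        tuneGrid = expand.grid(committees = c(1, 10, 50),
                                               neighbors  = c(0, 5, 9)))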
Cumulative Probability Model for Ordinal Data (method = 'vglmCumulative')
For classification using package VGAM with tuning parameters:
Parallel Curves (parallel, logical)
Link Function (link, character)
DeepBoost (method = 'deepboost')
For classification using package deepboost with tuning parameters:
Number of Boosting Iterations (num_iter, numeric)
Tree Depth (tree_depth, numeric)
L1 Regularization (beta, numeric)
Tree Depth Regularization (lambda, numeric)
Loss (loss_type, character)
Diagonal Discriminant Analysis (method = 'dda')
For classification using package sparsediscrim with tuning parameters:
Model (model, character)
Shrinkage Type (shrinkage, character)
Distance Weighted Discrimination with Polynomial Kernel (method = 'dwdPoly')
For classification using package kerndwd with tuning parameters:
Regularization Parameter (lambda, numeric)
q (qval, numeric)
Polynomial Degree (degree, numeric)
Scale (scale, numeric)
Distance Weighted Discrimination with Radial Basis Function Kernel (method = 'dwdRadial')
For classification using packages kernlab and kerndwd with tuning parameters:
Regularization Parameter (lambda, numeric)
q (qval, numeric)
Sigma (sigma, numeric)
Dynamic Evolving Neural-Fuzzy Inference System (method = 'DENFIS')
For regression using package frbs with tuning parameters:
Threshold (Dthr, numeric)
Max. Iterations (max.iter, numeric)
Elasticnet (method = 'enet')
For regression using package elasticnet with tuning parameters:
Fraction of Full Solution (fraction, numeric)
Weight Decay (lambda, numeric)
Ensemble Partial Least Squares Regression (method = 'enpls')
For regression using package enpls with tuning parameters:
Max. Number of Components (maxcomp, numeric)
Ensemble Partial Least Squares Regression with Feature Selection (method = 'enpls.fs')
For regression using package enpls with tuning parameters:
Max. Number of Components (maxcomp, numeric)
Importance Cutoff (threshold, numeric)
Ensembles of Generalized Linear Models (method = 'randomGLM')
For classification and regression using package randomGLM with tuning parameters:
Interaction Order (maxInteractionOrder, numeric)
eXtreme Gradient Boosting (method = 'xgbLinear')
For classification and regression using package xgboost with tuning parameters:
Number of Boosting Iterations (nrounds, numeric)
L2 Regularization (lambda, numeric)
L1 Regularization (alpha, numeric)
Learning Rate (eta, numeric)
eXtreme Gradient Boosting (method = 'xgbTree')
For classification and regression using packages xgboost and plyr with tuning parameters:
Number of Boosting Iterations (nrounds, numeric)
Max Tree Depth (max_depth, numeric)
Shrinkage (eta, numeric)
Minimum Loss Reduction (gamma, numeric)
Subsample Ratio of Columns (colsample_bytree, numeric)
Minimum Sum of Instance Weight (min_child_weight, numeric)
Subsample Percentage (subsample, numeric)
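A sketch of a grid covering all seven xgbTree parameters listed above. Full grids grow multiplicatively, so most parameters are held at a single assumed value here; the data set and values are illustrative only.

    library(caret)

    # Every listed parameter must appear as a column, even if fixed.
    xgb_grid <- expand.grid(nrounds = c(100, 200),
                            max_depth = c(2, 4),
                            eta = 0.1,
                            gamma = 0,
                            colsample_bytree = 0.8,
                            min_child_weight = 1,
                            subsample = 0.75)
    set.seed(1)
    fit_xgb <- train(Species ~ ., data = iris,
                     method = "xgbTree",
                     tuneGrid = xgb_grid,
                     trControl = trainControl(method = "cv", number = 5))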
Extreme Learning Machine (method = 'elm')
For classification and regression using package elmNN with tuning parameters:
Number of Hidden Units (nhid, numeric)
Activation Function (actfun, character)
Factor-Based Linear Discriminant Analysis (method = 'RFlda')
For classification using package HiDimDA with tuning parameters:
Number of Factors (q, numeric)
Flexible Discriminant Analysis (method = 'fda')
For classification using packages earth and mda with tuning parameters:
Product Degree (degree, numeric)
Number of Terms (nprune, numeric)
Fuzzy Inference Rules by Descent Method (method = 'FIR.DM')
For regression using package frbs with tuning parameters:
Number of Fuzzy Terms (num.labels, numeric)
Max. Iterations (max.iter, numeric)
Fuzzy Rules Using Chi's Method (method = 'FRBCS.CHI')
For classification using package frbs with tuning parameters:
Number of Fuzzy Terms (num.labels, numeric)
Membership Function (type.mf, character)
Fuzzy Rules Using Genetic Cooperative-Competitive Learning (method = 'GFS.GCCL')
For classification using package frbs with tuning parameters:
Number of Fuzzy Terms (num.labels, numeric)
Population Size (popu.size, numeric)
Max. Generations (max.gen, numeric)
Fuzzy Rules Using Genetic Cooperative-Competitive Learning and Pittsburgh (method = 'FH.GBML')
For classification using package frbs with tuning parameters:
Max. Number of Rules (max.num.rule, numeric)
Population Size (popu.size, numeric)
Max. Generations (max.gen, numeric)
Fuzzy Rules Using the Structural Learning Algorithm on Vague Environment (method = 'SLAVE')
For classification using package frbs with tuning parameters:
Number of Fuzzy Terms (num.labels, numeric)
Max. Iterations (max.iter, numeric)
Max. Generations (max.gen, numeric)
Fuzzy Rules via MOGUL (method = 'GFS.FR.MOGUL')
For regression using package frbs with tuning parameters:
Max. Generations (max.gen, numeric)
Max. Iterations (max.iter, numeric)
Max. Tuning Iterations (max.tune, numeric)
Fuzzy Rules via Thrift (method = 'GFS.THRIFT')
For regression using package frbs with tuning parameters:
Population Size (popu.size, numeric)
Number of Fuzzy Labels (num.labels, numeric)
Max. Generations (max.gen, numeric)
Fuzzy Rules with Weight Factor (method = 'FRBCS.W')
For classification using package frbs with tuning parameters:
Number of Fuzzy Terms (num.labels, numeric)
Membership Function (type.mf, character)
Gaussian Process (method = 'gaussprLinear')
For classification and regression using package kernlab with no tuning parameters
Gaussian Process with Polynomial Kernel (method = 'gaussprPoly')
For classification and regression using package kernlab with tuning parameters:
Polynomial Degree (degree, numeric)
Scale (scale, numeric)
Gaussian Process with Radial Basis Function Kernel (method = 'gaussprRadial')
For classification and regression using package kernlab with tuning parameters:
Sigma (sigma, numeric)
Generalized Additive Model using LOESS (method = 'gamLoess')
For classification and regression using package gam with tuning parameters:
Span (span, numeric)
Degree (degree, numeric)
Generalized Additive Model using Splines (method = 'bam')
For classification and regression using package mgcv with tuning parameters:
Feature Selection (select, logical)
Method (method, character)
Generalized Additive Model using Splines (method = 'gam')
For classification and regression using package mgcv with tuning parameters:
Feature Selection (select, logical)
Method (method, character)
Generalized Additive Model using Splines (method = 'gamSpline')
For classification and regression using package gam with tuning parameters:
Degrees of Freedom (df, numeric)
Generalized Linear Model (method = 'glm')
For classification and regression with no tuning parameters
Generalized Linear Model with Stepwise Feature Selection (method = 'glmStepAIC')
For classification and regression using package MASS with no tuning parameters
Generalized Partial Least Squares (method = 'gpls')
For classification using package gpls with tuning parameters:
Number of Components (K.prov, numeric)
Genetic Lateral Tuning and Rule Selection of Linguistic Fuzzy Systems (method = 'GFS.LT.RS')
For regression using package frbs with tuning parameters:
Population Size (popu.size, numeric)
Number of Fuzzy Labels (num.labels, numeric)
Max. Generations (max.gen, numeric)
Gradient Boosting Machines (method = 'gbm_h2o')
For classification and regression using package h2o with tuning parameters:
Number of Boosting Iterations (ntrees, numeric)
Max Tree Depth (max_depth, numeric)
Min. Terminal Node Size (min_rows, numeric)
Shrinkage (learn_rate, numeric)
Number of Randomly Selected Predictors (col_sample_rate, numeric)
glmnet (method = 'glmnet_h2o')
For classification and regression using package h2o with tuning parameters:
Mixing Percentage (alpha, numeric)
Regularization Parameter (lambda, numeric)
glmnet (method = 'glmnet')
For classification and regression using packages glmnet and Matrix with tuning parameters:
Mixing Percentage (alpha, numeric)
Regularization Parameter (lambda, numeric)
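An illustrative elastic-net fit for the glmnet entry above: alpha mixes the ridge (0) and lasso (1) penalties and lambda sets the regularization strength, matching the two parameters listed. The mtcars data and candidate values are assumptions.

    library(caret)

    # A two-column grid: one row per (alpha, lambda) combination.
    set.seed(1)
    fit_glmnet <- train(mpg ~ ., data = mtcars,
                        method = "glmnet",
                        tuneGrid = expand.grid(alpha = c(0, 0.5, 1),
                                               lambda = 10^seq(-3, 0, length.out = 10)))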
Greedy Prototype Selection (method = 'protoclass')
For classification using packages proxy and protoclass with tuning parameters:
Ball Size (eps, numeric)
Distance Order (Minkowski, numeric)
Heteroscedastic Discriminant Analysis (method = 'hda')
For classification using package hda with tuning parameters:
Gamma (gamma, numeric)
Lambda (lambda, numeric)
Dimension of the Discriminative Subspace (newdim, numeric)
High Dimensional Discriminant Analysis (method = 'hdda')
For classification using package HDclassif with tuning parameters:
Threshold (threshold, character)
Model Type (model, numeric)
High-Dimensional Regularized Discriminant Analysis (method = 'hdrda')
For classification using package sparsediscrim with tuning parameters:
Gamma (gamma, numeric)
Lambda (lambda, numeric)
Shrinkage Type (shrinkage_type, character)
Hybrid Neural Fuzzy Inference System (method = 'HYFIS')
For regression using package frbs with tuning parameters:
Number of Fuzzy Terms (num.labels, numeric)
Max. Iterations (max.iter, numeric)
Independent Component Regression (method = 'icr')
For regression using package fastICA with tuning parameters:
Number of Components (n.comp, numeric)
k-Nearest Neighbors (method = 'kknn')
For classification and regression using package kknn with tuning parameters:
Max. Number of Neighbors (kmax, numeric)
Distance (distance, numeric)
Kernel (kernel, character)
k-Nearest Neighbors (method = 'knn')
For classification and regression with tuning parameters:
Number of Neighbors (k, numeric)
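Instead of an explicit tuneGrid, tuneLength asks train() to build a default grid over the listed parameter (here k for the 'knn' entry above). The pre-processing and grid size below are illustrative assumptions.

    library(caret)

    # tuneLength = 10 lets train() pick ten candidate values of k;
    # preProcess centers and scales the predictors before fitting.
    set.seed(1)
    fit_knn <- train(Species ~ ., data = iris,
                     method = "knn",
                     tuneLength = 10,
                     preProcess = c("center", "scale"))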
Knn regression via sklearn.neighbors.KNeighborsRegressor (method = 'pythonKnnReg')
For regression using package rPython with tuning parameters:
Number of Neighbors (n_neighbors, numeric)
Weight Function (weights, character)
Algorithm (algorithm, character)
Leaf Size (leaf_size, numeric)
Distance Metric (metric, character)
p (p, numeric)
L2 Regularized Linear Support Vector Machines with Class Weights (method = 'svmLinearWeights2')
For classification using package LiblineaR with tuning parameters:
Cost (cost, numeric)
Loss Function (Loss, character)
Class Weight (weight, numeric)
L2 Regularized Support Vector Machine (dual) with Linear Kernel (method = 'svmLinear3')
For classification and regression using package LiblineaR with tuning parameters:
Cost (cost, numeric)
Loss Function (Loss, character)
Learning Vector Quantization (method = 'lvq')
For classification using package class with tuning parameters:
Codebook Size (size, numeric)
Number of Prototypes (k, numeric)
Least Angle Regression (method = 'lars')
For regression using package lars with tuning parameters:
Fraction (fraction, numeric)
Least Angle Regression (method = 'lars2')
For regression using package lars with tuning parameters:
Number of Steps (step, numeric)
Least Squares Support Vector Machine (method = 'lssvmLinear')
For classification using package kernlab with tuning parameters:
Regularization Parameter (tau, numeric)
Least Squares Support Vector Machine with Polynomial Kernel (method = 'lssvmPoly')
For classification using package kernlab with tuning parameters:
Polynomial Degree (degree, numeric)
Scale (scale, numeric)
Regularization Parameter (tau, numeric)
Least Squares Support Vector Machine with Radial Basis Function Kernel (method = 'lssvmRadial')
For classification using package kernlab with tuning parameters:
Sigma (sigma, numeric)
Regularization Parameter (tau, numeric)
Linear Discriminant Analysis (method = 'lda')
For classification using package MASS with no tuning parameters
Linear Discriminant Analysis (method = 'lda2')
For classification using package MASS with tuning parameters:
Number of Discriminant Functions (dimen, numeric)
Linear Discriminant Analysis with Stepwise Feature Selection (method = 'stepLDA')
For classification using packages klaR and MASS with tuning parameters:
Maximum Number of Variables (maxvar, numeric)
Search Direction (direction, character)
Linear Distance Weighted Discrimination (method = 'dwdLinear')
For classification using package kerndwd with tuning parameters:
Regularization Parameter (lambda, numeric)
q (qval, numeric)
Linear Regression (method = 'lm')
For regression with tuning parameters:
intercept (intercept, logical)
Linear Regression with Backwards Selection (method = 'leapBackward')
For regression using package leaps with tuning parameters:
Maximum Number of Predictors (nvmax, numeric)
Linear Regression with Forward Selection (method = 'leapForward')
For regression using package leaps with tuning parameters:
Maximum Number of Predictors (nvmax, numeric)
Linear Regression with Stepwise Selection (method = 'leapSeq')
For regression using package leaps with tuning parameters:
Maximum Number of Predictors (nvmax, numeric)
Linear Regression with Stepwise Selection (method = 'lmStepAIC')
For regression using package MASS with no tuning parameters
Linear Support Vector Machines with Class Weights (method = 'svmLinearWeights')
For classification using package e1071 with tuning parameters:
Cost (cost, numeric)
Class Weight (weight, numeric)
Localized Linear Discriminant Analysis (method = 'loclda')
For classification using package klaR with tuning parameters:
Number of Nearest Neighbors (k, numeric)
Logic Regression (method = 'logreg')
For classification and regression using package LogicReg with tuning parameters:
Maximum Number of Leaves (treesize, numeric)
Number of Trees (ntrees, numeric)
Logistic Model Trees (method = 'LMT')
For classification using package RWeka with tuning parameters:
Number of Iterations (iter, numeric)
Maximum Uncertainty Linear Discriminant Analysis (method = 'Mlda')
For classification using package HiDimDA with no tuning parameters
Mixture Discriminant Analysis (method = 'mda')
For classification using package mda with tuning parameters:
Number of Subclasses Per Class (subclasses, numeric)
Model Averaged Naive Bayes Classifier (method = 'manb')
For classification using package bnclassify with tuning parameters:
Smoothing Parameter (smooth, numeric)
Prior Probability (prior, numeric)
Model Averaged Neural Network (method = 'avNNet')
For classification and regression using package nnet with tuning parameters:
Number of Hidden Units (size, numeric)
Weight Decay (decay, numeric)
Bagging (bag, logical)
Model Rules (method = 'M5Rules')
For regression using package RWeka with tuning parameters:
Pruned (pruned, character)
Smoothed (smoothed, character)
Model Tree (method = 'M5')
For regression using package RWeka with tuning parameters:
Pruned (pruned, character)
Smoothed (smoothed, character)
Rules (rules, character)
Multi-Layer Perceptron (method = 'mlp')
For classification and regression using package RSNNS with tuning parameters:
Number of Hidden Units (size, numeric)
Multi-Layer Perceptron (method = 'mlpWeightDecay')
For classification and regression using package RSNNS with tuning parameters:
Number of Hidden Units (size, numeric)
Weight Decay (decay, numeric)
Multi-Layer Perceptron, multiple layers (method = 'mlpWeightDecayML')
For classification and regression using package RSNNS with tuning parameters:
Number of Hidden Units layer1 (layer1, numeric)
Number of Hidden Units layer2 (layer2, numeric)
Number of Hidden Units layer3 (layer3, numeric)
Weight Decay (decay, numeric)
Multi-Layer Perceptron, with multiple layers (method = 'mlpML')
For classification and regression using package RSNNS with tuning parameters:
Number of Hidden Units layer1 (layer1, numeric)
Number of Hidden Units layer2 (layer2, numeric)
Number of Hidden Units layer3 (layer3, numeric)
Multilayer Perceptron Network by Stochastic Gradient Descent (method = 'mlpSGD')
For classification and regression using packages FCNN4R and plyr with tuning parameters:
Number of Hidden Units (size, numeric)
L2 Regularization (l2reg, numeric)
RMSE Gradient Scaling (lambda, numeric)
Learning Rate (learn_rate, numeric)
Momentum (momentum, numeric)
Decay (gamma, numeric)
Batch Size (minibatchsz, numeric)
Number of Models (repeats, numeric)
Multivariate Adaptive Regression Spline (method = 'earth')
For classification and regression using package earth with tuning parameters:
Number of Terms (nprune, numeric)
Product Degree (degree, numeric)
Multivariate Adaptive Regression Splines (method = 'gcvEarth')
For classification and regression using package earth with tuning parameters:
Product Degree (degree, numeric)
Naive Bayes (method = 'nb')
For classification using package klaR with tuning parameters:
Laplace Correction (fL, numeric)
Distribution Type (usekernel, logical)
Bandwidth Adjustment (adjust, numeric)
Naive Bayes Classifier (method = 'nbDiscrete')
For classification using package bnclassify with tuning parameters:
Smoothing Parameter (smooth, numeric)
Naive Bayes Classifier with Attribute Weighting (method = 'awnb')
For classification using package bnclassify with tuning parameters:
Smoothing Parameter (smooth, numeric)
Nearest Shrunken Centroids (method = 'pam')
For classification using package pamr with tuning parameters:
Shrinkage Threshold (threshold, numeric)
Negative Binomial Generalized Linear Model (method = 'glm.nb')
For regression with tuning parameters:
Link Function (link, character)
Neural Network (method = 'neuralnet')
For regression using package neuralnet with tuning parameters:
Number of Hidden Units in Layer 1 (layer1, numeric)
Number of Hidden Units in Layer 2 (layer2, numeric)
Number of Hidden Units in Layer 3 (layer3, numeric)
Neural Network (method = 'nnet')
For classification and regression using package nnet with tuning parameters:
Number of Hidden Units (size, numeric)
Weight Decay (decay, numeric)
Neural Networks with Feature Extraction (method = 'pcaNNet')
For classification and regression using package nnet with tuning parameters:
Number of Hidden Units (size, numeric)
Weight Decay (decay, numeric)
Non-Convex Penalized Quantile Regression (method = 'rqnc')
For regression using package rqPen with tuning parameters:
L1 Penalty (lambda, numeric)
Penalty Type (penalty, character)
Non-Negative Least Squares (method = 'nnls')
For regression using package nnls with no tuning parameters
Oblique Random Forest (method = 'ORFlog')
For classification using package obliqueRF with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Oblique Random Forest (method = 'ORFpls')
For classification using package obliqueRF with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Oblique Random Forest (method = 'ORFridge')
For classification using package obliqueRF with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Oblique Random Forest (method = 'ORFsvm')
For classification using package obliqueRF with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Oblique Trees (method = 'oblique.tree')
For classification using package oblique.tree with tuning parameters:
Oblique Splits (oblique.splits, character)
Variable Selection Method (variable.selection, character)
Optimal Weighted Nearest Neighbor Classifier (method = 'ownn')
For classification using package snn with tuning parameters:
Number of Neighbors (K, numeric)
Ordered Logistic or Probit Regression (method = 'polr')
For classification using package MASS with tuning parameters:
parameter (method, character)
Parallel Random Forest (method = 'parRF')
For classification and regression using packages e1071, randomForest and foreach with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
partDSA (method = 'partDSA')
For classification and regression using package partDSA with tuning parameters:
Number of Terminal Partitions (cut.off.growth, numeric)
Minimum Percent Difference (MPD, numeric)
Partial Least Squares (method = 'kernelpls')
For classification and regression using package pls with tuning parameters:
Number of Components (ncomp, numeric)
Partial Least Squares (method = 'pls')
For classification and regression using package pls with tuning parameters:
Number of Components (ncomp, numeric)
Partial Least Squares (method = 'simpls')
For classification and regression using package pls with tuning parameters:
Number of Components (ncomp, numeric)
Partial Least Squares (method = 'widekernelpls')
For classification and regression using package pls with tuning parameters:
Number of Components (ncomp, numeric)
Partial Least Squares Generalized Linear Models (method = 'plsRglm')
For classification and regression using package plsRglm with tuning parameters:
Number of PLS Components (nt, numeric)
p-Value threshold (alpha.pvals.expli, numeric)
Penalized Discriminant Analysis (method = 'pda')
For classification using package mda with tuning parameters:
Shrinkage Penalty Coefficient (lambda, numeric)
Penalized Discriminant Analysis (method = 'pda2')
For classification using package mda with tuning parameters:
Degrees of Freedom (df, numeric)
Penalized Linear Discriminant Analysis (method = 'PenalizedLDA')
For classification using packages penalizedLDA and plyr with tuning parameters:
L1 Penalty (lambda, numeric)
Number of Discriminant Functions (K, numeric)
Penalized Linear Regression (method = 'penalized')
For regression using package penalized with tuning parameters:
L1 Penalty (lambda1, numeric)
L2 Penalty (lambda2, numeric)
Penalized Logistic Regression (method = 'plr')
For classification using package stepPlr with tuning parameters:
L2 Penalty (lambda, numeric)
Complexity Parameter (cp, character)
Penalized Multinomial Regression (method = 'multinom')
For classification using package nnet with tuning parameters:
Weight Decay (decay, numeric)
Penalized Ordinal Regression (method = 'ordinalNet')
For classification and regression using packages ordinalNet and plyr with tuning parameters:
Mixing Percentage (alpha, numeric)
Selection Criterion (criteria, character)
Link Function (link, character)
Polynomial Kernel Regularized Least Squares (method = 'krlsPoly')
For regression using package KRLS with tuning parameters:
Regularization Parameter (lambda, numeric)
Polynomial Degree (degree, numeric)
Principal Component Analysis (method = 'pcr')
For regression using package pls with tuning parameters:
Number of Components (ncomp, numeric)
Projection Pursuit Regression (method = 'ppr')
For regression with tuning parameters:
Number of Terms (nterms, numeric)
Quadratic Discriminant Analysis (method = 'qda')
For classification using package MASS with no tuning parameters
Quadratic Discriminant Analysis with Stepwise Feature Selection (method = 'stepQDA')
For classification using packages klaR and MASS with tuning parameters:
Maximum Number of Variables (maxvar, numeric)
Search Direction (direction, character)
Quantile Random Forest (method = 'qrf')
For regression using package quantregForest with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Quantile Regression Neural Network (method = 'qrnn')
For regression using package qrnn with tuning parameters:
Number of Hidden Units (n.hidden, numeric)
Weight Decay (penalty, numeric)
Bagged Models? (bag, logical)
Quantile Regression with LASSO penalty (method = 'rqlasso')
For regression using package rqPen with tuning parameters:
L1 Penalty (lambda, numeric)
Radial Basis Function Kernel Regularized Least Squares (method = 'krlsRadial')
For regression using packages KRLS and kernlab with tuning parameters:
Regularization Parameter (lambda, numeric)
Sigma (sigma, numeric)
Radial Basis Function Network (method = 'rbf')
For classification and regression using package RSNNS with tuning parameters:
Number of Hidden Units (size, numeric)
Radial Basis Function Network (method = 'rbfDDA')
For classification and regression using package RSNNS with tuning parameters:
Activation Limit for Conflicting Classes (negativeThreshold, numeric)
Random Ferns (method = 'rFerns')
For classification using package rFerns with tuning parameters:
Fern Depth (depth, numeric)
Random Forest (method = 'ranger')
For classification and regression using packages e1071 and ranger with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Random Forest (method = 'Rborist')
For classification and regression using package Rborist with tuning parameters:
Number of Randomly Selected Predictors (predFixed, numeric)
Random Forest (method = 'rf')
For classification and regression using package randomForest with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
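An illustrative fit for the 'rf' entry above; mtry is its only listed tuning parameter. Out-of-bag resampling is used instead of cross-validation here, and the data set and grid are assumptions for the sketch.

    library(caret)

    # trainControl(method = "oob") uses the forest's out-of-bag error,
    # which avoids refitting for each resample.
    set.seed(1)
    fit_rf <- train(Species ~ ., data = iris,
                    method = "rf",
                    tuneGrid = data.frame(mtry = c(1, 2, 3, 4)),
                    trControl = trainControl(method = "oob"))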
Random Forest by Randomization (method = 'extraTrees')
For classification and regression using package extraTrees with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Number of Random Cuts (numRandomCuts, numeric)
Random Forest Rule-Based Model (method = 'rfRules')
For classification and regression using packages randomForest, inTrees and plyr with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Maximum Rule Depth (maxdepth, numeric)
Random Forest with Additional Feature Selection (method = 'Boruta')
For classification and regression using packages Boruta and randomForest with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Regularized Discriminant Analysis (method = 'rda')
For classification using package klaR with tuning parameters:
Gamma (gamma, numeric)
Lambda (lambda, numeric)
Regularized Linear Discriminant Analysis (method = 'rlda')
For classification using package sparsediscrim with tuning parameters:
Regularization Method (estimator, character)
Regularized Random Forest (method = 'RRF')
For classification and regression using packages randomForest and RRF with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Regularization Value (coefReg, numeric)
Importance Coefficient (coefImp, numeric)
Regularized Random Forest (method = 'RRFglobal')
For classification and regression using package RRF with tuning parameters:
Number of Randomly Selected Predictors (mtry, numeric)
Regularization Value (coefReg, numeric)
Relaxed Lasso (method = 'relaxo')
For regression using packages relaxo and plyr with tuning parameters:
Penalty Parameter (lambda, numeric)
Relaxation Parameter (phi, numeric)
Relevance Vector Machines with Linear Kernel (method = 'rvmLinear')
For regression using package kernlab with no tuning parameters
Relevance Vector Machines with Polynomial Kernel (method = 'rvmPoly')
For regression using package kernlab with tuning parameters:
Scale (scale, numeric)
Polynomial Degree (degree, numeric)
Relevance Vector Machines with Radial Basis Function Kernel (method = 'rvmRadial')
For regression using package kernlab with tuning parameters:
Sigma (sigma, numeric)
Ridge Regression (method = 'ridge')
For regression using package elasticnet with tuning parameters:
Weight Decay (lambda, numeric)
Ridge Regression with Variable Selection (method = 'foba')
For regression using package foba with tuning parameters:
Number of Variables Retained (k, numeric)
L2 Penalty (lambda, numeric)
Robust Linear Discriminant Analysis (method = 'Linda')
For classification using package rrcov with no tuning parameters
Robust Linear Model (method = 'rlm')
For regression using package MASS with tuning parameters:
intercept (intercept, logical)
psi (psi, character)
Robust Mixture Discriminant Analysis (method = 'rmda')
For classification using package robustDA with tuning parameters:
Number of Subclasses Per Class (K, numeric)
Model (model, character)
Robust Quadratic Discriminant Analysis (method = 'QdaCov')
For classification using package rrcov with no tuning parameters
Robust Regularized Linear Discriminant Analysis (method = 'rrlda')
For classification using package rrlda with tuning parameters:
Penalty Parameter (lambda, numeric)
Robustness Parameter (hp, numeric)
Penalty Type (penalty, character)
Robust SIMCA (method = 'RSimca')
For classification using package rrcovHD with no tuning parameters
ROC-Based Classifier (method = 'rocc')
For classification using package rocc with tuning parameters:
Number of Variables Retained (xgenes, numeric)
Rotation Forest (method = 'rotationForest')
For classification using package rotationForest with tuning parameters:
Number of Variable Subsets (K, numeric)
Ensemble Size (L, numeric)
Rotation Forest (method = 'rotationForestCp')
For classification using packages rpart, plyr and rotationForest with tuning parameters:
Number of Variable Subsets (K, numeric)
Ensemble Size (L, numeric)
Complexity Parameter (cp, numeric)
Rule-Based Classifier (method = 'JRip')
For classification using package RWeka with tuning parameters:
Number of Optimizations (NumOpt, numeric)
Number of Folds (NumFolds, numeric)
Min Weights (MinWeights, numeric)
Rule-Based Classifier (method = 'PART')
For classification using package RWeka with tuning parameters:
Confidence Threshold (threshold, numeric)
Pruning (pruned, character)
Self-Organizing Map (method = 'bdk')
For classification and regression using package kohonen with tuning parameters:
Rows (xdim, numeric)
Columns (ydim, numeric)
X Weight (xweight, numeric)
Topology (topo, character)
Self-Organizing Maps (method = 'xyf')
For classification and regression using package kohonen with tuning parameters:
Rows (xdim, numeric)
Columns (ydim, numeric)
X Weight (xweight, numeric)
Topology (topo, character)
Semi-Naive Structure Learner Wrapper (method = 'nbSearch')
For classification using package bnclassify with tuning parameters:
Number of Folds (k, numeric)
Minimum Absolute Improvement (epsilon, numeric)
Smoothing Parameter (smooth, numeric)
Final Smoothing Parameter (final_smooth, numeric)
Search Direction (direction, character)
Shrinkage Discriminant Analysis (method = 'sda')
For classification using package sda with tuning parameters:
Diagonalize (diagonal, logical)
Shrinkage (lambda, numeric)
SIMCA (method = 'CSimca')
For classification using package rrcovHD with no tuning parameters
Simplified TSK Fuzzy Rules (method = 'FS.HGD')
For regression using package frbs with tuning parameters:
Number of Fuzzy Terms (num.labels, numeric)
Max. Iterations (max.iter, numeric)
Single C5.0 Ruleset (method = 'C5.0Rules')
For classification using package C50 with no tuning parameters
Single C5.0 Tree (method = 'C5.0Tree')
For classification using package C50 with no tuning parameters
Single Rule Classification (method = 'OneR')
For classification using package RWeka with no tuning parameters
Sparse Distance Weighted Discrimination (method = 'sdwd')
For classification using package sdwd with tuning parameters:
L1 Penalty (lambda, numeric)
L2 Penalty (lambda2, numeric)
Sparse Linear Discriminant Analysis (method = 'sparseLDA')
For classification using package sparseLDA with tuning parameters:
Number of Predictors (NumVars, numeric)
Lambda (lambda, numeric)
"Using your own model in train": https://topepo.github.io/caret/using-your-own-model-in-train.html