CVhybridEnsemble
CVhybridEnsemble cross-validates (five times twofold) hybridEnsemble and computes performance statistics that can be plotted (plot.CVhybridEnsemble) and summarized (summary.CVhybridEnsemble).
CVhybridEnsemble(
x = NULL,
y = NULL,
algorithms = c("LR", "RF", "AB", "KF", "NN", "SV", "RoF", "KN", "NB"),
combine = NULL,
eval.measure = "auc",
diversity = FALSE,
parallel = FALSE,
verbose = FALSE,
oversample = TRUE,
calibrate = FALSE,
filter = 0.03,
LR.size = 10,
RF.ntree = 500,
AB.iter = 500,
AB.maxdepth = 3,
KF.cp = 1,
KF.rp = round(log(nrow(x), 10)),
KF.ntree = 500,
NN.rang = 0.1,
NN.maxit = 10000,
NN.size = c(5, 10, 20),
NN.decay = c(0, 0.001, 0.01, 0.1),
NN.skip = c(TRUE, FALSE),
NN.ens.size = 10,
SV.gamma = 2^(-15:3),
SV.cost = 2^(-5:13),
SV.degree = c(2, 3),
SV.kernel = c("radial", "sigmoid", "linear", "polynomial"),
SV.size = 10,
RoF.L = 10,
KN.K = c(1:150),
KN.size = 10,
NB.size = 10,
rbga.popSize = length(algorithms) * 14,
rbga.iters = 500,
rbga.mutationChance = 1/rbga.popSize,
rbga.elitism = max(1, round(rbga.popSize * 0.05)),
DEopt.nP = 20,
DEopt.nG = 500,
DEopt.F = 0.9314,
DEopt.CR = 0.6938,
GenSA.maxit = 500,
GenSA.temperature = 0.5,
GenSA.visiting.param = 2.7,
GenSA.acceptance.param = -5,
GenSA.max.call = 1e+07,
malschains.popsize = 60,
malschains.ls = "cmaes",
malschains.istep = 300,
malschains.effort = 0.5,
malschains.alpha = 0.5,
malschains.threshold = 1e-08,
malschains.maxEvals = 500,
psoptim.maxit = 500,
psoptim.maxf = Inf,
psoptim.abstol = -Inf,
psoptim.reltol = 0,
psoptim.s = 40,
psoptim.k = 3,
psoptim.p = 1 - (1 - 1/psoptim.s)^psoptim.k,
psoptim.w = 1/(2 * log(2)),
psoptim.c.p = 0.5 + log(2),
psoptim.c.g = 0.5 + log(2),
soma.pathLength = 3,
soma.stepLength = 0.11,
soma.perturbationChance = 0.1,
soma.minAbsoluteSep = 0,
soma.minRelativeSep = 0.001,
soma.nMigrations = 500,
soma.populationSize = 10,
tabu.iters = 500,
tabu.listSize = c(5:12)
)
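Several defaults in the call above are data-dependent expressions rather than fixed values. The following sketch, which assumes a hypothetical data set of 1,000 rows and all nine algorithms, shows how these expressions evaluate:
# Illustration only: hypothetical inputs to show how the default expressions evaluate
n_rows <- 1000                                  # stands in for nrow(x)
algorithms <- c("LR","RF","AB","KF","NN","SV","RoF","KN","NB")
round(log(n_rows, 10))                          # KF.rp default: 3 row partitions
length(algorithms) * 14                         # rbga.popSize default: 126
1 / (length(algorithms) * 14)                   # rbga.mutationChance default: about 0.008
max(1, round(length(algorithms) * 14 * 0.05))   # rbga.elitism default: 6
1 - (1 - 1/40)^3                                # psoptim.p default given s = 40, k = 3: about 0.073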
A list of class CVhybridEnsemble containing the following elements:
For the simple mean combination method: a list containing the median and interquartile range of the performance evaluations, the performance evaluations on each fold, and the predictions and response vectors for each fold.
For the authority combination method: a list containing the median and interquartile range of the performance evaluations, the performance evaluations on each fold, and the predictions and response vectors for each fold.
For the single best: a list containing the median and interquartile range of the performance evaluations, the performance evaluations on each fold, and the predictions and response vectors for each fold.
...and likewise for all additional combination methods that are requested.
The performance measure that was used
Data frame containing the diversity (1 minus the absolute value of the mean of the pairwise correlations), and the mean AUC and accuracy (threshold = 0.5) of the hybrid ensemble and the sub-ensembles.
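To make the diversity definition above concrete, the following sketch computes the same statistic for a hypothetical matrix of sub-ensemble prediction vectors (the data are simulated for illustration only):
# Hypothetical prediction vectors, one column per sub-ensemble
set.seed(1)
preds <- cbind(LR = runif(100), RF = runif(100), NN = runif(100))
# Diversity as defined above: 1 minus the absolute value of the mean pairwise correlation
pairwise <- cor(preds)
1 - abs(mean(pairwise[lower.tri(pairwise)]))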
x: A data frame of predictors. Categorical variables need to be transformed to binary (dummy) factors.
y: A factor of observed class labels (responses) with the only allowed values {0,1}.
algorithms: Which algorithms to use {"LR","RF","AB","KF","NN","SV","RoF","KN","NB"}. LR = Bagged Logistic Regression, RF = Random Forest, AB = AdaBoost, KF = Kernel Factory, NN = Bagged Neural Network, SV = Bagged Support Vector Machines, RoF = Rotation Forest, KN = Bagged K-Nearest Neighbors, NB = Bagged Naive Bayes.
combine: Additional methods for combining the sub-ensembles. The simple mean, authority-based weighting and the single best are automatically provided since they are very efficient. Possible additional methods: Genetic Algorithm: "rbga", Differential Evolutionary Algorithm: "DEopt", Generalized Simulated Annealing: "GenSA", Memetic Algorithm with Local Search Chains: "malschains", Particle Swarm Optimization: "psoptim", Self-Organising Migrating Algorithm: "soma", Tabu Search Algorithm: "tabu", Non-negative binomial likelihood: "NNloglik", Goldfarb-Idnani Non-negative least squares: "GINNLS", Lawson-Hanson Non-negative least squares: "LHNNLS".
eval.measure: Evaluation measure for the following combination methods: authority-based method, single best, "rbga", "DEopt", "GenSA", "malschains", "psoptim", "soma", "tabu". Default is the area under the receiver operating characteristic curve ('auc'). The area under the sensitivity curve ('sens') and the area under the specificity curve ('spec') are also supported.
diversity: TRUE or FALSE. If TRUE, predict.all=TRUE is set in hybridEnsemble and diversity is computed at the sub-ensemble and hybrid (i.e., meta-ensemble) level. Diversity is defined as 1 minus the absolute value of the mean of the pairwise correlations. The AUC is also provided; for the AUC of the meta-ensemble the simple mean is used.
parallel: TRUE or FALSE. Should the cross-validation be executed in parallel? All available cores will be used.
verbose: TRUE or FALSE. Should information be printed to the screen while estimating the Hybrid Ensemble?
oversample: TRUE or FALSE. Should oversampling be used? Setting oversample to TRUE helps avoid computational problems related to the subsetting process.
calibrate: TRUE or FALSE. If FALSE, percentile ranks of the prediction vectors will be used.
filter: Either NULL (deactivate) or a percentage denoting the minimum class size of dummy predictors. This parameter is used to remove near constants. For example, if nrow(xTRAIN)=100 and filter=0.01, then all dummy predictors with any class size equal to 1 will be removed. Set this higher (e.g., 0.05 or 0.10) in case of errors.
LR.size: Logistic Regression parameter. Ensemble size of the bagged logistic regression sub-ensemble.
RF.ntree: Random Forest parameter. Number of trees to grow.
AB.iter: Stochastic AdaBoost parameter. Number of boosting iterations to perform.
AB.maxdepth: Stochastic AdaBoost parameter. The maximum depth of any node of the final tree, with the root node counted as depth 0.
KF.cp: Kernel Factory parameter. The number of column partitions.
KF.rp: Kernel Factory parameter. The number of row partitions.
KF.ntree: Kernel Factory parameter. Number of trees to grow.
NN.rang: Neural Network parameter. Initial random weights on [-rang, rang].
NN.maxit: Neural Network parameter. Maximum number of iterations.
NN.size: Neural Network parameter. Number of units in the single hidden layer. Can be multiple values that need to be optimized.
NN.decay: Neural Network parameter. Weight decay. Can be multiple values that need to be optimized.
NN.skip: Neural Network parameter. Switch to add skip-layer connections from input to output. Can be a boolean vector (TRUE and FALSE) for optimization.
NN.ens.size: Neural Network parameter. Ensemble size of the neural network sub-ensemble.
SV.gamma: Support Vector Machines parameter. Width of the Gaussian for the radial basis and sigmoid kernels. Can be multiple values that need to be optimized.
SV.cost: Support Vector Machines parameter. Penalty (soft margin constant). Can be multiple values that need to be optimized.
SV.degree: Support Vector Machines parameter. Degree of the polynomial kernel. Can be multiple values that need to be optimized.
SV.kernel: Support Vector Machines parameter. Kernels to try. Can be one or more of 'radial', 'sigmoid', 'linear', 'polynomial'; multiple values will be optimized.
SV.size: Support Vector Machines parameter. Ensemble size of the SVM sub-ensemble.
RoF.L: Rotation Forest parameter. Number of trees to grow.
KN.K: K-Nearest Neighbors parameter. Number of nearest neighbors to try, for example c(10,20,30). The optimal K will be selected. If larger than nrow(xTRAIN), the maximum K will be reset to 50% of nrow(xTRAIN). Can be multiple values that need to be optimized.
KN.size: K-Nearest Neighbors parameter. Ensemble size of the K-nearest neighbors sub-ensemble.
NB.size: Naive Bayes parameter. Ensemble size of the bagged naive Bayes sub-ensemble.
rbga.popSize: Genetic Algorithm parameter. Population size. Default is 14 times the number of variables.
rbga.iters: Genetic Algorithm parameter. Number of iterations.
rbga.mutationChance: Genetic Algorithm parameter. The chance that a gene in the chromosome mutates.
rbga.elitism: Genetic Algorithm parameter. Number of chromosomes that are kept in the next generation.
DEopt.nP: Differential Evolutionary Algorithm parameter. Population size.
DEopt.nG: Differential Evolutionary Algorithm parameter. Number of generations.
DEopt.F: Differential Evolutionary Algorithm parameter. Step size.
DEopt.CR: Differential Evolutionary Algorithm parameter. Probability of crossover.
GenSA.maxit: Generalized Simulated Annealing parameter. Maximum number of iterations.
GenSA.temperature: Generalized Simulated Annealing parameter. Initial value for temperature.
GenSA.visiting.param: Generalized Simulated Annealing parameter. Parameter for the visiting distribution.
GenSA.acceptance.param: Generalized Simulated Annealing parameter. Parameter for the acceptance distribution.
GenSA.max.call: Generalized Simulated Annealing parameter. Maximum number of calls of the objective function.
malschains.popsize: Memetic Algorithm with Local Search Chains parameter. Population size.
malschains.ls: Memetic Algorithm with Local Search Chains parameter. Local search method.
malschains.istep: Memetic Algorithm with Local Search Chains parameter. Number of iterations of the local search.
malschains.effort: Memetic Algorithm with Local Search Chains parameter. Value between 0 and 1. The ratio between the number of evaluations for the local search and for the evolutionary algorithm. A higher effort means more evaluations for the evolutionary algorithm.
malschains.alpha: Memetic Algorithm with Local Search Chains parameter. Crossover BLX-alpha. Lower values (<0.3) reduce diversity and higher values increase diversity.
malschains.threshold: Memetic Algorithm with Local Search Chains parameter. Threshold that defines how much improvement in the local search is considered to be no improvement.
malschains.maxEvals: Memetic Algorithm with Local Search Chains parameter. Maximum number of evaluations.
psoptim.maxit: Particle Swarm Optimization parameter. Maximum number of iterations.
psoptim.maxf: Particle Swarm Optimization parameter. Maximum number of function evaluations.
psoptim.abstol: Particle Swarm Optimization parameter. Absolute convergence tolerance.
psoptim.reltol: Particle Swarm Optimization parameter. Tolerance for restarting.
psoptim.s: Particle Swarm Optimization parameter. Swarm size.
psoptim.k: Particle Swarm Optimization parameter. Exponent for calculating the number of informants.
psoptim.p: Particle Swarm Optimization parameter. Average percentage of informants for each particle.
psoptim.w: Particle Swarm Optimization parameter. Exploitation constant.
psoptim.c.p: Particle Swarm Optimization parameter. Local exploration constant.
psoptim.c.g: Particle Swarm Optimization parameter. Global exploration constant.
soma.pathLength: Self-Organising Migrating Algorithm parameter. Distance (towards the leader) that individuals may migrate.
soma.stepLength: Self-Organising Migrating Algorithm parameter. Granularity at which potential steps are evaluated.
soma.perturbationChance: Self-Organising Migrating Algorithm parameter. Probability that individual parameters are changed on any given step.
soma.minAbsoluteSep: Self-Organising Migrating Algorithm parameter. Smallest absolute difference between the maximum and minimum cost function values. Below this minimum the algorithm will terminate.
soma.minRelativeSep: Self-Organising Migrating Algorithm parameter. Smallest relative difference between the maximum and minimum cost function values. Below this minimum the algorithm will terminate.
soma.nMigrations: Self-Organising Migrating Algorithm parameter. Maximum number of migrations to complete.
soma.populationSize: Self-Organising Migrating Algorithm parameter. Population size.
tabu.iters: Tabu Search Algorithm parameter. Number of iterations in the preliminary search of the algorithm.
tabu.listSize: Tabu Search Algorithm parameter. Tabu list size.
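As an illustration of how the arguments above fit together, the following sketch requests two additional combination methods and diversity statistics. The objects x and y are assumed to exist, and the settings are chosen for illustration only, not as recommendations:
CVhE <- CVhybridEnsemble(x = x, y = y,                  # x and y are hypothetical data objects
                         algorithms = c("LR", "RF", "NN"),  # restrict to three sub-ensembles
                         combine = c("rbga", "GINNLS"),     # on top of mean, authority, single best
                         eval.measure = "auc",              # optimize the area under the ROC curve
                         diversity = TRUE,                  # also report diversity statistics
                         parallel = TRUE,                   # use all available cores
                         filter = 0.05)                     # stricter removal of near-constant dummies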
Michel Ballings, Dauwe Vercamer, Matthias Bogaert, and Dirk Van den Poel, Maintainer: Michel.Ballings@GMail.com
Ballings, M., Vercamer, D., Bogaert, M., Van den Poel, D.
hybridEnsemble, predict.hybridEnsemble, importance.hybridEnsemble, plot.CVhybridEnsemble, summary.CVhybridEnsemble
data(Credit)
if (FALSE) {  # not run automatically
x <- Credit[1:200, names(Credit) != 'Response']  # predictors: first 200 rows, drop the response
x <- x[, sapply(x, is.numeric)]                  # keep only numeric predictors
CVhE <- CVhybridEnsemble(x = x,
                         y = Credit$Response[1:200],
                         verbose = TRUE,
                         KF.rp = 1,
                         RF.ntree = 50,
                         AB.iter = 50,
                         NN.size = 5,
                         NN.decay = 0,
                         SV.gamma = 2^-15,
                         SV.cost = 2^-5,
                         SV.degree = 2,
                         SV.kernel = 'radial')
}
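Once cross-validation has finished, the result can be passed to the companion methods mentioned above. This is only a sketch that assumes the CVhE object from the example; plot.CVhybridEnsemble may take additional arguments documented on its own page.
if (FALSE) {
summary(CVhE)   # summary.CVhybridEnsemble: tabulates the cross-validated performance statistics
plot(CVhE)      # plot.CVhybridEnsemble: plots the cross-validated performance
}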