An implementation of the random forest and bagging ensemble algorithms utilizing conditional inference trees as base learners.
Usage

CForestModel(
  teststat = c("quad", "max"),
  testtype = c("Univariate", "Teststatistic", "Bonferroni", "MonteCarlo"),
  mincriterion = 0,
  ntree = 500,
  mtry = 5,
  replace = TRUE,
  fraction = 0.632
)
Arguments

teststat: character specifying the type of the test statistic to be applied.

testtype: character specifying how to compute the distribution of the test statistic.

mincriterion: value of the test statistic that must be exceeded in order to implement a split.

ntree: number of trees to grow in a forest.

mtry: number of input variables randomly sampled as candidates at each node for random forest-like algorithms.

replace: logical indicating whether sampling of observations is done with or without replacement.

fraction: fraction of the number of observations to draw without replacement (only relevant if replace = FALSE).
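The arguments above can be illustrated with a short sketch (assuming the MachineShop package is loaded; the non-default values here are arbitrary illustrations, not recommendations):

```r
library(MachineShop)

# A conditional random forest specification with non-default settings:
# more trees, fewer candidate variables per node, and subsampling
# without replacement, the setting under which `fraction` applies.
model <- CForestModel(
  teststat = "quad",
  ntree = 1000,
  mtry = 3,
  replace = FALSE,
  fraction = 0.632
)
```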
Value

MLModel class object.

Details

Response types: factor, numeric, Surv

Automatic tuning of grid parameter: mtry
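Because `mtry` has an automatic tuning grid, it can be tuned over candidate values rather than fixed. A minimal sketch, assuming MachineShop's `TunedModel` and `CVControl` interfaces are available:

```r
library(MachineShop)

# Tune mtry over its automatically generated grid, selecting the
# value with the best cross-validated performance before fitting.
tuned_fit <- fit(
  sale_amount ~ .,
  data = ICHomes,
  model = TunedModel(CForestModel, control = CVControl())
)
```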
Supplied arguments are passed to party::cforest_control. Further model details can be found in the source link below.
See Also

cforest, fit, resample
Examples

# NOT RUN {
fit(sale_amount ~ ., data = ICHomes, model = CForestModel)
# }
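Beyond a single fit, predictive performance can be estimated with `resample`, listed in the see-also references. A minimal sketch, assuming cross-validation via MachineShop's `CVControl`:

```r
library(MachineShop)

# Cross-validated estimates of predictive performance for the model
res <- resample(sale_amount ~ ., data = ICHomes,
                model = CForestModel, control = CVControl())
summary(res)
```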