Greedy wrapper algorithms for learning Bayesian network classifiers. All algorithms use a cross-validated estimate of predictive accuracy to evaluate candidate structures.
fssj(class, dataset, k, epsilon = 0.01, smooth = 0, cache_reset = NULL)
bsej(class, dataset, k, epsilon = 0.01, smooth = 0, cache_reset = NULL)
tan_hc(class, dataset, k, epsilon = 0.01, smooth = 0, cache_reset = NULL)
kdb(
class,
dataset,
k,
kdbk = 2,
epsilon = 0.01,
smooth = 0,
cache_reset = NULL
)
tan_hcsp(class, dataset, k, epsilon = 0.01, smooth = 0, cache_reset = NULL)
A bnc_dag object.
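All five learners share the same calling convention and each returns such a structure. A minimal usage sketch, assuming these functions come from the bnclassify package and that its bundled car data set is available:

library(bnclassify)
data(car)
# Each learner runs a greedy structure search, scoring candidate
# structures by k-fold cross-validated accuracy on the supplied data.
fssj_dag <- fssj('class', car, k = 5)
bsej_dag <- bsej('class', car, k = 5)
tanhc_dag <- tan_hc('class', car, k = 5)
# Printing a learned structure summarises the network.
fssj_dag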
class: A character. Name of the class variable.
dataset: The data frame from which to learn the classifier.
k: An integer. The number of cross-validation folds.
epsilon: A numeric. Minimum absolute improvement in accuracy required to keep searching.
smooth: A numeric. The smoothing value (\(\alpha\)) for Bayesian parameter estimation. Nonnegative.
cache_reset: A numeric. Number of iterations after which to reset the cache of conditional probability tables. A small number reduces the amount of memory used. NULL means the cache is never reset (the default).
kdbk: An integer. The maximum number of feature parents per feature (kdb only; see the sketch after this list).
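The kdbk argument applies only to kdb(). A small sketch of how it bounds the structure search, assuming the car data set is loaded as above and that bnclassify's families() lists each node together with its parents:

# Allow each feature at most two feature parents besides the class.
kdb2 <- kdb('class', car, k = 5, kdbk = 2)
# families() (assumed available) shows each node with its parents;
# no feature should have more than kdbk feature parents.
families(kdb2)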
Pazzani M (1996). Constructive induction of Cartesian product attributes. In Proceedings of the Information, Statistics and Induction in Science Conference (ISIS-1996), pp. 66-77.
Keogh E and Pazzani M (2002). Learning the structure of augmented Bayesian classifiers. International Journal on Artificial Intelligence Tools, 11(4), pp. 587-601.
library(bnclassify)
data(car)
tanhc <- tan_hc('class', car, k = 5, epsilon = 0)
if (FALSE) plot(tanhc)
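The structure returned above carries no parameters. A hedged follow-up sketch, assuming bnclassify's lp() fits the conditional probability tables and predict() returns class labels, to estimate resubstitution accuracy:

# Fit parameters with Bayesian smoothing, then predict on the training data.
tanhc_fit <- lp(tanhc, car, smooth = 1)
p <- predict(tanhc_fit, car)
mean(p == car$class)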