fastAdaboost (version 1.0.0)

fastAdaboost: fast adaboost implementation for R

Description

fastAdaboost provides a blazingly fast implementation of both the discrete and real adaboost algorithms, built on a C++ backend. The package is aimed at fast training and prediction on large in-memory data sets.

References

Freund, Y. and Schapire, R. E. (1996): “Experiments with a new boosting algorithm”. In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148–156. Morgan Kaufmann.

Zhu, J. et al. (2006): “Multi-class adaboost”. Ann Arbor 1001.48109: 1612.

Examples

library(fastAdaboost)
# simulate a two-class data set: class 0 ~ N(0, 1), class 1 ~ N(1, 1)
fakedata <- data.frame(X = c(rnorm(100, 0, 1), rnorm(100, 1, 1)),
                       Y = c(rep(0, 100), rep(1, 100)))
fakedata$Y <- factor(fakedata$Y)
# fit a discrete adaboost classifier with 10 boosting iterations
test_adaboost <- adaboost(Y ~ X, data = fakedata, nIter = 10)
# predict on the training data and print the misclassification error
pred <- predict(test_adaboost, newdata = fakedata)
print(pred$error)
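
The Description also mentions the real adaboost variant. A minimal sketch on the same simulated data, assuming the package exports real_adaboost() with the same formula, data, and nIter interface as adaboost():

# real adaboost on the same simulated data (sketch; assumes real_adaboost()
# mirrors the adaboost() interface)
test_real_adaboost <- real_adaboost(Y ~ X, data = fakedata, nIter = 10)
pred_real <- predict(test_real_adaboost, newdata = fakedata)
print(pred_real$error)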
