bst (version 0.3-24)

mada: Multi-class AdaBoost

Description

One-vs-all multi-class AdaBoost

Usage

mada(xtr, ytr, xte=NULL, yte=NULL, mstop=50, nu=0.1, interaction.depth=1)

Value

A list containing the selected variables xselect, and the training and test errors err.tr and err.te.

Arguments

xtr

training data matrix containing the predictor variables in the model.

ytr

training vector of responses. ytr must take integer values from 1 to C for a C-class problem.

xte

test data matrix containing the predictor variables in the model.

yte

test vector of responses. yte must take integer values from 1 to C for a C-class problem.

mstop

number of boosting iterations.

nu

a small number (between 0 and 1) defining the step size or shrinkage parameter.

interaction.depth

used in gbm to specify the depth of trees.

Author

Zhu Wang

Details

For a C-class problem (C > 2), each class is separately compared against all other classes with AdaBoost, and C functions are estimated to represent confidence for each class. The classification rule is to assign the class with the largest estimate.
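A minimal illustration of this one-vs-all decision rule follows. It is not the internal implementation of mada (which fits the binary problems with AdaBoost via gbm); each per-class confidence is a stand-in logistic-regression score, and the predicted class is the one with the largest score.

# Sketch of the one-vs-all rule described above; the binary learner here is
# plain logistic regression, used only as a stand-in for AdaBoost.
data(iris)
x <- as.matrix(iris[, -5])
y <- as.integer(iris[, 5])                 # classes coded 1..C
nclass <- max(y)

# One confidence function per class: a score for "class k vs the rest".
scores <- sapply(seq_len(nclass), function(k) {
  fit <- glm(I(y == k) ~ x, family = binomial())
  predict(fit, type = "link")              # confidence for class k
})

# Classification rule: assign the class with the largest estimated confidence.
yhat <- max.col(scores)
mean(yhat == y)                            # resubstitution accuracy of the sketch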

See Also

cv.mada for cross-validated stopping iteration.

Examples

library(bst)
data(iris)
# ytr is coded as integers 1..C, as required by the Arguments section
mada(xtr = iris[, -5], ytr = as.integer(iris[, 5]))
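A slightly fuller usage sketch, assuming only the arguments shown under Usage and the components listed under Value. Whether err.tr and err.te are per-iteration vectors or single values is not stated here, so tail(..., 1) is used to work in either case.

library(bst)
data(iris)
set.seed(123)
idx <- sample(nrow(iris), 100)
xtr <- iris[idx, -5];  ytr <- as.integer(iris[idx, 5])   # labels 1..C
xte <- iris[-idx, -5]; yte <- as.integer(iris[-idx, 5])

fit <- mada(xtr = xtr, ytr = ytr, xte = xte, yte = yte,
            mstop = 100, nu = 0.1, interaction.depth = 1)
fit$xselect                 # selected predictor variables
tail(fit$err.tr, 1)         # training error at the final iteration
tail(fit$err.te, 1)         # test error at the final iteration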