mnj (version 1.0)

flex: FlexBoost

Description

A Flexible Boosting Algorithm With Adaptive Loss Functions

Usage

flex(X, y, n_rounds, interval, width, type,
  control = rpart.control(cp = -1, maxdepth = 1))

Arguments

X

Predictor variables of the training data

y

Class labels of the training data

n_rounds

Number of boosting rounds, i.e. the number of trees (weak classifiers) to build

interval

Parameter used to adjust the exponential loss function

width

Width of the search range (must be greater than 1)

type

Tie-evaluation option (1 or 2; 2 is recommended)

control

rpart control options; fixed to cp = -1, maxdepth = 1 (decision stumps), following AdaBoost (see the sketch below)
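
The call below is only an illustrative sketch of how the arguments above fit together; it reuses the dataset from the Examples section, and the specific parameter values are assumptions rather than defaults.

library(rpart)                                      # provides rpart.control()
data  <- read.csv(url("http://bit.ly/flex_iris"), TRUE)
model <- flex(X = data[, 1:2],                      # predictor variables
              y = data[, 6],                        # class labels
              n_rounds = 10,                        # number of boosting rounds (trees)
              interval = 0.1,                       # adjustment of the exponential loss
              width    = 3,                         # search range width (greater than 1)
              type     = 2,                         # tie-evaluation option (2 recommended)
              control  = rpart.control(cp = -1, maxdepth = 1))  # stumps, as in AdaBoost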

Value

Returns decision tree information (e.g. split criteria, weights of the weak classifiers, training accuracy)

Details

This is the main algorithm of FlexBoost; like other boosting packages, it returns compatible information. Missing values are not allowed in the input data and should be removed beforehand to prevent unexpected errors. The return value is composed of four major parts:

terms : input variable information
trees : decision tree information
alphas : weights of the weak classifiers
acc : training accuracy of each iteration
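
If, as is typical for such objects, these four parts are stored as named list elements (an assumption; this page does not state the exact structure), they could be inspected as follows, continuing the sketch above:

model$terms    # input variable information
model$trees    # fitted decision stumps
model$alphas   # weights of the weak classifiers
model$acc      # training accuracy of each iteration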

Examples

## Not run:
data <- read.csv(url("http://bit.ly/flex_iris"), TRUE)  # load example data (header = TRUE)
flex(data[, 1:2], data[, 6], 10, 0.1, 3, 2)             # 10 rounds, interval 0.1, width 3, type 2
## End(Not run)
