xgb_train is a wrapper for xgboost tree-based models where all of the
model arguments are in the main function.
xgb_train(
x,
y,
max_depth = 6,
nrounds = 15,
eta = 0.3,
colsample_bytree = 1,
min_child_weight = 1,
gamma = 0,
subsample = 1,
validation = 0,
early_stop = NULL,
...
)

Arguments:

x: A data frame or matrix of predictors.

y: A vector (factor or numeric) or matrix (numeric) of outcome data.

max_depth: An integer for the maximum depth of the tree.

nrounds: An integer for the number of boosting iterations.

eta: A numeric value between zero and one to control the learning rate.

colsample_bytree: Subsampling proportion of columns.

min_child_weight: A numeric value for the minimum sum of instance weights needed in a child to continue to split.

gamma: A number for the minimum loss reduction required to make a further partition on a leaf node of the tree.

subsample: Subsampling proportion of rows.

validation: A positive number. If on [0, 1), the value is a random proportion of the data in x and y that is used for performance assessment and potential early stopping. If 1 or greater, it is the number of training set samples used for these purposes.

early_stop: An integer or NULL. If not NULL, it is the number of training iterations without improvement before stopping. If validation is used, performance is based on the validation set; otherwise the training set is used.

...: Other options to pass to xgb.train.

Value:

A fitted xgboost object.
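A minimal usage sketch, assuming the package providing xgb_train (and the xgboost package) is installed and loaded; the mtcars data and the particular tuning values below are illustrative only, not recommendations:

```r
# Predict mpg from the remaining mtcars columns with boosted trees,
# holding out a random 20% of rows for early-stopping assessment.
x <- as.matrix(mtcars[, -1])
y <- mtcars$mpg

fit <- xgb_train(
  x, y,
  max_depth  = 3,    # shallow trees
  nrounds    = 50,   # up to 50 boosting iterations
  eta        = 0.1,  # slower learning rate than the default 0.3
  validation = 0.2,  # 20% random holdout for performance assessment
  early_stop = 5     # stop after 5 iterations without improvement
)

# The return value is a fitted xgboost object, so the usual
# xgboost predict() method applies.
preds <- predict(fit, newdata = x)
```

Because validation is on [0, 1) here, the holdout rows are sampled from x and y; setting validation = 10 instead would hold out exactly 10 training set samples.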