xgboost (version 1.7.5.1)

Extreme Gradient Boosting

Description

Extreme Gradient Boosting, an efficient implementation of the gradient boosting framework from Chen & Guestrin (2016). This package is its R interface. The package includes an efficient linear model solver and tree learning algorithms, and it can automatically run parallel computation on a single machine, which can be more than 10 times faster than existing gradient boosting packages. It supports various objective functions, including regression, classification and ranking. The package is designed to be extensible, so that users can easily define their own objectives.
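
For orientation, here is a minimal sketch of the basic workflow, using the bundled agaricus (mushroom) data sets listed in the function index below; the parameter values are illustrative only, not recommendations.

library(xgboost)

# Bundled mushroom data: $data is a sparse feature matrix, $label is 0/1
data(agaricus.train, package = "xgboost")
data(agaricus.test, package = "xgboost")

# Fit a small binary classifier (illustrative parameters)
bst <- xgboost(data = agaricus.train$data, label = agaricus.train$label,
               max_depth = 2, eta = 1, nthread = 2, nrounds = 2,
               objective = "binary:logistic")

# With objective = "binary:logistic", predictions are probabilities
pred <- predict(bst, agaricus.test$data)
err <- mean(as.numeric(pred > 0.5) != agaricus.test$label)
print(paste("test error:", err))

For finer control (watchlists, early stopping, cross-validation, custom objectives), see xgb.train and xgb.cv in the function index below.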

Install

install.packages('xgboost')

Monthly Downloads

51,362

Version

1.7.5.1

License

Apache License (== 2.0) | file LICENSE

Last Published

March 30th, 2023

Functions in xgboost (1.7.5.1)

cb.reset.parameters

Callback closure for resetting the booster's parameters at each iteration.
agaricus.train

Training part from Mushroom Data Set
dimnames.xgb.DMatrix

Handling of column names of xgb.DMatrix
cb.early.stop

Callback closure to activate early stopping.
cb.cv.predict

Callback closure for returning cross-validation based predictions.
callbacks

Callback closures for booster training.
normalize

Scale feature values to have mean 0 and standard deviation 1
agaricus.test

Test part from Mushroom Data Set
print.xgb.DMatrix

Print xgb.DMatrix
getinfo

Get information of an xgb.DMatrix object
xgb.config

Accessors for model parameters as JSON string.
xgb.create.features

Create new features from a previously learned model
setinfo

Set information of an xgb.DMatrix object
predict.xgb.Booster

Predict method for eXtreme Gradient Boosting model
print.xgb.Booster

Print xgb.Booster
prepare.ggplot.shap.data

Combine and melt feature values and SHAP contributions for sample observations.
slice

Get a new DMatrix containing the specified rows of the original xgb.DMatrix object
print.xgb.cv.synchronous

Print xgb.cv result
xgb.load

Load xgboost model from binary file
xgb.ggplot.deepness

Plot model tree depth
xgb.Booster.complete

Restore missing parts of an incomplete xgb.Booster object.
xgb.DMatrix

Construct xgb.DMatrix object
cb.save.model

Callback closure for saving a model file.
xgb.DMatrix.save

Save xgb.DMatrix object to binary file
xgb.load.raw

Load serialised xgboost model from R's raw vector
xgb.ggplot.shap.summary

SHAP contribution dependency summary plot
xgb.ggplot.importance

Plot feature importance as a bar graph
xgb.train

eXtreme Gradient Boosting Training
xgb.plot.tree

Plot a boosted tree model
xgb.attr

Accessors for serializable attributes of a model.
xgb.model.dt.tree

Parse a boosted tree model text dump
xgb.parameters<-

Accessors for model parameters.
xgb.unserialize

Load the instance back from xgb.serialize
xgb.gblinear.history

Extract gblinear coefficients history.
dim.xgb.DMatrix

Dimensions of xgb.DMatrix
xgb.save

Save xgboost model to binary file
xgb.save.raw

Save xgboost model to R's raw vector; the model can be loaded back from the raw vector with xgb.load.raw (see the persistence sketch after this list)
xgb.importance

Importance of features in a model.
xgb.cv

Cross Validation
xgb.dump

Dump an xgboost model in text format.
xgb.serialize

Serialize the booster instance into R's raw vector. This differs from xgb.save.raw, which saves only the model and not its parameters. The serialization format is not stable across xgboost versions.
xgb.shap.data

Prepare data for SHAP plots. To be used in xgb.plot.shap, xgb.plot.shap.summary, etc. Internal utility function.
xgb.plot.multi.trees

Project all trees on one tree and plot it
xgb.plot.shap

SHAP contribution dependency plots
xgb.set.config, xgb.get.config

Set and get global configuration
xgboost-deprecated

Deprecation notices.
a-compatibility-note-for-saveRDS-save

Do not use saveRDS or save for long-term archival of models. Instead, use xgb.save or xgb.save.raw.
cb.evaluation.log

Callback closure for logging the evaluation history
cb.gblinear.history

Callback closure for collecting the model coefficients history of a gblinear booster during its training.
cb.print.evaluation

Callback closure for printing the result of evaluation
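
As a companion to the persistence entries above (xgb.save, xgb.save.raw, xgb.serialize and the saveRDS compatibility note), here is a hedged sketch of the persistence options. It assumes a trained xgb.Booster named bst, such as the one from the quick-start sketch near the top of this page; the file name is arbitrary.

library(xgboost)

# File-based persistence; per the compatibility note, prefer this
# (or xgb.save.raw) over saveRDS/save for long-term archival
xgb.save(bst, "xgboost.model")
bst_from_file <- xgb.load("xgboost.model")

# Raw-vector form of the model only (no training parameters);
# xgb.load.raw() loads the model back from the raw vector
model_raw <- xgb.save.raw(bst)

# Full in-memory serialization, including parameters; this format is
# not stable across xgboost versions, so use it only for short-lived
# state within the same xgboost version
state_raw <- xgb.serialize(bst)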