Learn R Programming

Laurae (version 0.0.0.9001)

LauraeML_lgbreg: Laurae's Machine Learning (LightGBM regression helper function)

Description

This function is a demonstration function for using LightGBM regression in LauraeML with premade folds (it takes a folds argument). It has alpha, lambda, and lambda_bias as tunable hyperparameters. It also supports feature selection, and performs full logging (every part of the source is commented), writing to an external file in order to track the hyperparameters and feature count.

Usage

LauraeML_lgbreg(x, y, mobile, parallelized, maximize, logging, data, label,
  folds)

Arguments

x
Type: vector (numeric). The hyperparameters to use (alpha, lambda, and lambda_bias).
y
Type: vector (numeric). The features to use, in binary format (0 = do not use the feature, 1 = use it).
mobile
Type: environment. The environment passed from LauraeML.
parallelized
Type: parallel socket cluster (makeCluster or similar). The parallelized parameter passed from LauraeML (whether or not to parallelize training across folds).
maximize
Type: boolean. The maximize parameter passed from LauraeML (whether or not to maximize the metric).
logging
Type: character. The logging parameter passed from LauraeML (the path of the file in which to store the log).
data
Type: data.table (mandatory). The data features. Comes from LauraeML.
label
Type: vector (numeric). The labels. Comes from LauraeML.
folds
Type: list of numeric vectors. The folds, as a list. Comes from LauraeML.

Value

The score of the cross-validated LightGBM regression model, for the provided hyperparameters and features to use.

Examples

## Not run: ------------------------------------
# # To add
## ---------------------------------------------
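No runnable example is shipped with this help page yet. The sketch below is a hypothetical direct call, shown only to illustrate how the arguments fit together: it assumes the Laurae package is installed, that `train` is a data.table of numeric features with a matching numeric `target` label vector, and that the three values in `x` map to alpha, lambda, and lambda_bias; all argument values are illustrative, not recommended settings.

```r
## Not run: ------------------------------------
# library(Laurae)
# library(data.table)
#
# # Hypothetical inputs: 'train' is a data.table of numeric features,
# # 'target' a numeric label vector of the same length.
# folds <- list(1:100, 101:200, 201:300)  # premade folds, as a list
#
# score <- LauraeML_lgbreg(x = c(0.5, 0.5, 0.5),      # alpha, lambda, lambda_bias
#                          y = rep(1, ncol(train)),   # 1 = use every feature
#                          mobile = new.env(),        # environment, as LauraeML passes
#                          parallelized = FALSE,      # no socket cluster here
#                          maximize = FALSE,          # minimize the metric
#                          logging = "LauraeML_log.txt",
#                          data = train,
#                          label = target,
#                          folds = folds)
## ---------------------------------------------
```

In normal use this helper is not called directly: LauraeML supplies `x`, `y`, `mobile`, and the remaining arguments itself during hyperparameter optimization.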
