loss_LKL_xgb: Laurae's Kullback-Leibler Error (xgboost function)
Description
This function computes the gradient and hessian of Laurae's Kullback-Leibler Error loss per value, for use as xgboost's obj
(objective) function, given preds
and dtrain
.
Usage
loss_LKL_xgb(preds, dtrain)
Value
A list containing the gradient and the hessian of Laurae's Kullback-Leibler Error per value.
Details
This loss function is strictly positive, i.e. defined on \((0, +Inf)\)
. It penalizes lower predictions more heavily, and as such is a good fit for problems where underpredicting must be penalized during fine tuning. Compared to Laurae's Poisson loss function, Laurae's Kullback-Leibler loss yields much larger loss values. Negative and zero values are clamped to 1e-15
. This loss function is experimental.
Loss Formula: \((y_true - y_pred) * log(y_true / y_pred)\)
Gradient Formula: \(-((y_true - y_pred)/y_pred + log(y_true) - log(y_pred))\)
Hessian Formula: \(((y_true - y_pred)/y_pred + 2)/y_pred\)
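The formulas above can be sketched in Python/NumPy in the shape of an xgboost custom objective (gradient and hessian per value). This is an illustrative translation, not the original R implementation: the names loss_lkl_obj, labels, and eps are assumptions, and the original function takes (preds, dtrain) and reads the labels from dtrain.

```python
import numpy as np

def loss_lkl_obj(preds, labels, eps=1e-15):
    """Per-value gradient and hessian of the loss
    (y_true - y_pred) * log(y_true / y_pred).

    Illustrative sketch; the original R function is
    loss_LKL_xgb(preds, dtrain).
    """
    # Clamp negative and zero values to 1e-15, as the documentation states.
    y_pred = np.maximum(preds, eps)
    y_true = np.maximum(labels, eps)
    # Gradient: -((y_true - y_pred)/y_pred + log(y_true) - log(y_pred))
    grad = -((y_true - y_pred) / y_pred + np.log(y_true) - np.log(y_pred))
    # Hessian: ((y_true - y_pred)/y_pred + 2)/y_pred
    hess = ((y_true - y_pred) / y_pred + 2.0) / y_pred
    return grad, hess
```

A quick sanity check of the closed forms: at y_true = 2, y_pred = 1 the gradient is -(1 + log 2) and the hessian is (1 + 2)/1 = 3, which a finite-difference check of the loss confirms.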