
Laurae (version 0.0.0.9001)

loss_LKL_math: Laurae's Kullback-Leibler Error (math function)

Description

This function computes Laurae's Kullback-Leibler Error loss per value, given x, y (predictions, labels).

Usage

loss_LKL_math(x, y)

Arguments

x
The predictions.
y
The labels.

Value

Laurae's Kullback-Leibler Error per value.
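
As a quick illustration of the interface, a minimal call could look like the following (this sketch assumes the Laurae package is installed and attached, and that x and y are numeric vectors of equal length; the values shown are made up):

library(Laurae)

preds  <- c(25, 30, 45)   # predictions (x)
labels <- rep(30, 3)      # labels (y)
loss_LKL_math(x = preds, y = labels)  # one loss value per (prediction, label) pair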

Details

This loss function is strictly positive whenever the prediction differs from the label, so its values lie in \((0, +\infty)\). It penalizes predictions below the label more heavily than predictions above it, which makes it a good fit for problems where underprediction requires fine tuning. Compared to Laurae's Poisson loss function, Laurae's Kullback-Leibler loss yields much larger loss values. This loss function is experimental.

Loss Formula: \((y_{true} - y_{pred}) \log(y_{true} / y_{pred})\)

Gradient Formula: \(-((y_{true} - y_{pred})/y_{pred} + \log(y_{true}) - \log(y_{pred}))\)

Hessian Formula: \(((y_{true} - y_{pred})/y_{pred} + 2)/y_{pred}\)
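
For reference, a minimal base-R sketch of the per-value loss, gradient, and Hessian implied by the formulas above could look as follows (an illustration of the math only, not necessarily the package's exact implementation; here x stands for y_pred and y for y_true):

# Per-value loss: (y_true - y_pred) * log(y_true / y_pred)
lkl_loss <- function(x, y) {
  (y - x) * log(y / x)
}

# Gradient of the loss with respect to the prediction x
lkl_grad <- function(x, y) {
  -((y - x) / x + log(y) - log(x))
}

# Hessian of the loss with respect to the prediction x
lkl_hess <- function(x, y) {
  ((y - x) / x + 2) / x
}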

Examples

## Not run: ------------------------------------
SymbolicLoss(fc = loss_LKL_math, fc_ref = loss_MSE_math, xmin = 1, xmax = 100, y = rep(30, 21))
SymbolicLoss(fc = loss_LKL_math, fc_ref = loss_Poisson_math, xmin = 1, xmax = 100, y = rep(30, 21))
## End(Not run) --------------------------------
