step_relu()
creates a specification of a recipe step that will add the
rectified linear or softplus transformations of a variable to the data set.
step_relu(
recipe,
...,
role = "predictor",
trained = FALSE,
shift = 0,
reverse = FALSE,
smooth = FALSE,
prefix = "right_relu_",
columns = NULL,
skip = FALSE,
id = rand_id("relu")
)
An updated version of recipe
with the new step added to the
sequence of any existing operations.
A recipe object. The step will be added to the sequence of operations for this recipe.
One or more selector functions to choose variables
for this step. See selections()
for more details.
For model terms created by this step, what analysis role should they be assigned? By default, the new columns created by this step from the original variables will be used as predictors in a model.
A logical to indicate if the quantities for preprocessing have been estimated.
A numeric value dictating a translation to apply to the data; this is the hinge point c in the equations below.
A logical to indicate if the left hinge should be used as opposed to the right hinge.
A logical indicating if the softplus function, a smooth approximation to the rectified linear transformation, should be used.
A prefix for generated column names, defaults to "right_relu_" for right hinge transformation and "left_relu_" for reversed/left hinge transformations.
A character string of the selected variable names. This field
is a placeholder and will be populated once prep()
is used.
A logical. Should the step be skipped when the recipe is baked by bake()? While all operations are baked when prep() is run, some operations may not be able to be conducted on new data (e.g. processing the outcome variable(s)). Care should be taken when using skip = TRUE as it may affect the computations for subsequent operations.
A character string that is unique to this step to identify it.
The rectified linear transformation is used in Multivariate Adaptive Regression Splines (MARS) as a basis function to fit piecewise linear functions to data, in a strategy similar to that employed in tree-based models. The transformation is also a popular choice as an activation function in many neural networks, which in that sense can be seen as stacked generalizations of MARS when ReLU activations are used. The hinge function also appears in the loss function of Support Vector Machines, where it penalizes residuals only when they fall within a certain margin of the decision boundary.
When you tidy() this step, a tibble is returned with columns terms, shift, reverse, and id:
character, the selectors or variables selected
numeric, location of hinge
logical, whether left hinge is used
character, id of this step
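For instance, the following minimal sketch (the data and variables here are illustrative, not from this page) shows how tidy() exposes these columns for a step_relu() specification:

library(recipes)

# Illustrative recipe: add a right hinge on disp at 200
rec_tidy <- recipe(mpg ~ disp + hp, data = mtcars) %>%
  step_relu(disp, shift = 200)

# Before prep(), terms shows the selector; shift and reverse echo
# the arguments given above
tidy(rec_tidy, number = 1)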
The underlying operation does not allow for case weights.
The rectified linear transformation is calculated as $$\max(0, x - c)$$ and is also known as the ReLU or right hinge function. If reverse is true, the mirror image of the hinge (the left hinge) is used instead: $$\max(0, c - x)$$ Setting the smooth option to true will instead calculate a smooth approximation to ReLU, the softplus function: $$\ln(1 + e^{x - c})$$ The reverse argument may also be applied to this transformation.
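As a minimal sketch (not part of the package API), the three variants can be computed directly in base R; here c_shift plays the role of the step's shift argument:

x <- c(-2, -1, 0, 1, 2)
c_shift <- 0.5

pmax(0, x - c_shift)      # right hinge (ReLU), the default
pmax(0, c_shift - x)      # left hinge, reverse = TRUE
log1p(exp(x - c_shift))   # softplus, smooth = TRUE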
Other individual transformation steps: step_BoxCox(), step_YeoJohnson(), step_bs(), step_harmonic(), step_hyperbolic(), step_inverse(), step_invlogit(), step_log(), step_logit(), step_mutate(), step_ns(), step_percentile(), step_poly(), step_sqrt()
library(recipes)

data(biomass, package = "modeldata")

biomass_tr <- biomass[biomass$dataset == "Training", ]
biomass_te <- biomass[biomass$dataset == "Testing", ]

rec <- recipe(
  HHV ~ carbon + hydrogen + oxygen + nitrogen + sulfur,
  data = biomass_tr
)

# Estimate the step on the training set, then add a right hinge at
# carbon = 40 to the baked test set
transformed_te <- rec %>%
  step_relu(carbon, shift = 40) %>%
  prep(biomass_tr) %>%
  bake(biomass_te)

transformed_te
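Building on the example above, here is a sketch of the reverse and smooth variants, reusing the same illustrative shift of 40. Each step appends a column with its own prefix, so both can coexist in one recipe:

rec %>%
  step_relu(carbon, shift = 40, reverse = TRUE) %>%
  step_relu(carbon, shift = 40, smooth = TRUE, prefix = "softplus_") %>%
  prep(biomass_tr) %>%
  bake(biomass_te)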