Randomized leaky rectified linear unit (rrelu) activation function.
activation_rrelu(
x,
lower = 0.125,
upper = 0.333333333333333,
training = NULL,
seed = NULL
)
x: A `Tensor`. Must be one of the following types: `float16`, `float32`, `float64`.
lower: `float`, lower bound for random alpha.
upper: `float`, upper bound for random alpha.
training: `bool`, indicating whether the `call` is meant for training or inference.
seed: `int`, this sets the operation-level seed.
Returns:
A `Tensor` with the same type as `x`, computed as `x if x > 0 else random(lower, upper) * x` during training, or `x if x > 0 else x * (lower + upper) / 2` during inference.
Computes rrelu function: `x if x > 0 else random(lower, upper) * x` or `x if x > 0 else x * (lower + upper) / 2` depending on whether training is enabled. See [Empirical Evaluation of Rectified Activations in Convolutional Network](https://arxiv.org/abs/1505.00853).
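A minimal usage sketch is shown below. It assumes the `tensorflow` R package is available to build the input tensor and that `activation_rrelu()` has been loaded from the package documented here; the chosen values are for illustration only.

library(tensorflow)

x <- tf$constant(c(-2, -1, 0, 1, 2), dtype = tf$float32)

# Training mode: each negative element is scaled by a random alpha
# drawn from [lower, upper]; `seed` makes the draw reproducible.
y_train <- activation_rrelu(x, lower = 0.125, upper = 1/3, training = TRUE, seed = 42L)

# Inference mode: negative elements are scaled by the fixed mean (lower + upper) / 2.
y_eval <- activation_rrelu(x, lower = 0.125, upper = 1/3, training = FALSE)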