Activation functions can either be used through layer_activation(), or through the activation argument supported by all forward layers.
activation_relu(x, alpha = 0, max_value = NULL)
activation_elu(x, alpha = 1)
activation_hard_sigmoid(x)
activation_linear(x)
activation_sigmoid(x)
activation_softmax(x)
activation_softplus(x)
activation_softsign(x)
activation_tanh(x)
x: Tensor
alpha: Alpha value
max_value: Max value
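
A minimal sketch of the two usage styles described above, assuming the keras R package is attached; the layer sizes and input shape are arbitrary placeholder values:

library(keras)

model <- keras_model_sequential() %>%
  # Style 1: pass the activation by name via the `activation` argument
  # of a forward layer.
  layer_dense(units = 32, input_shape = c(784), activation = "relu") %>%
  # Style 2: add the activation as its own layer with layer_activation().
  layer_dense(units = 10) %>%
  layer_activation(activation = "softmax")

The activation functions listed above (e.g. activation_relu) can also be passed directly as the activation argument in place of a character name.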