keras (version 2.2.4)

activation_relu: Activation functions

Description

Activation functions can either be used through layer_activation(), or through the activation argument supported by all forward layers.
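A minimal sketch of both styles (the layer sizes and input shape are illustrative):

library(keras)

model <- keras_model_sequential() %>%
  # activation supplied via the activation argument of a forward layer
  layer_dense(units = 64, activation = "relu", input_shape = c(100)) %>%
  # activation applied through a separate layer_activation()
  layer_dense(units = 64) %>%
  layer_activation("relu") %>%
  layer_dense(units = 10, activation = "softmax")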

Usage

activation_relu(x, alpha = 0, max_value = NULL, threshold = 0)

activation_elu(x, alpha = 1)

activation_selu(x)

activation_hard_sigmoid(x)

activation_linear(x)

activation_sigmoid(x)

activation_softmax(x, axis = -1)

activation_softplus(x)

activation_softsign(x)

activation_tanh(x)

activation_exponential(x)

Arguments

x

Tensor

alpha

Alpha value (for activation_relu(), the slope of the negative section; for activation_elu(), the scale for the negative part)

max_value

Maximum output value (the activation saturates at this value)

threshold

Threshold value for thresholded activation.

axis

Integer, axis along which the softmax normalization is applied
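To illustrate how alpha, max_value and threshold shape the output, an activation can be applied directly to a backend tensor. This is a sketch only; the exact printed representation depends on the backend in use.

library(keras)

x <- k_constant(c(-2, -1, 0, 1, 2))

# leaky, clipped relu: slope 0.1 below the threshold, saturating at 1.5
activation_relu(x, alpha = 0.1, max_value = 1.5)
# expected values: -0.2 -0.1 0.0 1.0 1.5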

Value

Tensor with the same shape and dtype as x.

Details

  • activation_selu() is to be used together with the initialization "lecun_normal".

  • activation_selu() is to be used together with the dropout variant "AlphaDropout" (see the sketch below).
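A minimal sketch combining the two recommendations above (layer sizes, dropout rate, and input shape are illustrative):

library(keras)

model <- keras_model_sequential() %>%
  layer_dense(units = 64, activation = "selu",
              kernel_initializer = "lecun_normal",
              input_shape = c(100)) %>%
  # AlphaDropout preserves the self-normalizing property of selu
  layer_alpha_dropout(rate = 0.1) %>%
  layer_dense(units = 10, activation = "softmax")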

References