keras (version 0.3.5)

activation_relu: Activation functions

Description

Activation functions can be used either through layer_activation(), or through the activation argument supported by all forward layers.
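A minimal sketch of the two styles described above (assuming the keras R package and a backend are installed; the layer sizes and input shape are illustrative):

```r
library(keras)

model <- keras_model_sequential() %>%
  # Style 1: supply the activation through the layer's `activation` argument
  layer_dense(units = 32, input_shape = c(784), activation = "relu") %>%
  layer_dense(units = 10) %>%
  # Style 2: apply the activation as a standalone layer via layer_activation()
  layer_activation("softmax")
```

Both styles produce the same computation; the standalone layer is useful when you want the pre-activation output available as a separate layer in the graph.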

Usage

activation_relu(x, alpha = 0, max_value = NULL)

activation_elu(x, alpha = 1)

activation_hard_sigmoid(x)

activation_linear(x)

activation_sigmoid(x)

activation_softmax(x)

activation_softplus(x)

activation_softsign(x)

activation_tanh(x)

Arguments

x

Tensor or variable to compute the activation for.

alpha

For activation_relu(), the slope of the negative section (default 0, i.e. a standard ReLU). For activation_elu(), the scale for the negative section (default 1).

max_value

For activation_relu(), the saturation threshold: outputs are capped at this value. NULL (the default) means no upper limit.
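The effect of alpha and max_value on activation_relu() can be illustrated with a plain-R reimplementation of the same formula (a sketch of the semantics, not the keras implementation, which runs on backend tensors):

```r
# ReLU semantics: y = x for x > 0, alpha * x otherwise,
# then capped at max_value when one is given.
relu <- function(x, alpha = 0, max_value = NULL) {
  y <- ifelse(x > 0, x, alpha * x)
  if (!is.null(max_value)) {
    y <- pmin(y, max_value)
  }
  y
}

relu(c(-2, -1, 0, 1, 5))                  # standard ReLU: 0 0 0 1 5
relu(c(-2, -1, 0, 1, 5), alpha = 0.1)     # leaky variant: -0.2 -0.1 0 1 5
relu(c(-2, -1, 0, 1, 5), max_value = 3)   # capped: 0 0 0 1 3
```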